CNum 0.2.1
CPU-optimized ML library for C++
CNum::Deploy::InferenceAPI< ModelType, Storage > Class Template Reference

A REST API for making predictions with CNum models.

#include <Deploy.h>

Public Member Functions

 InferenceAPI (::std::string path, PreprocessFunction preprocess, PostprocessFunction postprocess, ::std::string allowed_origins="*", size_t n_models=20, unsigned short port=18080)
 Constructor.
template<PathString Path>
constexpr void add_inference_route (crow::HTTPMethod method, InferenceRouteFunction route)
 Add a route to the API that uses a model.
template<PathString Path>
constexpr void add_regular_route (crow::HTTPMethod method, RegularRouteFunction route)
 Add a route that doesn't use a model.
void start ()
 Start the backend.

Detailed Description

template<typename ModelType, typename Storage>
class CNum::Deploy::InferenceAPI< ModelType, Storage >

A REST API for making predictions with CNum models.

Template Parameters
ModelType: The type of CNum model used for the model pool
Storage: A struct holding the data produced during preprocessing that is also needed during postprocessing

Constructor & Destructor Documentation

◆ InferenceAPI()

template<typename ModelType, typename Storage>
CNum::Deploy::InferenceAPI< ModelType, Storage >::InferenceAPI ( ::std::string path,
PreprocessFunction preprocess,
PostprocessFunction postprocess,
::std::string allowed_origins = "*",
size_t n_models = 20,
unsigned short port = 18080 )

Constructor.

Parameters
path: The path to the trained CNum model file (".cmod")
preprocess: The function used to preprocess the data received by the '/predict' route
postprocess: The function used to process the model's predictions into the response returned to the client
allowed_origins: The allowed origins for the API's CORS header
n_models: The number of model instances in the ModelPool
port: The port on which the API listens
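A minimal usage sketch. The model type, storage struct, and the exact signatures of the preprocess/postprocess callables are illustrative assumptions; the real `PreprocessFunction` and `PostprocessFunction` types are defined in Deploy.h:

```cpp
#include <Deploy.h>

// Hypothetical storage struct carrying state from preprocessing
// into postprocessing (the Storage template parameter).
struct MyStorage {
    size_t n_rows;  // e.g. number of input rows parsed from the request
};

int main() {
    // CNum::SomeModel, my_preprocess, and my_postprocess are placeholders;
    // substitute your trained model type and your own callables.
    CNum::Deploy::InferenceAPI<CNum::SomeModel, MyStorage> api(
        "model.cmod",          // path to the trained CNum model
        my_preprocess,         // request data -> model input (+ fills MyStorage)
        my_postprocess,        // model predictions (+ MyStorage) -> response
        "https://example.com", // CORS allowed origins (default "*")
        8,                     // model pool size (default 20)
        18080);                // port (default 18080)
    api.start();
}
```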

Member Function Documentation

◆ add_inference_route()

template<typename ModelType, typename Storage>
template<PathString Path>
void CNum::Deploy::InferenceAPI< ModelType, Storage >::add_inference_route ( crow::HTTPMethod method,
InferenceRouteFunction route )
constexpr

Add a route to the API that uses a model.

Template Parameters
Path: The path for the new route
Parameters
method: The HTTP method of the route (see the Crow C++ documentation)
route: The handler function invoked for this route
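A hedged sketch of registering an extra inference route on an existing `api` object. The string-literal syntax for the `PathString` template argument and the shape of `InferenceRouteFunction` are assumptions; check Deploy.h for the actual definitions:

```cpp
// Path is a compile-time string template parameter (PathString);
// the handler (my_batch_handler, hypothetical) receives access to a
// model instance drawn from the internal ModelPool.
api.add_inference_route<"/predict_batch">(
    crow::HTTPMethod::POST,
    my_batch_handler);
```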

◆ add_regular_route()

template<typename ModelType, typename Storage>
template<PathString Path>
void CNum::Deploy::InferenceAPI< ModelType, Storage >::add_regular_route ( crow::HTTPMethod method,
RegularRouteFunction route )
constexpr

Add a route that doesn't use a model.

Template Parameters
Path: The path for the new route
Parameters
method: The HTTP method of the route (see the Crow C++ documentation)
route: The handler function invoked for this route

◆ start()

template<typename ModelType, typename Storage>
void CNum::Deploy::InferenceAPI< ModelType, Storage >::start ( )

Start the backend.
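Since `start()` runs the underlying Crow server, it is typically the last call made after all routes are registered. A sketch, where `health_check` is a hypothetical `RegularRouteFunction`-compatible handler:

```cpp
// Routes that need no model go through add_regular_route.
api.add_regular_route<"/health">(
    crow::HTTPMethod::GET,
    health_check);  // hypothetical handler returning 200 OK

api.start();  // serves requests on the configured port (18080 by default)
```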


The documentation for this class was generated from the following file: Deploy.h