Deploy pretrained model

post

Takes in a TFLite file and builds the model and SDK. Updates are streamed over the websocket API (or can be retrieved through the /stdout endpoint). Use getProfileTfliteJobResult to get the results when the job is completed.

Authorizations

API key, passed in the x-api-key header.
Path parameters
projectId · integer · Required

Project ID

Query parameters
impulseId · integer · Optional

Impulse ID. If this is unset then the default impulse is used.

Body
modelFileBase64 · string · Required

A base64 encoded pretrained model

modelFileType · string · enum · Required

Possible values:
deploymentType · string · Required

The name of the built target. You can find this by listing all deployment targets through listDeploymentTargetsForProject (GET /v1/api/{projectId}/deployment/targets) and checking the format type.

engine · string · enum · Optional

Possible values:
representativeFeaturesBase64stringOptional

A base64-encoded .npy file containing features from your validation set, used to quantize your model (optional for onnx and saved_model).

deployModelType · string · enum · Optional

Possible values:
useConverter · string · enum · Optional

Optional, use a specific converter (only for ONNX models).

Possible values:
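To produce the representativeFeaturesBase64 body field, you can serialize your validation features to .npy bytes in memory and base64-encode them. A minimal sketch, assuming your features fit in a NumPy array (the function name and the float32 cast are illustrative assumptions, not part of the API):

```python
import base64
import io

import numpy as np


def encode_representative_features(features: np.ndarray) -> str:
    """Serialize a feature array to .npy bytes and base64-encode the result
    for use as the representativeFeaturesBase64 body field."""
    buf = io.BytesIO()
    # float32 is an assumption; match the dtype your model expects
    np.save(buf, features.astype(np.float32))
    return base64.b64encode(buf.getvalue()).decode("ascii")
```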
Responses
200
OK
application/json
Response · all of
POST /v1/api/{projectId}/jobs/deploy-pretrained-model HTTP/1.1
Host: studio.edgeimpulse.com
x-api-key: YOUR_API_KEY
Content-Type: application/json
Accept: */*
Content-Length: 323

{
  "modelFileBase64": "text",
  "modelFileType": "tflite",
  "deploymentType": "text",
  "engine": "tflite",
  "modelInfo": {
    "input": {
      "inputType": "time-series",
      "frequencyHz": 1,
      "windowLengthMs": 1
    },
    "model": {
      "modelType": "classification",
      "labels": [
        "text"
      ]
    }
  },
  "representativeFeaturesBase64": "text",
  "deployModelType": "int8",
  "useConverter": "onnx-tf"
}
200

OK

{
  "success": true,
  "error": "text",
  "id": 12873488112
}
