As you develop with Edge Impulse, you may come across specific files, for example when creating custom blocks. The following documentation provides the structure of, and examples for, these files.

The ids.json file is passed to custom AI labeling blocks. It lists the IDs of the data samples to operate on, as integers. The specification is shown below, followed by an example.
{
"ids": [ <id>, <id>, ..., <id> ]
}
{
"ids": [ 1440653288, 1440653283, ..., 1440653252 ]
}
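For illustration, here is a minimal sketch of how an AI labeling block written in Python might read this file (the --ids-file argument name is an assumption for this sketch, not part of the specification above):

import argparse
import json

# Hypothetical argument carrying the path to ids.json
parser = argparse.ArgumentParser(description='Custom AI labeling block')
parser.add_argument('--ids-file', type=str, required=True)
args, _ = parser.parse_known_args()

with open(args.ids_file, 'r') as f:
    ids = json.load(f)['ids']

# ids is a list of integers, e.g. [1440653288, 1440653283, ...]
for sample_id in ids:
    print('Labeling sample', sample_id)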
The train_input.json file is passed to custom learning blocks as the value for the --input-file argument. It contains configuration information for the model training options that you may want to use within your training script, if applicable. The specification is shown below, followed by an example.
type CreateKerasTrainModelOptions = {
classes: string[],
mode: 'classification' | 'regression' | 'object-detection' | 'visual-anomaly' | 'anomaly-gmm';
printHWInfo: boolean | undefined,
inputShapeString: string,
yType: 'npy' | 'structured';
trainTestSplit: number,
stratifiedTrainTest: boolean,
onlineDspConfig: OnlineDspConfig | undefined;
convertInt8: boolean,
profileInt8: boolean,
skipEmbeddingsAndMemory: boolean,
objectDetectionLastLayer: 'mobilenet-ssd' | 'fomo' | 'yolov2-akida' | 'yolov5' | 'yolov5v5-drpai' |
'yolox' | 'yolov7' | 'tao-retinanet' | 'tao-ssd' | 'tao-yolov3' | 'tao-yolov4' | undefined;
objectDetectionAugmentation: boolean | undefined,
objectDetectionBatchSize: number | undefined,
syntiantTarget?: boolean,
maxTrainingTimeSeconds: number,
remainingGpuComputeTimeSeconds: number,
isEnterpriseProject: boolean,
flattenDataset: boolean,
akidaModel: boolean,
akidaEdgeModel: boolean,
skipMemoryProfiling: boolean,
tensorboardLogging: boolean,
customValidationSplit: boolean,
validationMetadataKey?: string,
customVariantsToProfile: CustomVariantInferenceJobModelVariant[] | undefined;
};
{
"classes": [
"idle",
"snake",
"updown",
"wave"
],
"mode": "classification",
"printHWInfo": false,
"inputShapeString": "(39,)",
"yType": "npy",
"trainTestSplit": 0.2,
"stratifiedTrainTest": false,
"convertInt8": true,
"profileInt8": true,
"skipEmbeddingsAndMemory": false,
"objectDetectionAugmentation": false,
"syntiantTarget": false,
"maxTrainingTimeSeconds": 604800,
"remainingGpuComputeTimeSeconds": null,
"isEnterpriseProject": true,
"flattenDataset": false,
"akidaModel": false,
"akidaEdgeModel": false,
"skipMemoryProfiling": false,
"tensorboardLogging": false,
"customValidationSplit": false
}
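As a sketch of how a training script might consume this file (the --input-file argument comes from the text above; the option handling is illustrative):

import argparse
import json

parser = argparse.ArgumentParser(description='Custom learning block')
parser.add_argument('--input-file', type=str, required=True)
args, _ = parser.parse_known_args()

with open(args.input_file, 'r') as f:
    options = json.load(f)

# A few of the options defined in the specification above
classes = options['classes']               # e.g. ['idle', 'snake', 'updown', 'wave']
mode = options['mode']                     # e.g. 'classification'
input_shape = options['inputShapeString']  # e.g. '(39,)'
quantize = options['convertInt8']          # whether to also produce an int8 model

print(f'Training a {mode} model with {len(classes)} classes, input shape {input_shape}')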
The deployment-metadata.json file is passed to custom deployment blocks. It provides details about the impulse being deployed. The specification is shown below, followed by an example.
interface DeploymentMetadataV1 {
version: 1;
// Global deployment counter
deployCounter: number;
// The output classes (for classification)
classes: string[];
// The number of samples to be taken per inference (e.g. 100Hz data, 3 axes, 2 seconds => 200)
samplesPerInference: number;
// Number of axes (e.g. 100Hz data, 3 axes, 2 seconds => 3)
axesCount: number;
// Frequency of the data
frequency: number;
// TFLite models (already converted and quantized)
tfliteModels: {
// Information about the model type, e.g. quantization parameters
details: KerasModelIODetails;
// Name of the input tensor
inputTensor: string | undefined;
// Name of the output tensor
outputTensor: string | undefined;
// Path of the model on disk
modelPath: string;
// Path of the model on disk (ONNX), not always available
onnxModelPath: string | undefined;
// Path of a secondary/auxiliary model on disk (ONNX), not always available
onnxAuxModelPath: string | undefined;
// Path to .prototxt (in case of YOLOX), not always available
prototxtPath: string | undefined;
// Path to .fbz (BrainChip Akida model file), not always available
akidaModelPath: string | undefined;
// Path to .fbz (BrainChip Akida model prepared for Edge Learning), not always available
akidaEdgeLearningModelPath: string | undefined;
// Calculated arena size when running TFLite in interpreter mode
arenaSize: number;
// Number of values to be passed into the model
inputFrameSize: number;
}[];
// Project information
project: {
// Project name
name: string;
// Project ID
id: number;
// Project owner (user or organization name)
owner: string;
// API key, only set for deploy blocks with privileged flag and development keys set
apiKey: string | undefined;
// Studio host
studioHost: string;
};
// Impulse information
impulse: DeploymentMetadataImpulse;
// Sensor guess based on the input
sensor: 'camera' | 'microphone' | 'accelerometer' | 'positional' | 'environmental' | 'fusion' | undefined;
// Folder locations
folders: {
// Input files are here, the input folder contains 'edge-impulse-sdk', 'model-parameters', 'tflite-model'
input: string;
// Write your output file here
output: string;
};
}
/**
* Fields common to all CreateImpulseStateX
*/
interface CreateImpulseStateBase extends CreateImpulseStateMetadata {
id: number;
name: string;
title: string;
type: string;
}
interface CreateImpulseStateDsp extends CreateImpulseStateBase {
type: string | 'custom';
implementationVersion: number;
axes: string[];
customUrl?: string;
input: number;
tunerBaseBlockId?: number;
autotune?: boolean;
organization?: {
id: number;
dspId: number;
};
namedAxes?: CreateImpulseStateDspNamedAxis[];
}
type CreateImpulseStateDspNamedAxis = {
name: string,
description?: string,
required?: boolean,
selectedAxis?: string,
};
type CreateImpulseStateInput = CreateImpulseStateInputTimeSeries |
CreateImpulseStateInputImage |
CreateImpulseStateInputFeatures;
interface CreateImpulseStateInputFeatures extends CreateImpulseStateBase {
type: 'features';
datasetSubset?: {
subsetModulo: number;
subsetSeed: number;
};
}
interface CreateImpulseStateInputImage extends CreateImpulseStateBase {
type: 'image';
imageWidth: number;
imageHeight: number;
resizeMode: 'squash' | 'fit-short' | 'fit-long' | 'crop';
cropAnchor: 'top-left' | 'top-center' | 'top-right' | 'middle-left' | 'middle-center' | 'middle-right' | 'bottom-left' | 'bottom-center' | 'bottom-right';
resizeMethod: 'nearest' | 'lanczos3';
labelingMethod?: 'object_detection' | 'single_label';
datasetSubset?: {
subsetModulo: number;
subsetSeed: number;
};
}
interface CreateImpulseStateInputTimeSeries extends CreateImpulseStateBase {
type: 'time-series';
windowSizeMs: number;
windowIncreaseMs: number;
frequencyHz: number;
classificationWindowIncreaseMs?: number;
padZeros: boolean;
datasetSubset?: {
subsetModulo: number;
subsetSeed: number;
};
}
interface CreateImpulseStateLearning extends CreateImpulseStateBase {
dsp: number[];
type: typeof ALL_CREATE_IMPULSE_STATE_LEARNING_TYPES[number];
}
const ALL_CREATE_IMPULSE_STATE_LEARNING_TYPES = [
'keras',
'keras-transfer-image',
'keras-transfer-kws',
'keras-object-detection',
'keras-regression',
'anomaly',
'keras-akida',
'keras-akida-transfer-image',
'keras-akida-object-detection',
'anomaly-gmm',
'keras-visual-anomaly',
];
/**
* Provides metadata shared between all block types
*/
interface CreateImpulseStateMetadata {
/**
* Metadata for block versioning
*/
// The user-editable description of this block version
description?: string;
// Which part of the system created this version (createImpulse | clone | tuner)
createdBy?: string;
// The date and time this version was created
createdAt?: Date;
// Tuner template block id. This is _always_ -1 if the block is a Tuner block.
// the only place where this is used is in the DB to query for Tuner-managed blocks
// in the block config table.
tunerTemplateId?: number;
// If this is true, this block is also a tuner block.
db?: boolean;
}
interface DeploymentMetadataImpulse {
inputBlocks: CreateImpulseStateInput[];
dspBlocks: (CreateImpulseStateDsp & { metadata: DSPFeatureMetadata | undefined })[];
learnBlocks: CreateImpulseStateLearning[];
}
interface DSPConfig {
options: {[s: string]: string | number | boolean | null;};
performance: { latency: number, ram: number } | undefined;
calculateFeatureImportance: boolean;
// Currently only used by EON tuner to identify blocks with the feature explorer
// skipped.
skipFeatureExplorer?: boolean;
}
interface DSPFeatureMetadata {
created: Date;
dspConfig: DSPConfig;
labels: string[]; // the training labels
featureLabels: string[] | undefined;
featureCount: number;
valuesPerAxis: number;
windowCount: number;
windowSizeMs: number;
windowIncreaseMs: number;
padZeros: boolean;
frequency: number;
outputConfig: DSPFeatureMetadataOutput;
performance: { latency: number, ram: number } | undefined;
fftUsed: number[] | undefined;
includeEmptyLabels: boolean;
inputShape: number[] | undefined;
includedSamplesAreInOrder: boolean;
resamplingAlgorithmVersion: number | undefined;
resizingAlgorithmVersion: number | undefined;
}
type DSPFeatureMetadataOutput = {
type: 'image',
shape: { width: number, height: number, channels: number, frames?: number },
axes?: number
} | {
type: 'spectrogram',
shape: { width: number, height: number },
axes?: number
} | {
type: 'flat',
shape: { width: number },
axes?: number
};
/**
* Information required to process a model's input and output data
*/
interface KerasModelIODetails {
modelType: 'int8' | 'float32' | 'akida' | 'requiresRetrain';
inputs: KerasModelTensorDetails[];
outputs: KerasModelTensorDetails[];
}
/**
* Information necessary to quantize or dequantize the contents of a tensor
*/
type KerasModelTensorDetails = {
dataType: 'float32';
// These are added since TF2.7 - older models don't have them
name?: string;
shape?: number[];
} | {
dataType: 'int8' | 'uint8';
// These are added since TF2.7 - older models don't have them
name?: string;
shape?: number[];
// Scale and zero point are used only for quantized tensors
quantizationScale?: number;
quantizationZeroPoint?: number;
};
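The quantizationScale and quantizationZeroPoint fields are what you need to convert between float and quantized representations. A minimal sketch of the standard affine (de)quantization they imply (illustrative helper functions, not Edge Impulse library code):

import numpy as np

def quantize(values, scale, zero_point):
    # real -> int8: q = round(real / scale) + zero_point
    q = np.round(np.asarray(values, dtype=np.float32) / scale) + zero_point
    return np.clip(q, -128, 127).astype(np.int8)

def dequantize(q, scale, zero_point):
    # int8 -> real: real = scale * (q - zero_point)
    return scale * (q.astype(np.float32) - zero_point)

# Values taken from the input tensor of the example below
scale, zero_point = 0.10049157589673996, -70
q = quantize([0.5, -1.0, 2.0], scale, zero_point)
print(dequantize(q, scale, zero_point))  # approximately the original values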
{
"version": 1,
"samplesPerInference": 125,
"axesCount": 3,
"classes": [
"idle",
"snake",
"updown",
"wave"
],
"deployCounter": 83,
"folders": {
"input": "/home/input",
"output": "/home/output"
},
"frequency": 62.5,
"impulse": {
"inputBlocks": [
{
"id": 2,
"type": "time-series",
"name": "Time series",
"title": "Time series data",
"windowSizeMs": 2000,
"windowIncreaseMs": 240,
"frequencyHz": 62.5,
"padZeros": false,
"primaryVersion": true,
"db": false
}
],
"dspBlocks": [
{
"id": 24,
"type": "spectral-analysis",
"name": "Spectral features",
"axes": [
"accX",
"accY",
"accZ"
],
"title": "Spectral Analysis",
"input": 2,
"primaryVersion": true,
"createdBy": "createImpulse",
"createdAt": "2022-08-07T07:39:37.055Z",
"implementationVersion": 2,
"db": false,
"metadata": {
"created": "2023-08-29T01:32:50.434Z",
"dspConfig": {
"options": {
"scale-axes": 1,
"filter-cutoff": 8,
"filter-order": 6,
"fft-length": 64,
"spectral-peaks-count": 3,
"spectral-peaks-threshold": 0.1,
"spectral-power-edges": "0.1, 0.5, 1.0, 2.0, 5.0",
"do-log": true,
"do-fft-overlap": true,
"wavelet-level": 1,
"extra-low-freq": false,
"input-decimation-ratio": "1",
"filter-type": "low",
"analysis-type": "FFT",
"wavelet": "db4"
},
"performance": {
"latency": 4,
"ram": 2144
},
"calculateFeatureImportance": false
},
"labels": [
"idle",
"snake",
"updown",
"wave"
],
"featureLabels": [
"accX RMS",
"accX Skewness",
"accX Kurtosis",
"accX Spectral Power 0.49 - 1.46 Hz",
"accX Spectral Power 1.46 - 2.44 Hz",
"accX Spectral Power 2.44 - 3.42 Hz",
"accX Spectral Power 3.42 - 4.39 Hz",
"accX Spectral Power 4.39 - 5.37 Hz",
"accX Spectral Power 5.37 - 6.35 Hz",
"accX Spectral Power 6.35 - 7.32 Hz",
"accX Spectral Power 7.32 - 8.3 Hz",
"accY RMS",
"accY Skewness",
"accY Kurtosis",
"accY Spectral Power 0.49 - 1.46 Hz",
"accY Spectral Power 1.46 - 2.44 Hz",
"accY Spectral Power 2.44 - 3.42 Hz",
"accY Spectral Power 3.42 - 4.39 Hz",
"accY Spectral Power 4.39 - 5.37 Hz",
"accY Spectral Power 5.37 - 6.35 Hz",
"accY Spectral Power 6.35 - 7.32 Hz",
"accY Spectral Power 7.32 - 8.3 Hz",
"accZ RMS",
"accZ Skewness",
"accZ Kurtosis",
"accZ Spectral Power 0.49 - 1.46 Hz",
"accZ Spectral Power 1.46 - 2.44 Hz",
"accZ Spectral Power 2.44 - 3.42 Hz",
"accZ Spectral Power 3.42 - 4.39 Hz",
"accZ Spectral Power 4.39 - 5.37 Hz",
"accZ Spectral Power 5.37 - 6.35 Hz",
"accZ Spectral Power 6.35 - 7.32 Hz",
"accZ Spectral Power 7.32 - 8.3 Hz"
],
"valuesPerAxis": 11,
"windowCount": 2554,
"featureCount": 33,
"windowSizeMs": 2000,
"windowIncreaseMs": 240,
"frequency": 62.5,
"padZeros": false,
"outputConfig": {
"type": "flat",
"shape": {
"width": 33
}
},
"performance": {
"latency": 4,
"ram": 2144
},
"fftUsed": [
64
],
"includeEmptyLabels": false,
"inputShape": [
375
],
"includedSamplesAreInOrder": true
}
}
],
"learnBlocks": [
{
"id": 7,
"type": "keras",
"name": "NN Classifier",
"dsp": [
24
],
"title": "Neural Network (Keras)",
"primaryVersion": true,
"db": false
},
{
"id": 30,
"type": "anomaly",
"name": "Anomaly detection",
"dsp": [
24
],
"title": "Anomaly Detection (K-means)",
"primaryVersion": true,
"createdBy": "createImpulse",
"createdAt": "2023-08-29T01:40:50.747Z",
"db": false
}
]
},
"project": {
"name": "Tutorial: Continuous motion recognition",
"id": 276194,
"owner": "Edge Impulse Docs",
"studioHost": "studio.edgeimpulse.com"
},
"sensor": "accelerometer",
"tfliteModels": [
{
"arenaSize": 2982,
"inputFrameSize": 33,
"inputTensor": "dense_input",
"outputTensor": "y_pred/Softmax:0",
"details": {
"modelType": "int8",
"inputs": [
{
"dataType": "int8",
"name": "serving_default_x:0",
"shape": [
1,
33
],
"quantizationScale": 0.10049157589673996,
"quantizationZeroPoint": -70
}
],
"outputs": [
{
"dataType": "int8",
"name": "StatefulPartitionedCall:0",
"shape": [
1,
4
],
"quantizationScale": 0.00390625,
"quantizationZeroPoint": -128
}
]
},
"modelPath": "/home/input/trained.tflite"
}
]
}
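A deployment block typically reads this file from its working directory, collects the files in folders.input, and writes its output to folders.output. A hedged sketch (the packaging step is illustrative; what you build from the input folder is up to your block):

import json
import os
import shutil

with open('deployment-metadata.json', 'r') as f:
    metadata = json.load(f)

input_dir = metadata['folders']['input']    # contains 'edge-impulse-sdk', 'model-parameters', 'tflite-model'
output_dir = metadata['folders']['output']  # write your output file here

print(f'Deploying "{metadata["project"]["name"]}", sensor: {metadata["sensor"]}')
for model in metadata['tfliteModels']:
    print('Model:', model['modelPath'], '-', model['details']['modelType'])

# Illustrative packaging step: bundle the input folder into a zip in the output folder
shutil.make_archive(os.path.join(output_dir, 'deployment'), 'zip', input_dir)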
The parameters.json file is included at the root of the directory of a custom block. It describes the block itself and identifies the parameters available for its configuration. The parameters defined in this file are the input options rendered for the block in Studio, and they are passed into the block as arguments when it is run.
The file can be considered in two sections: a header section and a parameters section. The header section identifies the block type and its associated metadata; the metadata required varies by block type. This information is followed by an array of parameter items.
Custom parameters are not available for deployment blocks
type AIActionBlockParametersJson = {
version: 1,
type: 'ai-action',
info: {
name: string,
description: string,
requiredEnvVariables: string[] | undefined;
operatesOn: ['images_object_detection' | 'images_single_label' | 'audio' | 'other'] | undefined;
},
parameters: DSPParameterItem[];
};
type DeployBlockParametersJson = {
version: 1,
type: 'deploy',
info: {
name: string,
description: string,
category?: 'library' | 'firmware';
integrateUrl?: string,
cliArguments: string,
supportsEonCompiler: boolean,
mountLearnBlock: boolean,
showOptimizations: boolean,
privileged?: boolean,
},
};
type MachineLearningBlockParametersJson = {
version: 1,
type: 'machine-learning',
info: {
name: string,
description: string,
operatesOn?: 'object_detection' | 'audio' | 'image' | 'regression' | 'other';
objectDetectionLastLayer?: 'mobilenet-ssd' | 'fomo' | 'yolov2-akida' | 'yolov5' | 'yolov5v5-drpai' | 'yolox' | 'yolov7' | 'tao-retinanet' | 'tao-ssd' | 'tao-yolov3' | 'tao-yolov4';
imageInputScaling?: '0..1' | '-1..1' | '-128..127' | '0..255' | 'torch' | 'bgr-subtract-imagenet-mean';
indRequiresGpu?: boolean,
repositoryUrl?: string,
customModelVariants?: {
'key': string,
'name': string,
'inferencingEntrypoint': string,
'profilingEntrypoint'?: string,
'modelFiles'?: {
'id': string,
'name': string,
'type': 'binary' | 'json' | 'text';
'description': string,
}[],
}[],
displayCategory?: 'classical' | 'tao';
},
parameters: DSPParameterItem[];
};
type DSPBlockParametersJson = {
version: 1,
type: 'dsp',
info: {
type: string,
title: string,
author: string,
description: string,
name: string,
preferConvolution: boolean,
convolutionColumns?: 'axes' | string;
convolutionKernelSize?: number,
cppType: string,
visualization: 'dimensionalityReduction' | undefined;
experimental: boolean,
hasTfliteImplementation: boolean,
latestImplementationVersion: number,
hasImplementationVersion: boolean,
hasFeatureImportance: boolean,
hasAutoTune?: boolean,
minimumVersionForAutotune?: number,
usesState?: boolean,
axes: {
name: string,
description: string,
optional?: boolean,
}[] | undefined;
port?: number,
}
parameters: {
group: string,
items: DSPParameterItem[];
}[],
};
type SyntheticDataBlockParametersJson = {
version: 1,
type: 'synthetic-data',
info: {
name: string,
description: string,
requiredEnvVariables: string[] | undefined;
},
parameters: DSPParameterItem[];
};
type TransformBlockParametersJson = {
version: 1,
type: 'transform',
info: {
name: string,
description: string,
operatesOn: 'file' | 'directory' | 'standalone' | undefined;
transformMountpoints: {
bucketId: number,
mountPoint: string,
}[] | undefined;
indMetadata: boolean | undefined;
cliArguments: string | undefined;
allowExtraCliArguments: boolean | undefined;
showInDataSources: boolean | undefined;
showInCreateTransformationJob: boolean | undefined;
requiredEnvVariables: string[] | undefined;
},
parameters: DSPParameterItem[];
};
type DSPParameterItem = {
// Rendered as the label
name: string,
// Default value
value: string | number | boolean;
// Type of UI element to render
type: 'string' | 'int' | 'float' | 'select' | 'boolean' | 'bucket' | 'dataset' | 'flag' | 'secret';
// Optional help text (rendered as a help icon, text is shown on hover)
help?: string,
// Parameter that maps back to your block (no spaces allowed)
param: string,
// When type is "select" lists all options for the dropdown menu
// you can either pass in an array of strings, or a list of objects
// (if you want to customize the label)
valid?: (string | { label: string, value: string })[];
// If this is set, the field is rendered as readonly with the text "Click to set"
// when clicked, the UI changes to a normal text box.
optional?: boolean,
// Whether the field should be rendered as readonly.
// These fields are shown, but cannot be changed.
readonly?: boolean,
// If set, this item is only shown if the implementation version of the block matches
// (only for processing blocks)
showForImplementationVersion: number[] | undefined;
// Show/hide the item depending on another parameter
showIf: ({
parameter: string,
operator: 'eq' | 'neq',
value: string,
}) | undefined;
// Processing blocks only. If set, a macro is created like:
// #define EI_DSP_PARAMS_BLOCKCPPTYPE_PARAM VALUE
createMacro?: boolean,
// When type is "select" the value passed into your block will be a string,
// you can use configType to override the type (used during deployment only)
configType?: string,
// (Optional) UX section to show parameter in.
section?: 'advanced' | 'modelProfiling';
// Only valid for type "string". If set to true, renders a multi-line text area.
multiline?: boolean,
// If set, shows a hint about the input format below the input. Use this
// sparingly, as it clutters the UI.
hint?: string,
// Sets the placeholder text on the input element (for types "string", "int", "float" and "secret")
placeholder?: string,
};
Below you will find full examples of parameter files for the various types of blocks.
{
"version": 1,
"type": "ai-action",
"info": {
"name": "Bounding box labeling with OWL-ViT",
"description": "Zero-shot object detector to automatically label objects using bounding boxes with OWL-ViT. To detect more complex objects you can combine this block with 'Bounding box re-labeling with GPT-4o'. First, roughly find objects using this block, then re-label (or remove) bounding boxes using the GPT4o block.",
"requiredEnvVariables": [
"BEAM_ENDPOINT",
"BEAM_ACCESS_KEY"
],
"operatesOn": [
"images_object_detection"
]
},
"parameters": [
{
"name": "Prompt",
"value": "A person (person, 0.2)",
"type": "string",
"help": "A prompt specifying the images to label. Separate multiple objects with a newline. You can specify the label and the min. confidence rating in the parenthesis.",
"param": "prompt",
"multiline": true,
"placeholder": "A prompt specifying the images to label. Separate multiple objects with a newline. You can specify the label and the min. confidence rating in the parenthesis.",
"hint": "Separate multiple objects with a newline. You can specify the label and the min. confidence rating in the parenthesis (e.g. 'A person (person, 0.2)')."
},
{
"name": "Delete existing bounding boxes",
"value": "no",
"type": "select",
"valid": [
{ "label": "No", "value": "no" },
{ "label": "Only if they match any labels in the prompt", "value": "matching-prompt" },
{ "label": "Yes", "value": "yes" }
],
"param": "delete_existing_bounding_boxes"
},
{
"name": "Ignore objects smaller than (%)",
"optional": true,
"value": 0,
"type": "float",
"param": "ignore-objects-smaller-than",
"help": "Any objects where the area is smaller than X% of the whole image will be ignored"
},
{
"name": "Ignore objects larger than (%)",
"optional": true,
"value": 100,
"type": "float",
"param": "ignore-objects-larger-than",
"help": "Any objects where the area is larger than X% of the whole image will be ignored"
},
{
"name": "Non-max suppression",
"help": "Deduplicate boxes via non-max suppression (NMS)",
"value": true,
"type": "flag",
"param": "nms"
},
{
"name": "NMS IoU threshold",
"help": "Threshold for non-max suppression",
"value": 0.2,
"type": "float",
"param": "nms-iou-threshold",
"showIf": {
"parameter": "nms",
"operator": "eq",
"value": "true"
}
}
]
}
{
"version": 1,
"type": "deploy",
"info": {
"name": "Build Linux app",
"description": "An example custom deployment block to build a standalone Linux application",
"category": "firmware",
"mountLearnBlock": false,
"supportsEonCompiler": true,
"showOptimizations": true,
"cliArguments": "",
"privileged": false,
"integrateUrl": "https://docs.edgeimpulse.com/docs"
}
}
{
"version": 1,
"type": "machine-learning",
"info": {
"name": "Keras multi-layer perceptron",
"description": "Demonstration of a simple Keras custom learn block with CUDA drivers that can run on both CPU and GPU.",
"indRequiresGpu": false,
"operatesOn": "other",
"repositoryUrl": "https://github.com/edgeimpulse/example-custom-ml-block-keras"
},
"parameters": [
{
"name": "Number of training cycles",
"value": 30,
"type": "int",
"help": "Number of epochs to train the neural network on.",
"param": "epochs"
},
{
"name": "Learning rate",
"value": 0.001,
"type": "float",
"help": "How fast the neural network learns, if the network overfits quickly, then lower the learning rate.",
"param": "learning-rate"
}
]
}
{
"version": 1,
"info": {
"title": "Custom processing block example",
"author": "Test User",
"description": "An example of a custom processing block.",
"name": "Custom block",
"cppType": "custom_block",
"preferConvolution": false,
"visualization": "dimensionalityReduction",
"experimental": false,
"latestImplementationVersion": 1
},
"parameters": [
{
"group": "Scaling",
"items": [
{
"name": "Scale axes",
"value": 1,
"type": "float",
"help": "Multiplies axes by this number",
"param": "scale-axes"
}
]
}
]
}
{
"version": 1,
"type": "synthetic-data",
"info": {
"name": "Whisper voice synthesis",
"description": "An example synthetic data block that uses Whisper to generate audio keyword data."
},
"parameters": [
{
"name": "OpenAI API Key",
"value": "",
"type": "secret",
"help": "An API Key that gives access to OpenAI",
"param": "OPENAI_API_KEY"
},
{
"name": "Phrase",
"value": "Edge Impulse",
"type": "string",
"help": "Phrase for which to generate voice samples",
"param": "phrase"
},
{
"name": "Label",
"value": "edge_impulse",
"type": "string",
"help": "Samples will be added to Edge Impulse with this label",
"param": "label"
},
{
"name": "Number of samples",
"value": 3,
"type": "int",
"help": "Number of unique samples to generate",
"param": "samples"
},
{
"name": "Voice",
"value": "random",
"type": "select",
"valid": [ "random", "alloy", "echo", "fable", "onyx", "nova", "shimmer" ],
"help": "Voice to use for speech generation",
"param": "voice",
"optional": true
},
{
"name": "Model",
"value": "tts-1",
"type": "select",
"valid": [ "tts-1", "tts-1-hd" ],
"help": "Model to use for speech generation",
"param": "model",
"optional": true
},
{
"name": "Speed",
"value": "0.6, 0.7, 0.8, 0.9, 1, 1.1, 1.2",
"type": "string",
"help": "A list of possible speed of the generated audio. Select values between '0.25' and '4.0'. A random one will be picked for each sample.",
"param": "speed"
},
{
"name": "Minimum length (seconds)",
"value": 1,
"type": "float",
"help": "Minimum length of generated audio samples. Audio samples will be padded with silence to minimum length",
"param": "min-length"
},
{
"name": "Upload to category",
"value": "split",
"type": "select",
"valid": [
{ "label": "Split 80/20 between training and testing", "value": "split" },
{ "label": "Training", "value": "training" },
{ "label": "Testing", "value": "testing" }
],
"help": "Data will be uploaded to this category in your project",
"param": "upload-category"
}
]
}
{
"version": 1,
"type": "transform",
"info": {
"name": "Mix background noise",
"description": "An example transformation block that mixes background noise into audio samples.",
"operatesOn": "file",
"transformMountpoints": [
{
"bucketId": 5532,
"mountPoint": "/mnt/s3fs/edge-impulse-demo-bucket"
}
]
},
"parameters": [
{
"name": "Number of files to create",
"type": "int",
"value": 10,
"help": "How many new files to create per input file. Noise is randomly mixed in per file.",
"param": "out-count"
},
{
"name": "Frequency",
"value": 16000,
"type": "int",
"param": "frequency",
"help": "Output frequency of the WAV files"
}
]
}
Parameter items are defined as JSON objects that contain a type property. For example:
{
"name": "Scale axes",
"value": 1.0,
"type": "float",
"help": "Multiplies axes by this number.",
"param": "scale-axes"
}
The parameter type options available are shown in the table below, along with how the parameter is rendered in Studio and how it will be passed to your custom block. In general, parameter items are passed as command line arguments to your custom block script.
Type      Rendered in Studio   Passed to block as
boolean   Checkbox             --<param-name> 1 (true) | --<param-name> 0 (false)
bucket    Dropdown             --<param-name> "<bucket-name>"
dataset   Dropdown             --<param-name> "<dataset-name>"
flag      Checkbox             --<param-name> (true) | omitted (false)
float     Text box             --<param-name> <value>
int       Text box             --<param-name> <value>
secret    Text box             <param-name> (environment variable)
select    Dropdown             --<param-name> <value>
string    Text box             --<param-name> "<value>"
Processing blocks do not receive command line arguments
Instead of command line arguments, processing blocks receive an HTTP request with the parameters in the request body; these parameters are subsequently passed to the function that generates features in your processing block. In this case, dashes in parameter names are replaced with underscores before being passed to your function as arguments: a processing block parameter named custom-processing-param is passed to your feature generation function as custom_processing_param.
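For example, a sketch of a feature generation function (the signature loosely follows Edge Impulse's Python processing block examples, but the exact arguments depend on your block's parameters):

import numpy as np

def generate_features(implementation_version, draw_graphs, raw_data, axes,
                      sampling_freq, scale_axes, custom_processing_param):
    # 'scale-axes' and 'custom-processing-param' from parameters.json arrive
    # here as scale_axes and custom_processing_param (dashes -> underscores)
    features = np.asarray(raw_data) * scale_axes
    return {
        'features': features.tolist(),
        'graphs': [],
        'output_config': {'type': 'flat', 'shape': {'width': len(features)}},
    }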
Secrets are passed as environment variables instead of command line arguments
{
"name": "Boolean example",
"value": true,
"type": "boolean",
"help": "An example boolean parameter type to show how it is rendered.",
"param": "do-boolean-action"
}
--do-boolean-action 1
Only available for AI labeling, synthetic data, and transformation blocks
{
"name": "Bucket example",
"value": "",
"type": "bucket",
"help": "An example bucket parameter type to show how it is rendered.",
"param": "bucket-example-param"
}
--bucket-example-param "edge-impulse-customers-demo-team"
Only available for AI labeling, synthetic data, and transformation blocks
{
"name": "Dataset example",
"value": "",
"type": "dataset",
"help": "An example flag parameter type to show how it is rendered.",
"param": "dataset-example-param"
}
--dataset-example-param "Gestures"
{
"name": "Flag example",
"value": true,
"type": "flag",
"help": "An example flag parameter type to show how it is rendered.",
"param": "do-flag-action"
}
--do-flag-action
{
"name": "Float example",
"value": 0.1,
"type": "float",
"help": "An example float parameter type to show how it is rendered.",
"param": "float-example-param"
}
--float-example-param 0.1
{
"name": "Int example",
"value": 1,
"type": "int",
"help": "An example int parameter type to show how it is rendered.",
"param": "int-example-param"
}
--int-example-param 1
Only available for AI labeling, synthetic data, and transformation blocks
{
"name": "Secret example",
"value": "",
"type": "secret",
"help": "An example secret parameter type to show how it is rendered.",
"param": "SECRET_EXAMPLE_PARAM"
}
SECRET_EXAMPLE_PARAM
{
"name": "Select example 1",
"value": "1",
"type": "select",
"help": "An example select parameter type to show how it is rendered.",
"param": "select-example-param-1",
"valid": [ "1", "3", "10", "30", "100","1000" ]
}
--select-example-param-1 "1"
{
"name": "Select example 2",
"value": "1",
"type": "select",
"help": "An example select parameter type to show how it is rendered.",
"param": "select-example-param-2",
"valid": [
{ "label": "One", "value": "1" },
{ "label": "Three", "value": "3" },
{ "label": "Ten", "value": "10" },
{ "label": "Thirty", "value": "30" },
{ "label": "One hundred", "value": "100" },
{ "label": "One thousand", "value": "1000"}
]
}
--select-example-param-2 "1"
{
"name": "String example",
"value": "An example string",
"type": "string",
"help": "An example string parameter type to show how it is rendered.",
"param": "string-example-param"
}
--string-example-param "An example string"
Only available for processing blocks
Processing block parameters can contain multiple groups to better organize the options when rendered in Studio. Each string entered as the value for the group property is rendered as a header element.
"parameters": [
{
"group": "Example parameter group 1",
"items": [
{
"name": "Boolean example",
"value": false,
"type": "boolean",
"help": "An example boolean parameter type to show how it is rendered.",
"param": "do-boolean-action"
},
{
"name": "Flag example",
"value": false,
"type": "flag",
"help": "An example flag parameter type to show how it is rendered.",
"param": "do-flag-action"
}
]
},
{
"group": "Example parameter group 2",
"items": [
{
"name": "Float example",
"value": 1.0,
"type": "float",
"help": "An example float parameter type to show how it is rendered.",
"param": "float-example-param"
}
]
}
]
Parameters can be conditionally shown based on the value of another parameter using the showIf property.
{
"name": "Boolean example",
"value": false,
"type": "boolean",
"help": "An example boolean parameter type to show how it is rendered.",
"param": "do-boolean-action"
},
{
"name": "Int example",
"value": 1,
"type": "int",
"help": "An example int parameter type to show how it is rendered.",
"param": "int-example-param",
"showIf": {
"parameter": "do-boolean-action",
"operator": "eq",
"value": "true"
}
},
{
"name": "Float example",
"value": 1.0,
"type": "float",
"help": "An example float parameter type to show how it is rendered.",
"param": "float-example-param"
}
Only available for processing blocks
Processing blocks can have different versions, which allows you to add new functionality to existing blocks without breaking earlier implementations. You can show or hide parameters based on the implementation version set in the latestImplementationVersion property of the processing block.
A processing block set to version 4:
"info": {
"title": "Spectral Analysis",
...
"latestImplementationVersion": 4
}
A parameter shown only for implementation versions 3 and 4:
{
"name": "Type",
"value": "FFT",
"help": "Type of spectral analysis to apply",
"type": "select",
"valid": [ "FFT", "Wavelet" ],
"param": "analysis-type",
"showForImplementationVersion": [ 3, 4 ]
}
The sample_id_details.json file lives alongside the training (train) and validation (test) dataset splits that are used for training a learning block. The information contained in this file may be relevant to you when developing custom learning blocks.

The file contains a list of sample IDs, as integers, in row order for each dataset split. It allows you to map the processed features that are passed to your learning block back to the original samples they came from. Note that the same sample ID may appear several times: sample IDs are repeated when more than one window is created for a sample. The directory layout, specification, and an example are shown below.
data/
├── X_split_test.npy
├── X_split_train.npy
├── Y_split_test.npy
├── Y_split_train.npy
└── sample_id_details.json
{
"train": [
<id>,
<id>,
...
<id>
],
"validation": [
<id>,
<id>,
...
<id>
]
}
{
"train": [
1440653288,
1440653283,
...
1440653252
],
"validation": [
1440653307,
1440653297,
...
1440653285
]
}
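A sketch of how you might use this file to trace a row of the processed training features back to its source sample (file names match the directory listing above):

import json
import numpy as np

X_train = np.load('X_split_train.npy')
with open('sample_id_details.json', 'r') as f:
    sample_ids = json.load(f)

# Row i of X_split_train.npy was computed from sample_ids['train'][i]
assert X_train.shape[0] == len(sample_ids['train'])
row = 0
print('Row', row, 'comes from sample', sample_ids['train'][row])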
The ei-metadata.json file can be used to update the metadata of a clinical dataset data item after it has been processed by a transformation block. The action property determines how the new metadata is applied:

add: the new metadata key-value pairs will be added to the existing metadata.

replace: the existing metadata will be deleted and the new metadata key-value pairs will be added.

replace-checks: the existing metadata keys that begin with ei_check will be deleted and the new metadata key-value pairs will be added.

The specification is shown below, followed by an example.

interface TransformBlockMetadataFile {
    version: 1;
    action: 'replace' | 'add' | 'replace-checks';
    metadata: { [ k: string ]: string };
}
{
    "version": 1,
    "action": "add",
    "metadata": {
        "now": "1734721115539"
    }
}
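A transformation block can emit this file with a few lines of Python (a sketch; where the file must be written depends on how your block operates, here assumed to be the current working directory):

import json
import time

metadata = {
    'version': 1,
    'action': 'add',
    'metadata': {
        # metadata values are strings, per the specification above
        'processed_at': str(int(time.time() * 1000)),
    },
}

with open('ei-metadata.json', 'w') as f:
    json.dump(metadata, f, indent=4)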