Adding parameters to custom blocks

Custom DSP, learn, and transformation blocks can have custom parameters. You use these to give the user an option to visually configure the block, for example to select the cutoff frequency of a filter in a DSP block, or which data augmentation to apply in an ML algorithm.

Defining parameters

Each parameter is defined as a JSON object. Here's a basic example:

{
    "name": "Scale axes",
    "value": 1,
    "type": "float",
    "help": "Multiplies axes by this number",
    "param": "scale-axes"
}

This renders the following UI element:

Here "name" maps to the label of the element, "help" is shown under a tooltip, "value" is the default value of the element, and "type" is enforced if you enter an invalid value.

"param" is the parameter that'll be sent to your block. Depending on the type of block this is either:

  • All other blocks: Passed in as command line arguments, prefixed with --. E.g. for the example above your block will be called with --scale-axes 1.
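
For blocks that receive their parameters on the command line, a minimal sketch of reading the example parameter above from a Python block script could look like this (the script structure, default value, and print statement are illustrative, not a required layout):

import argparse

# Parse the "scale-axes" parameter defined above; extra arguments passed
# by the platform are ignored via parse_known_args().
parser = argparse.ArgumentParser(description='My custom block')
parser.add_argument('--scale-axes', type=float, default=1.0,
                    help='Multiplies axes by this number')
args, _ = parser.parse_known_args()

print('Scaling axes by', args.scale_axes)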

parameters.json file

Parameters are described in a parameters.json file placed in the root of your block. This is then automatically picked up by the Studio when rendering your block.

Example parameters.json file for DSP blocks

For DSP blocks the parameters.json file also contains metadata for the block, and the parameters are listed under the parameters key. Additionally, DSP blocks can split parameters into multiple groups; each group is rendered with its own header.

{
    "version": 1,
    "type": "dsp",
    "info": {
        "title": "My awesome DSP Block",
        "author": "Jan Jongboom",
        "description": "My first DSP Block.",
        "name": "First DSP Block",
        "cppType": "my_awesome_dsp_block",
        "latestImplementationVersion": 1,
        "visualization": "dimensionalityReduction",
        "preferConvolution": true
    },
    "parameters": [
        {
            "group": "Group 1",
            "items": [
                {
                    "name": "Scale axes",
                    "value": 1,
                    "type": "float",
                    "help": "Multiplies axes by this number",
                    "param": "scale-axes"
                }
            ]
        }
    ]
}

This is rendered as:

Example parameters.json file for Learn blocks

For learn blocks the parameters.json file looks very similar to the DSP block, but does not support parameter groups: the parameters field is just an array.

{
    "version": 1,
    "type": "machine-learning",
    "info": {
        "name": "My custom ML model",
        "description": "Full ML pipeline implemented in PyTorch",
        "operatesOn": "other",
        "indRequiresGpu": false,
        "repositoryUrl": "https://github.com/edgeimpulse/my-test-repo"
    },
    "parameters": [{
        "name": "Learning rate #1",
        "value": 0.01,
        "type": "float",
        "help": "Learning rate for the first 10 epochs",
        "param": "learning-rate-1"
    }, {
        "name": "Learning rate #2",
        "value": 0.001,
        "type": "float",
        "help": "Fine-tuning learning rate",
        "param": "learning-rate-2"
    }]
}

This is rendered as:

And passed into your script as:

--learning-rate-1 0.01 --learning-rate-2 0.001

If a learn block has no parameters (either no parameters.json file or one with an empty array) then the 'Number of training cycles' (maps to --epochs) and 'Learning rate' (maps to --learning-rate) elements are rendered.
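
If you rely on those default elements, your training script can read them like any other command line arguments. A minimal sketch (only the --epochs and --learning-rate argument names come from the text above; the defaults and everything else are illustrative):

import argparse

# Default learn block parameters rendered when no custom parameters exist.
parser = argparse.ArgumentParser(description='Custom learn block')
parser.add_argument('--epochs', type=int, default=30)              # 'Number of training cycles'
parser.add_argument('--learning-rate', type=float, default=0.001)  # 'Learning rate'
args, _ = parser.parse_known_args()

# ...build and train your model with args.epochs and args.learning_rate...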

Example parameters.json file for Transformation blocks

For transformation blocks the parameters.json file looks very similar to the DSP block, but does not support parameter groups: the parameters field is just an array of items:

{
    "version": 1,
    "type": "transform",
    "info": {
        "name": "AMS Activity Sorter",
        "description": "Take data from the internal data lake, and an upload portal and mix them together.",
        "operatesOn": "standalone",
        "transformMountpoints": []
    },
    "parameters": [{
        "name": "Bucket",
        "type": "bucket",
        "param": "bucket-name",
        "value": "",
        "help": "The bucket where you're hosting all data"
    },
    {
        "name": "Bucket prefix",
        "value": "my-test-prefix/",
        "type": "string",
        "param": "bucket-prefix",
        "help": "The prefix in the bucket, where you're hosting the data"
    }]
}

This is rendered as:

And passed into your script as:

--bucket-name "ei-data-dev" --bucket-prefix "my-test-prefix/"

Transformation blocks have two extra types: dataset and bucket. dataset renders a dropdown menu listing all datasets in your organization, and bucket renders a dropdown menu listing all configured cloud buckets. The name of the selected dataset or bucket is passed into your script (e.g. with the configuration above, ei-data-dev is passed in).
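
Note that only the name of the dataset or bucket is passed to your script; how you then access its contents (a mounted path, a cloud SDK, etc.) is up to your block. A small, hypothetical sketch of reading the two arguments shown above:

import argparse

parser = argparse.ArgumentParser(description='Transformation block')
parser.add_argument('--bucket-name', type=str, required=True)
parser.add_argument('--bucket-prefix', type=str, default='')
args, _ = parser.parse_known_args()

# With the configuration above this prints: ei-data-dev my-test-prefix/
print(args.bucket_name, args.bucket_prefix)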

Example parameters.json file for Synthetic data blocks

Synthetic data blocks follow the same pattern as transformation blocks.

All types

The supported properties for the type attribute are:

string, int, float

These all render a text box.

select

This renders a dropdown menu, and needs to be paired with a valid property that lists all options. E.g.:

{
    "name": "Input decimation ratio",
    "value": "1",
    "type": "select",
    "valid": [ "1", "3", "10", "30", "100", "1000" ],
    "param": "input-decimation-ratio"
}

Renders as:

And:

To use a different label and value for the select element you can pass in an array of objects to valid. E.g.:

"valid": [
    { "label": "gpu (NDP101)", "value": "gpu" },
    { "label": "log-bin (NDP120/200)", "value": "log-bin" }
]

Here the UI renders the labels ("gpu (NDP101)" and "log-bin (NDP120/200)"), but the values ("gpu" or "log-bin") are what is actually passed to your block.
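
Keep in mind that select values are always passed to your block as strings (see the full spec below), so convert them yourself if you need a number. A small sketch for a block that receives command line arguments, using the decimation ratio example above:

import argparse

parser = argparse.ArgumentParser()
parser.add_argument('--input-decimation-ratio', type=str, default='1')
args, _ = parser.parse_known_args()

decimation_ratio = int(args.input_decimation_ratio)  # select values arrive as strings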

boolean

This renders a checkbox. E.g.:

{
    "name": "Take log of spectrum?",
    "value": true,
    "type": "boolean",
    "param": "do-log"
}

Renders as:

Boolean values are passed into your transformation and learn blocks with either the value 0 (false) or 1 (true):

--do-log 1

If you don't want a value (so just --do-log passed into the script) then use a flag (see the next section).
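
A small sketch of handling this in a Python block that receives command line arguments (the argument name comes from the example above):

import argparse

parser = argparse.ArgumentParser()
parser.add_argument('--do-log', type=int, default=0)  # passed as 0 or 1
args, _ = parser.parse_known_args()

do_log = bool(args.do_log)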

flag

A flag renders exactly the same as a boolean, but for transformation and learn blocks the parameter is passed to your script differently.

If the value is true then the parameter is just passed in as an argument, e.g.:

--do-log

If the value is false then the parameter is omitted.
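
In a Python script a flag maps naturally onto a store_true argument, as in this sketch (the argument name is taken from the example above):

import argparse

parser = argparse.ArgumentParser()
parser.add_argument('--do-log', action='store_true')  # present => True, omitted => False
args, _ = parser.parse_known_args()

if args.do_log:
    print('Taking log of spectrum')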

dataset (transformation blocks only)

This renders a dropdown menu with all your organizational datasets. E.g.:

{
    "name": "Dataset to refresh",
    "type": "dataset",
    "param": "dataset",
    "value": ""
}

Renders as:

The name of the dataset is passed into your script.

bucket (transformation blocks only)

This renders a dropdown menu with all your organizational buckets. E.g.:

{
    "name": "Bucket",
    "type": "bucket",
    "param": "bucket-name",
    "value": "",
    "help": "The bucket where you're hosting all data"
}

Renders as:

The name of the bucket is passed into your script.

Basic logic for parameters

Parameters can be shown depending on the state of other parameters using the showIf property. E.g.:

[
    {
        "name": "Apply augmentation",
        "value": false,
        "type": "boolean",
        "param": "augmentation"
    },
    {
        "name": "How much augmentation",
        "value": "A little",
        "type": "select",
        "valid": [
            "A little",
            "A lot"
        ],
        "param": "how-much-augmentation",
        "showIf": {
            "parameter": "augmentation",
            "operator": "eq",
            "value": "true"
        }
    }
]

Here the 'How much augmentation' property is shown only when the 'Apply augmentation' checkbox is checked.

Default state:

State when the checkbox is checked:

You can also use neq as the operator to show elements when a clause is not true. Other operators are currently not supported.

Show parameters depending on implementation version (DSP Blocks)

DSP Blocks can have multiple implementation versions. This is used heavily in our internal code to add new functionality to existing blocks without breaking earlier implementations. You set the latest implementation version in the info object of the parameters.json file (new blocks are created with this implementation version). E.g.:

{
    "info": {
        "title": "Spectral Analysis",
        ...
        "latestImplementationVersion": 4
    },
    ...
}

Then to show certain parameters only for certain implementation versions, you use the showForImplementationVersion property:

{
    "name": "Type",
    "value": "FFT",
    "help": "Type of spectral analysis to apply",
    "type": "select",
    "valid": [ "FFT", "Wavelet" ],
    "param": "analysis-type",
    "showForImplementationVersion": [ 3, 4 ]
}

With this configuration the 'Type' parameter is only shown for implementation versions 3 and 4. You can find complete examples of this behavior in the spectral analysis DSP block.

Full spec

type DSPParameterItem = {
    // Rendered as the label
    name: string;
    // Default value
    value: string | number | boolean;
    // Type of UI element to render
    type: 'string' | 'int' | 'float' | 'select' | 'boolean' | 'bucket' | 'dataset' | 'flag';
    // Optional help text (rendered as a help icon, text is shown on hover)
    help?: string;
    // Parameter that maps back to your block (no spaces allowed)
    param: string;
    // When type is "select" lists all options for the dropdown menu
    // you can either pass in an array of strings, or a list of objects
    // (if you want to customize the label)
    valid?: string[] | ({ label: string, value: string }[]);
    // If this is set, the field is rendered as readonly with the text "Click to set";
    // when clicked, the UI changes to a normal text box.
    optional?: boolean;
    // Whether the field should be rendered as readonly.
    // These fields are shown, but cannot be changed.
    readonly?: boolean;
    // If set, this item is only shown if the implementation version of the block matches
    // (only for DSP blocks)
    showForImplementationVersion: number[] | undefined;
    // Show/hide the item depending on another parameter
    showIf: ({
        parameter: string,
        operator: 'eq' | 'neq',
        value: string,
    }) | undefined;
    // DSP only. If set, a macro is created like:
    // #define EI_DSP_PARAMS_BLOCKCPPTYPE_PARAM     VALUE
    createMacro?: boolean;
    // When type is "select" the value passed into your block will be a string,
    // you can use configType to override the type (used during deployment only)
    configType?: string;
};
