On your Zephyr-based Nordic Semiconductor development board


Impulses can be deployed as a C++ library. This packages your signal processing blocks, configuration, and learning blocks into a single library that you can include in your own application to run the impulse locally. In this tutorial you'll export an impulse and build a Zephyr RTOS application for the nRF52840 DK, nRF5340 DK, nRF9160 DK, or Thingy:91 development board to classify sensor data.

A working Zephyr RTOS build environment is required

This tutorial assumes that you're already familiar with building applications for the nRF52840 DK or another Zephyr RTOS supported board, and that you have your environment set up to compile applications for this platform. For this tutorial, you can use nRF Connect SDK v1.6.0 or higher.

Prerequisites

Make sure you followed the Continuous motion recognition tutorial and have a trained impulse. Also, make sure you have a working Zephyr build environment, including the following tools:

  • Either the nRF Connect SDK, which includes Zephyr and all its dependencies (v1.6.0 or higher), or a manual installation of the Zephyr build environment.

  • The GNU ARM Embedded Toolchain (version 9-2019-q4-major).

  • Optional: the nRF command line tools and Segger J-Link tools. These command line tools are required if you use the west command line interface to upload firmware to your target board.

Cloning the base repository

We created an example repository which contains a small application that complements the Continuous motion recognition tutorial. This application takes raw, hard-coded inputs as an argument and prints the final classification to the serial port, so it can be read from your development computer. You can either download the application or import the repository using Git:

git clone https://github.com/edgeimpulse/example-standalone-inferencing-zephyr.git

Fully featured open source repos are also available

If you are looking for sample projects showcasing all sensors and features supported by Edge Impulse out of the box, we have public firmware repos available for the Nordic Semiconductor nRF52840, nRF5340 and nRF9160 development kits, as well as for the Thingy:91 and Thingy:53. See edgeimpulse/firmware-nordic-nrf52840dk-nrf5340dk, edgeimpulse/firmware-nordic-nrf9160dk, edgeimpulse/firmware-nordic-thingy91 and edgeimpulse/firmware-nordic-thingy53.

Deploying your impulse

Head over to your Edge Impulse project, and go to the Deployment page. From here you can obtain a packaged library containing the Edge Impulse C++ SDK, your impulse, and all required external dependencies. Select C++ library and click Build to create the library.

Download the .zip file and extract the contents. Now copy the following directories into the 'example-standalone-inferencing-zephyr' folder (which you cloned or downloaded above):

  • edge-impulse-sdk

  • model-parameters

  • tflite-model

Your final folder structure should look like this:

 example-standalone-inferencing-zephyr
 ├── CMakeLists.txt
 ├── edge-impulse-sdk
 ├── model-parameters
 ├── prj.conf
 ├── README.md
 ├── sample.yaml
 ├── src
 ├── tflite-model
 └── utils

Running the impulse

With the project ready, it's time to verify that the application works. Head back to the studio and click on Live classification in the project you created for the Continuous motion recognition tutorial, then load a testing sample and click on a row under 'Detailed result'.

[Image: Selecting the row with timestamp '320' under 'Detailed result'.]

To verify that the Zephyr application performs the same classification when running locally on your board, we need to use the same raw inputs as those provided to Live classification for this timestamp. To do so, click the 'Copy to clipboard' button next to 'Raw features'. This copies the raw values from this validation file, before any signal processing or inferencing happened.

[Image: Copying the raw features.]

Next, open src/main.cpp in the example directory and paste the raw features inside the static const float features[] definition. For example:

static const float features[] = {
    -19.8800, -0.6900, 8.2300, -17.6600, -1.1300, 5.9700, ...
};
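
You don't need to change anything else: the example's src/main.cpp already feeds this buffer into the classifier. If you're curious how that works, the relevant logic looks roughly like the minimal sketch below; signal_from_buffer, run_classifier, ei_printf and the EI_CLASSIFIER_* constants come from the Edge Impulse C++ SDK and the generated model-parameters, but check src/main.cpp in the repository for the exact code.

#include "edge-impulse-sdk/classifier/ei_run_classifier.h"

static const float features[] = { /* your raw features here */ };

int main() {
    // Wrap the raw feature buffer in a signal_t so the SDK can read from it
    signal_t signal;
    numpy::signal_from_buffer((float *)features,
                              EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE, &signal);

    // Run the DSP pipeline and the classifier over the buffer
    ei_impulse_result_t result = { 0 };
    if (run_classifier(&signal, &result, false) != EI_IMPULSE_OK) {
        return 1;
    }

    // Print one prediction per label, plus the anomaly score if enabled
    for (size_t ix = 0; ix < EI_CLASSIFIER_LABEL_COUNT; ix++) {
        ei_printf("%s: %.5f\n", result.classification[ix].label,
                  result.classification[ix].value);
    }
#if EI_CLASSIFIER_HAS_ANOMALY == 1
    ei_printf("Anomaly score: %.3f\n", result.anomaly);
#endif
    return 0;
}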

Then use west, or your usual method, to build the application:

# nRF52840 DK
$ west build -b nrf52840dk_nrf52840

# nRF5340 DK
$ west build -b nrf5340dk_nrf5340_cpuapp

# nRF9160DK
$ west build -b nrf9160dk_nrf9160ns

# Thingy:91
$ west build -b thingy91_nrf9160

Invalid choice: 'build'

If you try to build the application but it throws an 'invalid choice' error like:

$ west build -b nrf52840dk_nrf52840
usage: west [-h] [-z ZEPHYR_BASE] [-v] [-V] <command> ...
west: error: argument <command>: invalid choice: 'build' (choose from 'init', 'update', 'list', 'manifest', 'diff', 'status', 'forall', 'help', 'config', 'topdir', 'selfupdate')

you'll need to set up your environment variables correctly (more info). You can do so by opening a command prompt or terminal window and running the commands below from the zephyr parent directory:

On Windows

$ zephyr\zephyr-env.cmd

On macOS / Linux

$ source zephyr/zephyr-env.sh

If you have set up the Segger J-Link tools and your board comes with an on-board J-Link debug probe, you can also flash this application with:

$ west flash

Otherwise, if your board shows up as a mass-storage device, you can drag the build/zephyr/zephyr.bin file onto the JLINK USB mass-storage device, the same way you would with a USB flash drive.

Boards such as the Thingy:91 and Thingy:53 do not come with a built-in J-Link debug probe and cannot be used with west flash directly. They do, however, include a connector for an external J-Link debug probe, which lets you take advantage of the west flash command. For the nRF9160 DK, you also have to make sure the board controller has been flashed at least once.

Seeing the output

To see the output of the impulse, connect to the development board over a serial port at baud rate 115,200 and reset the board. You can do this with your favourite serial monitor or with the Edge Impulse CLI:

$ edge-impulse-run-impulse --raw

This will show you the output of the signal processing pipeline and the results of the classification:

Edge Impulse standalone inferencing (Zephyr)
Running neural network...
Predictions (time: 1 ms.):
idle:   0.015319
snake:  0.000444
updown: 0.006182
wave:   0.978056
Anomaly score (time: 0 ms.): 0.133557
Predictions (DSP: 18 ms., Classification: 1 ms., Anomaly: 0 ms.): 
[0.01532, 0.00044, 0.00618, 0.97806, 0.134]

The output should match the values you just saw in the studio. If it does, you now have your impulse running on your Zephyr development board!

Connecting live sensors?

Now that you have verified that the impulse works with hard-coded inputs, you should be ready to plug live sensor data from your board into the classifier. A demonstration of how to plug sensor values into the classifier can be found here: Data forwarder - classifying data (Zephyr).
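
As a rough illustration of what that involves, the hard-coded buffer is replaced by samples collected from the sensor at the impulse's sampling interval. The sketch below assumes a Zephyr application; read_accelerometer() is a hypothetical stand-in for your board's sensor driver, while the SDK calls and EI_CLASSIFIER_* constants are the same as in the hard-coded example above.

#include <zephyr.h>  // <zephyr/kernel.h> on newer Zephyr versions
#include "edge-impulse-sdk/classifier/ei_run_classifier.h"

// Hypothetical driver call: fills x/y/z with one accelerometer reading.
extern void read_accelerometer(float *x, float *y, float *z);

static float features[EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE];

void classify_live_window(void) {
    // Fill one full window (three axes per reading) at the impulse's interval
    for (size_t ix = 0; ix < EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE; ix += 3) {
        read_accelerometer(&features[ix], &features[ix + 1], &features[ix + 2]);
        k_sleep(K_MSEC((int)EI_CLASSIFIER_INTERVAL_MS));
    }

    // Classify the window exactly as with the hard-coded features
    signal_t signal;
    numpy::signal_from_buffer(features, EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE, &signal);

    ei_impulse_result_t result = { 0 };
    if (run_classifier(&signal, &result, false) == EI_IMPULSE_OK) {
        for (size_t ix = 0; ix < EI_CLASSIFIER_LABEL_COUNT; ix++) {
            ei_printf("%s: %.5f\n", result.classification[ix].label,
                      result.classification[ix].value);
        }
    }
}

Note that this naive loop alternates sampling and inference, so samples arriving during classification are dropped; the Continuous audio sampling advanced tutorial covers double-buffered approaches if you need to do both at the same time.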
