On your Mbed-enabled development board

Impulses can be deployed as a C++ library. This bundles all your signal processing blocks, configuration and learning blocks into a single package. You can include this package in your own application to run the impulse locally. In this tutorial you'll export an impulse, and build an Mbed OS application to classify sensor data.

Knowledge required

This tutorial assumes that you're familiar with Mbed OS, and have installed Mbed CLI. If you're unfamiliar with these tools you can build binaries directly for your development board from the Deployment page in the studio.

Note: Are you looking for an example that has all sensors included? The Edge Impulse firmware for the ST IoT Discovery Kit has that. See edgeimpulse/firmware-st-b-l475e-iot01a.

Prerequisites

Make sure you followed the Continuous motion recognition tutorial, and have a trained impulse. Also make sure Mbed CLI is installed, as noted under 'Knowledge required' above.

Cloning the base repository

We created an example repository which contains a small Mbed OS application that takes the raw features as input and prints out the final classification. Import this repository using Mbed CLI:

$ mbed import https://github.com/edgeimpulse/example-standalone-inferencing-mbed

Deploying your impulse

Head over to your Edge Impulse project, and go to Deployment. From here you can create the full library which contains the impulse and all required external libraries. Select C++ library and click Build to create the library.

Download the .zip file and place its contents in the 'example-standalone-inferencing-mbed' folder (which you imported above). Your final folder structure should look like this:

example-standalone-inferencing-mbed
|_ Makefile
|_ README.md
|_ build.sh
|_ edge-impulse-sdk
|_ model-parameters
|_ source
|_ tflite-model
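
If you prefer the command line, you can extract the archive straight into the project folder. A minimal sketch, assuming the studio produced a file called your-project-cpp-library.zip (the actual file name depends on your project):

$ unzip -q your-project-cpp-library.zip -d example-standalone-inferencing-mbed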

Running the impulse

With the project ready, it's time to verify that the application works. Head back to the studio and click on Live classification. Then load a validation sample, and click on a row under 'Detailed result'.

To verify that the local application produces the same classification, we need the raw features for this timestamp. To do so, click the 'Copy to clipboard' button next to 'Raw features'. This copies the raw values from this validation file, before any signal processing or inferencing has happened.

Open source/main.cpp and paste the raw features inside the static const float features[] definition, for example:

static const float features[] = {
    -19.8800, -0.6900, 8.2300, -17.6600, -1.1300, 5.9700, ...
};
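
The example application already contains the code that feeds this array into the classifier, so you don't need to write it yourself. For reference, here is a minimal sketch of how the Edge Impulse C++ SDK is typically invoked (the names raw_feature_get_data and run_inference are illustrative; run_classifier, signal_t and the EI_CLASSIFIER_* constants come from the generated library):

#include "edge-impulse-sdk/classifier/ei_run_classifier.h"
#include <cstring>
#include <cstdio>

// Callback that copies a slice of the features[] array (defined above) into the classifier's buffer
static int raw_feature_get_data(size_t offset, size_t length, float *out_ptr) {
    memcpy(out_ptr, features + offset, length * sizeof(float));
    return 0;
}

static int run_inference() {
    // Wrap the features array in a signal_t so the DSP blocks can stream it
    signal_t signal;
    signal.total_length = sizeof(features) / sizeof(features[0]);
    signal.get_data = &raw_feature_get_data;

    // Run the signal processing blocks and the learning blocks
    ei_impulse_result_t result = { 0 };
    EI_IMPULSE_ERROR res = run_classifier(&signal, &result, false);
    if (res != EI_IMPULSE_OK) {
        printf("run_classifier returned: %d\n", res);
        return (int)res;
    }

    // Print one score per label
    for (size_t ix = 0; ix < EI_CLASSIFIER_LABEL_COUNT; ix++) {
        printf("%s: %.5f\n", result.classification[ix].label, result.classification[ix].value);
    }
    return 0;
}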

Then build and flash the application to your development board with Mbed CLI:

$ mbed compile -t GCC_ARM -m auto -f
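
If -m auto cannot detect your board, you can pass the Mbed target name explicitly instead. As a sketch, assuming you are building for the ST B-L475E-IOT01A (Mbed target DISCO_L475VG_IOT01A):

$ mbed compile -t GCC_ARM -m DISCO_L475VG_IOT01A -f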

Seeing the output

To see the output of the impulse, connect to the development board over a serial port at baud rate 115,200 and reset the board (e.g. by pressing the black reset button on the ST B-L475E-IOT01A). You can do this with your favourite serial monitor or with the Edge Impulse CLI:

$ edge-impulse-run-impulse --raw
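
Alternatively, you can use any serial terminal, for example screen on Linux or macOS (the device path below is an assumption and varies per system):

$ screen /dev/ttyACM0 115200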

This will run the signal processing pipeline, and then classify the output:

Edge Impulse standalone inferencing (Mbed)
Running neural network...
Predictions (time: 0 ms.):
idle:   0.015319
snake:  0.000444
updown: 0.006182
wave:   0.978056
Anomaly score (time: 0 ms.): 0.133557
run_classifier_returned: 0
[0.01532, 0.00044, 0.00618, 0.97806, 0.134]

This matches the values we just saw in the studio. You now have your impulse running on your Mbed-enabled development board!

Connecting sensors?

A demonstration on how to plug sensor values into the classifier can be found here: Data forwarder - classifying data (Mbed OS).
