
Eta Compute ECM3532

🚧

Eta Compute ECM3532 AI Sensor

We now have full support, including data collection and deployment, for the Eta Compute ECM3532 AI Sensor. If you want to integrate an impulse into your firmware, see Running your impulse on your Eta Compute ECM3532.

The Eta Compute ECM3532 is a system-on-chip designed to support ultra-low power artificial intelligence for the internet of things. It features a 32-bit Arm® Cortex®-M3, NXP CoolFlux 16-bit DSP, 512kB Flash and 256kB + 96kB SRAM. It is built around a proprietary technology called Continuous Voltage Frequency Scaling, which allows the clock rate and voltage to be scaled at runtime to maximize energy efficiency.

The ECM3532 is an ideal platform for use with Edge Impulse. Developers build applications for the ECM3532 using the Eta Compute Tensai SDK, which comes with a sample project that uses Edge Impulse to classify gestures.

👍

Obtaining the device and SDK

To obtain an ECM3532 development board and the Tensai SDK, contact Eta Compute.

The ECM3532 development board with ST X-NUCLEO-IKS01A2 attached.

In this guide, we'll walk through the following:

  • Building and running the Edge Impulse example that is included with the Tensai SDK
  • Training your own Impulse using a pre-existing dataset
  • Deploying your new Impulse to the ECM3532

Running the Edge Impulse example

The Edge Impulse example is designed to classify physical gestures based on 3-axis accelerometer data. It collects four seconds of accelerometer data, runs it through a signal processing pipeline, then feeds the result into two machine learning models.

The first, a deep neural network, determines the probability that the gesture fits into one of four known types. The second, a k-means clustering model, determines whether the data represents an anomaly—a type of gesture that has not been seen before.

The results of these models are combined, allowing the application to determine whether the gesture is a known, valid gesture or something it hasn't seen before. The application outputs the result by printing it to the onboard display and to the UART.
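To make this concrete, the snippet below is a minimal sketch of how one window of data is passed through an impulse using the Edge Impulse C++ SDK. The include path, buffer name, function name and use of ei_printf are assumptions for illustration; the actual example shipped with the Tensai SDK may structure this differently.

#include "edge-impulse-sdk/classifier/ei_run_classifier.h"

// Raw accelerometer readings for one window, interleaved x/y/z.
// Filling this buffer from the LSM6DSL is not shown here.
static float features[EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE];

void classify_window(void) {
    // Wrap the raw buffer in a signal_t so the DSP pipeline can read from it
    signal_t signal;
    numpy::signal_from_buffer(features, EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE, &signal);

    // Run the signal processing pipeline, the neural network and the anomaly detector
    ei_impulse_result_t result = { 0 };
    if (run_classifier(&signal, &result, false) != EI_IMPULSE_OK) {
        return;
    }

    // Probability for each known gesture (idle, snake, updown, wave)
    for (size_t ix = 0; ix < EI_CLASSIFIER_LABEL_COUNT; ix++) {
        ei_printf("%s: %.2f\n", result.classification[ix].label,
                  result.classification[ix].value);
    }

#if EI_CLASSIFIER_HAS_ANOMALY == 1
    // K-means anomaly score: higher means the window looks unlike the training data
    ei_printf("anomaly: %.2f\n", result.anomaly);
#endif
}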

The example uses the ST X-NUCLEO-IKS01A2 expansion board, which provides an LSM6DSL accelerometer. This guide assumes that the board has been attached and its jumpers correctly configured.

Building the example

📘

Building on your platform

The following steps include Unix command line instructions. The Tensai SDK documentation provides full instructions for building on other platforms.

The Edge Impulse example is located at the following path within the Tensai SDK:

TensaiSDK/soc/ecm3532/boards/eta_evb+um2121/examples/m3/edge_impulse

To build the example, cd into that directory and run make:

$ cd soc/ecm3532/boards/eta_evb+um2121/examples/m3/edge_impulse
$ make

Once the example has successfully built, follow the instructions in the ECM3532EVB User Guide to flash the binary to the ECM3532 using Segger J-Flash Lite.

Using the example application

Hit the blue button marked S1 to reset the board and start the program. Next, connect to the board using a serial terminal at 115200 baud, 1 stop bit, no parity. For cross-platform use, we recommend the @serialport/terminal package from NPM.

serialport-terminal -b 115200

The application runs in a loop. A typical output might look like this:

edge_impulse Example - in Flash

Reset Status = 0x1

Hello world, from Eta Compute and Edge Impulse!
M3 frequency = 80.5517 MHz

Capturing 4 seconds of data...
Finished capturing data

Running inferences: 2375ms window, 114ms increase
[ anomaly: 0.02, idle: 0.82, snake: 0.04, updown: 0.05, wave: 0.09 ]
[...more rows of results...]

Totals: [ idle: 14, snake: 0, updown: 0, wave: 0, anomaly: 0, uncertain: 0 ]

Detection result: Idle 🛌

The application captures four seconds of accelerometer data, then moves a sliding window across the data, taking consecutive subsamples. Each subsample is fed into the Edge Impulse classifier, which returns a prediction score for each category (idle, snake, updown, wave, or anomaly). If a category has a prediction score of >70%, it is considered a confident prediction. A score of less than 70% is considered uncertain.

After all of the subsamples have been classified, the predictions for each category are totaled, and the category with the most predictions is declared the winner.
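The voting itself is simple enough to sketch in a few lines of C++. The threshold value, counters and function names below are assumptions for illustration rather than the example's actual source:

#include <cstddef>

static const float CONFIDENCE_THRESHOLD = 0.70f;   // scores above this count as confident
static const size_t NUM_CATEGORIES = 5;            // idle, snake, updown, wave, anomaly
static const char *CATEGORY_NAMES[NUM_CATEGORIES] = { "idle", "snake", "updown", "wave", "anomaly" };

static int totals[NUM_CATEGORIES] = { 0 };
static int uncertain = 0;

// Called once per sliding-window inference with that window's prediction scores
void tally_window(const float scores[NUM_CATEGORIES]) {
    size_t best = 0;
    for (size_t i = 1; i < NUM_CATEGORIES; i++) {
        if (scores[i] > scores[best]) {
            best = i;
        }
    }
    if (scores[best] > CONFIDENCE_THRESHOLD) {
        totals[best]++;   // confident prediction for this category
    } else {
        uncertain++;      // no category reached the threshold
    }
}

// After all windows have been classified, the category with the most votes wins
const char *detection_result(void) {
    size_t winner = 0;
    for (size_t i = 1; i < NUM_CATEGORIES; i++) {
        if (totals[i] > totals[winner]) {
            winner = i;
        }
    }
    return CATEGORY_NAMES[winner];
}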

Trying the gestures

Try performing the following gestures. You can see them animated in Demonstration movements.

  • Idle - sitting on your desk while you're working
  • Snake - moving the device over your desk in a snake-like motion
  • Wave - waving the device from left to right
  • Up/down - moving the device up and down

Training your own Impulse

To train your own impulse that can classify accelerometer data, walk through the following guides:

  1. Follow steps 1 to 6 in the Continuous motion recognition tutorial. We recommend using the pre-made dataset for best results.
  2. Once you get to step 6, select C++ library from the Deployment tab to download the Impulse as a C++ library that is compatible with the Eta Compute Tensai SDK.

The library comes in the form of a .zip file. Extract its contents to a temporary location.

In the Tensai SDK, delete the contents of the following directory:

TensaiSDK/3rd_party/edge_impulse

Replace it with the contents of the .zip you downloaded from Edge Impulse.

Now, to build the project, run make from the following location:

$ cd soc/ecm3532/boards/eta_evb+um2121/examples/m3/edge_impulse
$ make

Flash the binary to your ECM3532 board. The application will now use the impulse you have just trained.

🚧

Troubleshooting impulse performance

If your new impulse is not performing as well as the original, you can try the following:

  • Capture more training data, and make sure you are performing the gestures as shown in Continuous gestures.
  • Play with the window size on the Create impulse page. A value of around 2000ms should work well. A lower value means that more inferences are run for each four second sample, which can help with accuracy, but it's important that the window is long enough to capture the full range of motion of each gesture; see the sketch below.
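To see how those two settings interact, the following sketch works through the arithmetic: the window size and window increase together determine how many inferences run for each four second capture. The values here are assumptions for illustration only.

#include <cstdio>

int main(void) {
    const int sample_ms   = 4000;   // length of each capture
    const int window_ms   = 2000;   // window size from the Create impulse page
    const int increase_ms = 114;    // window increase (how far the window slides each step)

    // Number of complete windows that fit into one capture
    const int windows = (sample_ms - window_ms) / increase_ms + 1;
    printf("%d inferences per capture\n", windows);   // 18 with these values

    return 0;
}

A shorter window fits more times into the capture, so more votes are cast per gesture, but each vote is based on less of the movement.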
