The Ensemble series of fusion processors from Alif Semiconductor utilizes Arm's low-power Cortex-M55 CPUs with dedicated Ethos-U55 microNPUs to run embedded ML workloads quickly and efficiently. The devices feature both 'High Power' cores designed for large model architectures and 'High Efficiency' cores designed for low-power continuous monitoring. The E7 Development Kit and E7 AppKit are both fully supported by Edge Impulse. The Ensemble E7 kits feature multiple core types, dual MEMS microphones, accelerometers, and a MIPI camera interface.
To get started with the Alif Ensemble E7 and Edge Impulse you'll need the following hardware:
To set this device up in Edge Impulse, you will need to install the following software:
The latest Alif Security Toolkit:
Navigate to the Alif Semiconductor Kit documentation page (you will need to register to create an account with Alif, or log in to your existing Alif account) and download the latest App Security Toolkit (tested with version 0.56.0) for Windows or Linux.
Extract the archive, and read through the included Security Toolkit Quick Start Guide to finalize the installation.
(Optional) Docker Desktop:
If you are using macOS, we recommend installing Docker Desktop in order to use the Alif Security Toolkit for programming.
Once you have installed these tools, it's time to connect the development board to Edge Impulse.
To interface with the Alif Ensemble E7 AppKit or Development Kit, you'll need to make sure your hardware is properly configured and connected to your computer. Follow the steps below to prepare your specific kit for connection to Edge Impulse.
After configuring the hardware, the next step is to flash the default Edge Impulse Firmware. This will allow us to collect data directly from your Ensemble device. To update the firmware:
Download the latest Edge Impulse firmware binary and unzip the file.
Navigate to the directory where you installed the Alif Security Toolkit.
Copy the .bin files from the Edge Impulse firmware directory into the build/images directory of the Alif Security Toolkit.
Copy all .json files from the Edge Impulse firmware directory into the build/config directory of the Alif Security Toolkit.
From a command prompt or terminal, run the following commands:
Now, the Ensemble device can connect to the Edge Impulse CLI installed earlier. To test the CLI for the first time, either:
Create a new project from the Edge Impulse project dashboard
OR
Clone an existing Edge Impulse public project, like this Face Detection Demo. Click the link and then press Clone at the top right of the public project.
Then, from a command prompt or terminal on your computer, run:
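The command to run is the Edge Impulse daemon from the CLI installed earlier (a minimal sketch; the wizard output and serial port names vary by OS):

```bash
# Start the Edge Impulse daemon and follow the login/project prompts
edge-impulse-daemon
```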
Device choice
If you have a choice of serial ports and are not sure which one to use, pick /dev/tty.FTDI_USBtoUART or /dev/cu.usbserial-*. You may see two FTDI serial ports enumerated for AppKit devices. If so, select the second entry in the list, which generally is the serial data connection to the Ensemble device.
If you see failures connecting to one serial port, make sure to test other serial connections just in case.
This will start a wizard which will ask you to log in and choose an Edge Impulse project. You should see your new or cloned project listed on the command line. Use the arrow keys and hit Enter to select your project.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
With everything set up you can now build your first machine learning model with these tutorials. This will walk you through the process of collecting data and training a new ML model:
Alternatively, you can test on-device inference with a demo model included in the base firmware binary. To do this, you may run the following command from your terminal:
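The command is edge-impulse-run-impulse from the Edge Impulse CLI (a minimal sketch):

```bash
# Run the demo model bundled with the base firmware and stream results to the terminal
edge-impulse-run-impulse
```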
Then, once you've tested out training and deployment with the Edge Impulse firmware, learn how to integrate impulses with your own custom Ensemble-based application:
The Nicla Sense ME is a tiny, low-power tool that sets a new standard for intelligent sensing solutions. With the simplicity of integration and scalability of the Arduino ecosystem, the board combines four state-of-the-art sensors from Bosch Sensortec:
BHI260AP motion sensor system with integrated AI.
BMM150 magnetometer.
BMP390 pressure sensor.
BME688 4-in-1 gas sensor with AI and integrated high-linearity, as well as high-accuracy pressure, humidity and temperature sensors.
Designed to easily analyze motion and the surrounding environment – hence the “M” and “E” in the name – it measures rotation, acceleration, pressure, humidity, temperature, air quality and CO2 levels by introducing completely new Bosch Sensortec sensors on the market.
Its tiny size and robust design make it suitable for projects that need to combine sensor fusion and AI capabilities on the edge, thanks to a strong computational power and low-consumption combination that can even lead to standalone applications when battery operated.
The Arduino Nicla Sense ME is available for around 55 USD from the .
To set this device up in Edge Impulse, you will need to install the following software:
On Linux:
GNU Screen: install for example via sudo apt install screen.
Problems installing the CLI?
With all the software in place it's time to connect the development board to Edge Impulse.
Use a micro-USB cable to connect the development board to your computer.
The development board does not come with the right firmware yet. To update the firmware:
Open the nicla_sense_ingestion.ino sketch in a text editor or the Arduino IDE.
For data ingestion into your Edge Impulse project, at the top of the file, select one or more sensors by un-commenting the defines and select the desired sample frequency (in Hz). For example, for the Environmental sensors:
Then, from your sketch's directory, run the Arduino CLI to compile:
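For example (a sketch; the FQBN arduino:mbed_nicla:nicla_sense is an assumption and may differ depending on your installed Arduino core):

```bash
# Compile the ingestion sketch for the Nicla Sense ME (FQBN assumed)
arduino-cli compile --fqbn arduino:mbed_nicla:nicla_sense .
```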
Then flash to your Nicla Sense using the Arduino CLI:
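For example (a sketch; the serial port shown is an assumption, use the port your board enumerates as):

```bash
# Upload the compiled sketch to the board (FQBN and port assumed)
arduino-cli upload --fqbn arduino:mbed_nicla:nicla_sense -p /dev/ttyACM0 .
```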
Wait until flashing is complete, and press the RESET button once to launch the new firmware.
From a command prompt or terminal, run:
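Because the ingestion sketch streams readings over serial, the command here is the Edge Impulse data forwarder (a minimal sketch):

```bash
# Forward serial sensor data from the sketch into your Edge Impulse project
edge-impulse-data-forwarder
```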
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. You will also name your sensor's axes (depending on which sensor you selected in your compiled nicla_sense_ingestion.ino sketch). If you want to switch projects/sensors run the command with --clean. Please refer to the table below for the names used for each axis corresponding to the type of sensor:
Note: These exact axis names are required to run the Edge Impulse Arduino library deployment example applications for the Nicla Sense without any changes.
Else, when deploying the model, you will see an error like the following:
If your axis names are different, when using the generated Arduino library for inference, you can modify the eiSensors nicla_sensors[] array (near line 70) in the example sketch to add your custom names, e.g.:
With the impulse designed, trained and verified you can deploy this model back to your Arduino Nicla Sense ME. This makes the model run without an internet connection, minimizes latency, and runs with minimum power consumption. Edge Impulse can package the complete impulse - including the signal processing code, neural network weights, and classification code - up into a single library that you can run on your development board.
Here's an instruction video for Windows.
The Arduino website has instructions for macOS and Linux.
See the Installation and troubleshooting guide.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
With everything set up you can now build your first machine learning model with the .
Looking to connect different sensors? Use the nicla_sense_ingestion sketch and the Edge Impulse data forwarder to easily send data from any sensor on the Nicla Sense into your Edge Impulse project.
Use the Running your impulse locally: On your Arduino tutorial and select one of the Nicla Sense examples.
Development Boards | Officially Supported Sensors | Memory** | Storage*** | Architecture |
---|---|---|---|---|
Different development board or different sensors? No problem, you can always collect data using the Data forwarder or the SDK, and deploy your model back to the device with the tutorials. Also, if you feel like porting your board, use this porting guide.
Sensor | Axis names |
---|---|
Accelerometer | accX, accY, accZ |
Gyroscope | gyrX, gyrY, gyrZ |
Orientation | heading, pitch, roll |
Environmental | temperature, barometer, humidity, gas |
Rotation vector (quaternion) | rotX, rotY, rotZ, rotW |
| 4MB | 4MB | Cortex-M55 400MHz + U55-256MACC |
| 256KB | 1MB | Cortex-M4F 64MHz |
| 64KB | 512KB | Cortex-M4 64MHz |
| 128KB | 1MB | Cortex-M7 480MHz |
| 512KB | 2MB | Cortex-M7 480MHz |
| 4MB | 4MB | ESP32 240MHz |
| 2MB | 2MB | ARC DSP 400MHz |
| 1MB | 2MB | Cortex-M4 150MHz + Cortex-M0+ 100MHz |
| 256KB | 1MB | Cortex-M4F 64MHz |
| 512KB | 1MB | Cortex-M33 128MHz |
| 256KB | 1MB | Cortex-M33 64MHz |
| 512KB | 1MB | Cortex-M33 128MHz |
| 256KB | 1MB | Cortex-M33 64MHz |
| 32MB SDRAM / 1MB SRAM | 32MB external / 2MB internal | Cortex-M7 480MHz |
| 512KB | 2MB | Cortex-M33 200MHz |
| 2MB | 2MB | ARC DSP 400MHz |
| 2MB | 2MB | ARC DSP 400MHz |
| 256KB | 1MB | Cortex-M4F 40MHz |
| 256KB | 1.5MB | Cortex-M33 78MHz |
| 1.5MB | 8MB | Cortex-M4F 156MHz |
| 128KB | 1MB | Cortex-M4F 80MHz |
| 32KB | 256KB | SAMD21 Cortex-M0+ |
| 80KB | 352KB | Cortex-M4F 48MHz |
| 256KB | 2MB | Cortex-M0+ 133MHz |
The Arduino Nicla Voice is a development board with a high-performance microphone and IMU, a Cortex-M4 Nordic nRF52832 MCU and the Syntiant NDP120 Neural Decision Processor™. The NDP120 supports multiple Neural Network architectures and is ideal for always-on low-power speech recognition applications. You'll be able to sample raw data, build models, and deploy trained embedded machine learning models directly from the Edge Impulse studio to create the next generation of low-power, high-performance audio interfaces.
The Edge Impulse firmware for this development board is open source and hosted on GitHub.
To set this device up in Edge Impulse, you will need to install the following software:
Download the Nicla Voice firmware for audio or IMU below and connect the USB cable to your computer:
The archive contains different scripts to flash the firmware on your OS, e.g. for macOS:
install_lib_mac.command: this script installs the Arduino Core for the Nicla board and the pyserial package required to update the NDP120 chip. You only need to run this script once.
flash_mac.command: flashes both the MCU and the NDP120 chip. You should use this script on a brand new board.
The additional scripts below can be used for specific actions:
flash_mac_mcu.command: flashes only the Nordic MCU, e.g. if you recompiled the firmware and don't need to update the NDP120 model.
flash_mac_model.command: flashes only the NDP120 model.
format_mac_ext_flash.command: formats the external flash that contains the NDP120 model.
After flashing the MCU and NDP chips, connect the Nicla Voice directly to your computer's USB port. Linux, Mac OS, and Windows platforms are supported. From a command prompt or terminal, run:
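The command is the Edge Impulse daemon (a minimal sketch):

```bash
# Connect the Nicla Voice to your Edge Impulse project
edge-impulse-daemon
```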
This will start a wizard which will ask you to log in and choose an Edge Impulse project. If you want to switch projects run the command with --clean.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
Use syntiant compatible pre-processing blocks
The Arduino Nicla Voice is based on the Syntiant NDP120 Neural Decision Processor™ and needs to use dedicated Syntiant DSP blocks.
With everything set up you can now build your first machine learning model and evaluate it using the Arduino Nicla Voice Board with this tutorial:
How to use Arduino-CLI with macOS M1 chip? You will need to install Rosetta2 to run the Arduino-CLI. See details on Apple website.
How to label my classes? The NDP chip expects one and only one negative class, and it should be the last in the list. For instance, if your original dataset looks like: yes, no, unknown, noise and you only want to detect the keywords 'yes' and 'no', merge the 'unknown' and 'noise' labels into a single class such as z_openset (we prefix it with 'z' in order to get this class last in the list).
The Arduino Nano 33 BLE Sense is a tiny development board with a Cortex-M4 microcontroller, motion sensors, a microphone and BLE - and it's fully supported by Edge Impulse. You'll be able to sample raw data, build models, and deploy trained machine learning models directly from the studio. It's available for around 30 USD from Arduino and a wide range of distributors.
You can also use the Arduino Tiny Machine Learning Kit to run image classification models on the edge with the Arduino Nano and attached OV7675 camera module (or connect the hardware together via jumper wire and a breadboard if purchased separately).
The Edge Impulse firmware for this development board is open source and hosted on GitHub: edgeimpulse/firmware-arduino-nano-33-ble-sense.
Arduino Nano 33 BLE Sense rev2?
Arduino recently released a new version of the Arduino Nano 33 BLE Sense, the rev2, which has different sensors than the original version. We are working on adding a dedicated "official firmware" so you can easily flash this board version. In the meantime, to ingest data, please have a look at Data Ingestion (API), Data Forwarder (CLI) or Data Uploader (CLI and Studio).
To set this device up in Edge Impulse, you will need to install the following software:
Here's an instruction video for Windows.
The Arduino website has instructions for macOS and Linux.
On Linux:
GNU Screen: install for example via sudo apt install screen.
Problems installing the CLI?
See the Installation and troubleshooting guide.
With all the software in place it's time to connect the development board to Edge Impulse.
Use a micro-USB cable to connect the development board to your computer. Then press RESET twice to launch into the bootloader. The on-board LED should start pulsating to indicate this.
The development board does not come with the right firmware yet. To update the firmware:
Download the latest Edge Impulse firmware, and unzip the file.
Open the flash script for your operating system (flash_windows.bat, flash_mac.command or flash_linux.sh) to flash the firmware.
Wait until flashing is complete, and press the RESET button once to launch the new firmware.
From a command prompt or terminal, run:
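For example (a minimal sketch):

```bash
# Connect the Nano 33 BLE Sense to your Edge Impulse project
edge-impulse-daemon
```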
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. If you want to switch projects run the command with --clean.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
With everything set up you can now build your first machine learning model with these tutorials:
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
You will need the following hardware:
Arduino Nano 33 BLE Sense board with headers.
OV7675 camera module.
Micro-USB cable.
Solderless breadboard and female-to-male jumper wires.
First, slot the Arduino Nano 33 BLE Sense board into a solderless breadboard:
With female-to-male jumper wire, use the following wiring diagram, pinout diagrams, and connection table to link the OV7675 camera module to the microcontroller board via the solderless breadboard:
Download the full pinout diagram of the Arduino Nano 33 BLE Sense here.
Finally, use a micro-USB cable to connect the Arduino Nano 33 BLE Sense development board to your computer.
Now build & train your own image classification model and deploy to the Arduino Nano 33 BLE Sense with Edge Impulse!
The Nicla Vision is a ready-to-use, standalone camera for analyzing and processing images on the Edge. Thanks to its 2MP color camera, smart 6-axis motion sensor, integrated microphone, and distance sensor, it is suitable for asset tracking, object recognition, and predictive maintenance. Some of its key features include:
Powerful microcontroller equipped with a 2MP color camera
Tiny form factor of 22.86 x 22.86 mm
Integrated microphone, distance sensor, and intelligent 6-axis motion sensor
Onboard Wi-Fi and Bluetooth® Low Energy connectivity
Standalone when battery-powered
Expand existing projects with sensing capabilities
Enable fast Machine Vision prototyping
Compatible with Nicla, Portenta, and MKR products
Its exceptional capabilities are supported by a powerful STMicroelectronics STM32H747AII6 Dual ARM® Cortex® processor, combining an M7 core up to 480 MHz and an M4 core up to 240 MHz. Despite its industrial strength, it keeps energy consumption low for battery-powered standalone applications.
The Arduino Nicla Vision is available for around 95 EUR from the Arduino Store.
To set this device up in Edge Impulse, you will need to install the following software:
Here's an instruction video for Windows.
The Arduino website has instructions for macOS and Linux.
On Linux:
GNU Screen: install for example via sudo apt install screen.
Problems installing the CLI?
See the Installation and troubleshooting guide.
There are two ways to connect the Nicla Vision to Edge Impulse:
Using the official Edge Impulse firmware - it supports all onboard sensors, including camera.
Using an ingestion script. This supports analog, IMU, proximity sensors and microphone (limited to 8 kHz), but not the camera. It is only recommended if you want to modify the ingestion flow for third-party sensors.
Use a micro-USB cable to connect the development board to your computer. Under normal circumstances, the flash process should work without entering the bootloader manually. However, if you run into difficulties flashing the board, you can enter the bootloader by pressing RESET twice. The onboard LED should start pulsating to indicate this.
The development board does not come with the right firmware yet. To update the firmware:
Download the latest Edge Impulse firmware, and unzip the file.
Open the flash script for your operating system (flash_windows.bat, flash_mac.command or flash_linux.sh) to flash the firmware.
Wait until flashing is complete, and press the RESET button once to launch the new firmware.
From a command prompt or terminal, run:
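For example (a minimal sketch):

```bash
# Connect the Nicla Vision (running the official Edge Impulse firmware) to your project
edge-impulse-daemon
```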
This will start a wizard which will ask you to log in and choose an Edge Impulse project. If you want to switch projects run the command with --clean.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
Use a micro-USB cable to connect the development board to your computer.
The development board does not come with the right firmware yet. To update the firmware:
Download the latest Edge Impulse ingestion sketches and unzip the file.
Open the nicla_vision_ingestion.ino (for IMU/proximity sensor) or nicla_vision_ingestion_mic.ino (for microphone) sketch in a text editor or the Arduino IDE.
For IMU/proximity sensor data ingestion into your Edge Impulse project, at the top of the file, select one or more sensors by un-commenting the defines and select the desired sample frequency (in Hz). For example, for the accelerometer sensor:
For microphone data ingestion, you do not need to change the default parameters in the nicla_vision_ingestion_mic.ino sketch.
Then, from your sketch's directory, run the Arduino CLI to compile:
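For example (a sketch; the FQBN arduino:mbed_nicla:nicla_vision is an assumption and may differ depending on your installed Arduino core):

```bash
# Compile the ingestion sketch for the Nicla Vision (FQBN assumed)
arduino-cli compile --fqbn arduino:mbed_nicla:nicla_vision .
```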
Then flash to your Nicla Vision using the Arduino CLI:
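For example (a sketch; the serial port is an assumption, use the port your board enumerates as):

```bash
# Upload the compiled sketch to the board (FQBN and port assumed)
arduino-cli upload --fqbn arduino:mbed_nicla:nicla_vision -p /dev/ttyACM0 .
```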
Alternatively if you open the sketch in the Arduino IDE, you can compile and upload the sketch from there.
Wait until flashing is complete, and press the RESET button once to launch the new firmware.
From a command prompt or terminal, run:
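Because the ingestion sketch streams readings over serial, the command here is the Edge Impulse data forwarder (a minimal sketch):

```bash
# Forward serial sensor data into your Edge Impulse project
edge-impulse-data-forwarder
```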
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. You will also name your sensor's axes (depending on which sensor you selected in your compiled nicla_vision_ingestion.ino sketch). If you want to switch projects/sensors run the command with --clean. Please refer to the table below for the names used for each axis corresponding to the type of sensor:
Note: These exact axis names are required for the Edge Impulse Arduino library deployment example applications for the Nicla Vision.
From a command prompt or terminal, run:
This will start a wizard which will ask you to log in and choose an Edge Impulse project. You will also name your sensor axes - in the case of the microphone, you need to enter audio. If you want to switch projects/sensors run the command with --clean.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
The above screenshots are for Edge Impulse Ingestion scripts and Data forwarder. If you use the official Edge Impulse firmware for the Nicla Vision, the content will be slightly different.
With everything set up you can now build your first machine learning model with these tutorials:
Looking to connect different sensors? Use the nicla_vision_ingestion.ino sketch and the Edge Impulse data forwarder to easily send data from any sensor on the Nicla Vision into your Edge Impulse project.
With the impulse designed, trained and verified you can deploy this model back to your Arduino Nicla Vision. This makes the model run without an internet connection, minimizes latency, and runs with minimum power consumption. Edge Impulse can package the complete impulse - including the signal processing code, neural network weights, and classification code - up into a single library that you can run on your development board.
Use the Running your impulse locally: On your Arduino tutorial and select one of the Nicla Vision examples.
CY8CKIT-062S2 Pioneer Kit and CY8CKIT-028-SENSE expansion kit required
This guide assumes you have the IoT sense expansion kit (CY8CKIT-028-SENSE) attached to a PSoC® 62S2 Wi-Fi® BLUETOOTH® Pioneer Kit
The Infineon Semiconductor PSoC® 62S2 Wi-Fi® BLUETOOTH® Pioneer Kit (Cypress CY8CKIT-062S2) enables the evaluation and development of applications using the PSoC 62 Series MCU. This low-cost hardware platform enables the design and debug of the PSoC 62 MCU and the Murata 1LV Module (CYW43012 Wi-Fi + Bluetooth Combo Chip). The PSoC 6 MCU is Infineon's latest, ultra-low-power PSoC specifically designed for wearables and IoT products. The board features a PSoC 6 MCU and a CYW43012 Wi-Fi/Bluetooth combo module. The Infineon CYW43012 is a 28nm, ultra-low-power device that supports single-stream, dual-band IEEE 802.11n-compliant Wi-Fi MAC/baseband/radio and Bluetooth 5.0 BR/EDR/LE. When paired with the IoT sense expansion kit, the PSoC® 62S2 Wi-Fi® BLUETOOTH® Pioneer Kit can be used to easily interface a variety of sensors with the PSoC™ 6 MCU platform, specifically targeted at audio and machine learning applications, which are fully supported by Edge Impulse! You'll be able to sample raw data as well as build and deploy trained machine learning models to your PSoC® 62S2 Wi-Fi® BLUETOOTH® Pioneer Kit, directly from the Edge Impulse Studio.
The Edge Impulse firmware for this development board is open source and hosted on GitHub: edgeimpulse/firmware-infineon-cy8ckit-062s2.
To set this device up with Edge Impulse, you will need to install the following software:
Infineon CyProgrammer. A utility program we will use to flash firmware images onto the target.
The Edge Impulse CLI which will enable you to connect your CY8CKIT-062S2 Pioneer Kit directly to Edge Impulse Studio, so that you can collect raw data and trigger in-system inferences.
Problems installing the CLI?
See the Installation and troubleshooting guide.
Edge Impulse Studio can collect data directly from your CY8CKIT-062S2 Pioneer Kit and also help you trigger in-system inferences to debug your model, but in order to allow Edge Impulse Studio to interact with your CY8CKIT-062S2 Pioneer Kit you first need to flash it with our base firmware image.
Download the latest Edge Impulse firmware and unzip it to obtain the firmware-infineon-cy8ckit-062s2.hex file, which we will be using in the following steps.
Use a micro-USB cable to connect the CY8CKIT-062S2 Pioneer Kit to your development computer (where you downloaded and installed Infineon CyProgrammer).
You can use Infineon CyProgrammer to flash your CY8CKIT-062S2 Pioneer Kit with our base firmware image. To do this, first select your board from the dropdown list on the top left corner. Make sure to select the item that starts with CY8CKIT-062S2-43012:
Then select the base firmware image file you downloaded in the first step above (i.e., the file named firmware-infineon-cy8ckit-062s2.hex). You can now press the Connect button to connect to the board, and finally the Program button to load the base firmware image onto the CY8CKIT-062S2 Pioneer Kit.
Keep Infineon CyProgrammer Handy
Infineon CyProgrammer will be needed to upload any other project built on Edge Impulse, but the base firmware image only has to be loaded once.
With all the software in place, it's time to connect the CY8CKIT-062S2 Pioneer Kit to Edge Impulse.
Use a micro-USB cable to connect the development board to your computer.
From a command prompt or terminal, run:
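For example (a minimal sketch):

```bash
# Connect the CY8CKIT-062S2 Pioneer Kit to your Edge Impulse project
edge-impulse-daemon
```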
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. If you want to switch projects run the command with --clean.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices on the left sidebar. The device will be listed there:
With everything set up you can now build your first machine learning model with these tutorials:
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
Espressif ESP-EYE (ESP32) is a compact development board based on Espressif's ESP32 chip, equipped with a 2-Megapixel camera and a microphone. ESP-EYE also offers plenty of storage, with 8 MB PSRAM and 4 MB SPI flash - and it's fully supported by Edge Impulse. You'll be able to sample raw data, build models, and deploy trained machine learning models directly from the studio. It's available for around 22 USD from Mouser and a wide range of distributors.
There are plenty of other boards built with the ESP32 chip - and of course there are custom designs utilizing the ESP32 SoM. The Edge Impulse firmware was tested with the ESP-EYE and ESP32 FireBeetle boards, but it is possible to modify the firmware to use it with other ESP32 designs. Read more on that in the Using with other boards section of this documentation.
The Edge Impulse firmware for this development board is open source and hosted on GitHub: edgeimpulse/firmware-espressif-esp32.
To set this device up in Edge Impulse, you will need to install the following software:
Python 3.
The ESP documentation website has instructions for macOS and Linux.
On Linux:
GNU Screen: install for example via sudo apt install screen.
Problems installing the CLI?
See the Installation and troubleshooting guide.
With all the software in place it's time to connect the development board to Edge Impulse.
Use a micro-USB cable to connect the development board to your computer.
The development board does not come with the right firmware yet. To update the firmware:
Download the latest Edge Impulse firmware, and unzip the file.
Open the flash script for your operating system (flash_windows.bat, flash_mac.command or flash_linux.sh) to flash the firmware.
Wait until flashing is complete.
From a command prompt or terminal, run:
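For example (a minimal sketch):

```bash
# Connect the ESP-EYE to your Edge Impulse project
edge-impulse-daemon
```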
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. If you want to switch projects run the command with --clean.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
With everything set up you can now build your first machine learning model with these tutorials:
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
The standard firmware supports the following sensors:
Camera: OV2640, OV3660, OV5640 modules from Omnivision
Microphone: I2S microphone on ESP-EYE (MIC8-4X3-1P0)
LIS3DHTR module connected to I2C (SCL pin 22, SDA pin 21)
Any analog sensor, connected to A0
The analog sensor and LIS3DHTR module were tested on ESP32 FireBeetle board and Grove LIS3DHTR module.
The ESP32 is a very popular chip, both in community projects and in industry, due to its high performance, low price and the large amount of documentation/support available. There are other camera-enabled development boards based on the ESP32 which can use the Edge Impulse firmware after applying certain changes, e.g.
AI-Thinker ESP-CAM
M5STACK ESP32 PSRAM Timer Camera X (OV3660)
M5STACK ESP32 Camera Module Development Board (OV2640)
The pins used for the camera connection on different development boards are not the same, therefore you will need to change the #define here to fit your development board, then compile and flash the firmware. Specifically for the AI-Thinker ESP-CAM, since this board needs an external USB to TTL serial cable to upload the code/communicate with the board, the data transfer baud rate must be changed to 115200 here.
The analog sensor and LIS3DH accelerometer can be used on any other development board without changes, as long as the interface pins are not changed. If the I2C/ADC pins that the accelerometer/analog sensor are connected to differ from those described in the Sensors available section, you will need to change the values in the LIS3DHTR component for ESP32, then compile and flash it to your board.
Additionally, since the Edge Impulse firmware is open source and available to the public, if you have made modifications or added new sensor capabilities, we encourage you to open a PR in the firmware repository!
To deploy your impulse on your ESP32 board, please see:
Generate an Edge Impulse firmware (ESP-EYE only)
Download a C++ library (using ESP-IDF)
Download an Arduino library
The Himax WE-I Plus is a tiny development board with a camera, a microphone, an accelerometer and a very fast DSP - and it's fully supported by Edge Impulse. You'll be able to sample raw data, build models, and deploy trained machine learning models directly from the studio. It's available for around 65 USD from Sparkfun.
The Edge Impulse firmware for this development board is open source and hosted on GitHub: edgeimpulse/firmware-himax-we-i-plus.
To set this device up in Edge Impulse, you will need to install the following software:
On Linux:
GNU Screen: install for example via sudo apt install screen.
Problems installing the CLI?
See the Installation and troubleshooting guide.
With all the software in place it's time to connect the development board to Edge Impulse.
Use a micro-USB cable to connect the development board to your computer.
The development board does not come with the right firmware yet. To update the firmware:
Download the latest Edge Impulse firmware, and unzip the file.
Open the flash script for your operating system (flash_windows.bat, flash_mac.command or flash_linux.sh) to flash the firmware.
Wait until flashing is complete, and press the RESET button once to launch the new firmware.
From a command prompt or terminal, run:
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. If you want to switch projects run the command with --clean.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
With everything set up you can now build your first machine learning model with these tutorials:
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
If you export to the Himax WE-I Plus you could receive the error: "All licenses are in use by other developers." Unfortunately we have a limited number of licenses for the MetaWare compiler, and these are shared between all Studio users. Try again in a little bit, or export your project as a C++ Library, add it to the edgeimpulse/firmware-himax-we-i-plus project and compile locally.
If no device shows up in your OS (e.g. COMxx, /dev/tty.usbxx) after connecting the board, and your USB cable supports data transfer, you may need to install the FTDI VCP driver.
The Portenta H7 is a powerful development board from Arduino with both a Cortex-M7 microcontroller and a Cortex-M4 microcontroller, a BLE/WiFi radio, and an extension slot to connect the Portenta vision shield - which adds a camera and dual microphones. At the moment the Portenta H7 is partially supported by Edge Impulse, letting you collect data from the camera, build computer vision models, and deploy trained machine learning models back to the development board. The Portenta H7 and the vision shield are available directly from Arduino for ~$150 in total.
There are two versions of the vision shield: one that has an Ethernet connection and one with a LoRa radio. Both of these can be used with Edge Impulse.
The Edge Impulse firmware for this development board is open source and hosted on GitHub: edgeimpulse/firmware-arduino-portenta-h7.
To set this device up in Edge Impulse, you will need to install the following software:
Here's an instruction video for Windows.
The Arduino website has instructions for macOS and Linux.
On Linux:
GNU Screen: install for example via sudo apt install screen.
Problems installing the CLI?
See the Installation and troubleshooting guide.
With all the software in place it's time to connect the development board to Edge Impulse.
Mount the vision shield using the two edge connectors on the back of the Portenta H7.
Use a USB-C cable to connect the development board to your computer. Then, double-tap the RESET button to put the device into bootloader mode. You should see the green LED on the front pulsating.
The development board does not come with the right firmware yet. To update the firmware:
Download the latest Edge Impulse firmware, and unzip the file.
Double press on the RESET button on your board to put it in the bootloader mode.
Open the flash script for your operating system (flash_windows.bat, flash_mac.command or flash_linux.sh) to flash the firmware.
Wait until flashing is complete, and press the RESET button once to launch the new firmware.
From a command prompt or terminal, run:
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. If you want to switch projects run the command with --clean.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
With everything set up you can now build your first machine learning model with these tutorials:
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
Download your custom firmware from the Deployment tab in the Studio, install the firmware with the same method as in the "Update the firmware" section, and run the edge-impulse-run-impulse command:
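For example (a minimal sketch):

```bash
# Run the impulse contained in the custom firmware and print live classification results
edge-impulse-run-impulse
```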
Note that it may take up to 10 minutes to compile the firmware for the Arduino Portenta H7.
Use the Running your impulse locally: On your Arduino tutorial and select one of the portenta examples:
For an end-to-end example that classifies data and then sends the result over LoRaWAN, please see the example-portenta-lorawan example.
If you come across this issue:
You probably forgot to double press the RESET button before running the flash script.
The OpenMV Cam is a small and low-power development board with a Cortex-M7 microcontroller supporting MicroPython, a μSD card socket and a camera module capable of taking 5MP images - and it's fully supported by Edge Impulse. You'll be able to sample raw data, build models, and deploy trained machine learning models through the studio and the OpenMV IDE. It is available for 80 USD directly from .
To set this device up in Edge Impulse, you will need to install the following software:
Problems installing the CLI?
With all the software in place it's time to connect the development board to Edge Impulse. To make this easy we've put some tutorials together which takes you through all the steps to acquire data, train a model, and deploy this model back to your device.
The Nordic Semiconductor nRF5340 DK is a development board with dual Cortex-M33 microcontrollers, QSPI flash, and an integrated BLE radio - and it's fully supported by Edge Impulse. You'll be able to sample raw data, build models, and deploy trained machine learning models directly from the studio. As the nRF5340 DK does not have any built-in sensors we recommend you pair this development board with the X-NUCLEO-IKS02A1 shield (with a MEMS accelerometer and a MEMS microphone). The nRF5340 DK is available for around 50 USD from a variety of distributors.
If you don't have the X-NUCLEO-IKS02A1 shield you can use the Data forwarder to capture data from any other sensor, and then follow the tutorial to run your impulse. Or, you can modify the example firmware (based on nRF Connect) to interact with other accelerometers or PDM microphones that are supported by Zephyr.
The Edge Impulse firmware for this development board is open source and hosted on GitHub: .
To set this device up in Edge Impulse, you will need to install the following software:
On Linux:
GNU Screen: install for example via sudo apt install screen.
Problems installing the CLI?
With all the software in place it's time to connect the development board to Edge Impulse.
Remove the pin header protectors on the nRF5340 DK and plug the X-NUCLEO-IKS02A1 shield into the development board.
Note: Make sure that the shield does not touch any of the pins in the middle of the development board. This might cause issues when flashing the board or running applications.
Use a micro-USB cable to connect the development board to your computer. There are two USB ports on the development board, use the one on the short side of the board. Then, set the power switch to 'on'.
The development board does not come with the right firmware yet. To update the firmware:
The development board is mounted as a USB mass-storage device (like a USB flash drive), with the name JLINK. Make sure you can see this drive.
Drag the nrf5340-dk.bin file to the JLINK drive.
Wait 20 seconds and press the BOOT/RESET button.
From a command prompt or terminal, run:
This starts a wizard which asks you to log in and choose an Edge Impulse project. If you want to switch projects run the command with --clean.
The nRF5340 DK exposes multiple UARTs. If prompted, choose the bottom one:
With everything set up you can now build your first machine learning model with these tutorials:
If your board fails to flash new firmware (a FAIL.txt file might appear on the JLINK drive) you can also flash using nrfjprog.
Flash new firmware via:
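For example, using nrfjprog from Nordic's nRF Command Line Tools (a sketch; the exact flags, and whether a .bin image can be programmed directly, depend on your nrfjprog version, so treat the options below as assumptions and check nrfjprog --help):

```bash
# Erase the chip, program the Edge Impulse image, verify and reset (flags assumed)
nrfjprog -f NRF53 --program nrf5340-dk.bin --chiperase --verify --reset
```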
The Nordic Semiconductor nRF9160 DK is a development board with an nRF9160 SiP incorporating a Cortex-M33 for your application, a full LTE-M/NB-IoT modem with GPS, along with 1 MB of flash and 256 KB RAM. It also includes an nRF52840 board controller with Bluetooth Low Energy connectivity. The Development Kit is fully supported by Edge Impulse. You'll be able to sample raw data, build models, and deploy trained machine learning models directly from the studio. As the nRF9160 DK does not have any built-in sensors we recommend you pair this development board with the X-NUCLEO-IKS02A1 shield (with a MEMS accelerometer and a MEMS microphone). The nRF9160 DK is available for around 150 USD from a variety of distributors.
If you don't have the X-NUCLEO-IKS02A1 shield you can use the Data forwarder to capture data from any other sensor, and then follow the tutorial to run your impulse. Or, you can modify the example firmware (based on nRF Connect) to interact with other accelerometers or PDM microphones that are supported by Zephyr.
The Edge Impulse firmware for this development board is open source and hosted on GitHub: .
To set this device up in Edge Impulse, you will need to install the following software:
On Linux:
GNU Screen: install for example via sudo apt install screen.
Problems installing the CLI?
With all the software in place it's time to connect the development board to Edge Impulse.
Remove the pin header protectors on the nRF9160 DK and plug the X-NUCLEO-IKS02A1 shield into the development board.
Note: Make sure that the shield does not touch any of the pins in the middle of the development board. This might cause issues when flashing the board or running applications. You can also remove the shield before flashing the board.
Use a micro-USB cable to connect the development board to your computer. There are two USB ports on the development board, use the one on the short side of the board. Then, set the power switch to 'on'.
The development board does not come with the right firmware yet. To update the firmware:
The development board is mounted as a USB mass-storage device (like a USB flash drive), with the name JLINK. Make sure you can see this drive.
Flash the board controller, you only need to do this once. Go to step 4 if you've performed this step before.
Ensure that the PROG/DEBUG switch is in the nRF52 position.
Copy board-controller.bin to the JLINK mass storage device.
Flash the application:
Ensure that the PROG/DEBUG switch is in the nRF91 position.
Run the flash script for your Operating System.
Wait 20 seconds and press the BOOT/RESET button.
From a command prompt or terminal, run:
This starts a wizard which asks you to log in and choose an Edge Impulse project. If you want to switch projects run the command with --clean.
The nRF9160 DK exposes multiple UARTs. If prompted, choose the top one:
With everything set up you can now build your first machine learning model with these tutorials:
The Nordic Semiconductor Thingy:91 is an easy-to-use battery-operated prototyping platform for cellular IoT using LTE-M, NB-IoT and GPS. It is ideal for creating Proof-of-Concept (PoC) designs, demos and initial prototypes in your cIoT development phase. Thingy:91 is built around the nRF9160 SiP and is certified for a broad range of LTE bands globally, meaning the Nordic Thingy:91 can be used just about anywhere in the world. There is an nRF52840 multiprotocol SoC on the Thingy:91, which offers the option of adding Bluetooth Low Energy connectivity to your project.
Nordic's Thingy:91 is fully supported by Edge Impulse. You'll be able to sample raw data, build models, and deploy trained machine learning models directly from the studio. The Thingy:91 is available for around 120 USD from a variety of distributors.
The Edge Impulse firmware for this development board is open source and hosted on GitHub: .
To set this device up in Edge Impulse, you will need to install the following software:
On Linux:
GNU Screen: install for example via sudo apt install screen.
Problems installing the CLI?
Before you start a new project, you need to update the Thingy:91 firmware to our latest build.
Use a micro-USB cable to connect the development board to your computer. Then, set the power switch to 'on'.
firmware.hex: the Edge Impulse firmware image for the nRF9160 SoC, and
connectivity-bridge.hex: a connectivity application for the nRF52840 that you only need on older boards (hardware version < 1.4).
Open nRF Connect for Desktop and launch the Programmer application.
Scroll down in the menu on the right and make sure Enable MCUboot is selected.
Switch off the Nordic Thingy:91.
Press the multi-function button (SW3) while switching SW1 to the ON position.
In the Programmer navigation bar, click Select device.
In the menu on the right, click Add HEX file > Browse, and select the firmware.hex file from the firmware previously downloaded.
Scroll down in the menu on the right to Device and click Write:
In the MCUboot DFU window, click Write. When the update is complete, a Completed successfully message appears.
You can now disconnect the board.
Thingy:91 hardware version < 1.4.0
Updating the firmware with older hardware versions may fail. Moreover, even if the update works, the device may later fail to connect to Edge Impulse Studio:
With all the software in place it's time to connect the development board to Edge Impulse. From a command prompt or terminal, run:
This starts a wizard which asks you to log in and choose an Edge Impulse project. If you want to switch projects run the command with --clean.
The Thingy:91 exposes multiple UARTs. If prompted, choose the first one:
With everything set up you can now build your first machine learning model with this tutorial:
The Nordic Semiconductor nRF52840 DK is a development board with a Cortex-M4 microcontroller, QSPI flash, and an integrated BLE radio - and it's fully supported by Edge Impulse. You'll be able to sample raw data, build models, and deploy trained machine learning models directly from the studio. As the nRF52840 DK does not have any built-in sensors we recommend you pair this development board with the X-NUCLEO-IKS02A1 shield (with a MEMS accelerometer and a MEMS microphone). The nRF52840 DK is available for around 50 USD from a variety of distributors.
If you don't have the X-NUCLEO-IKS02A1 shield you can use the Data forwarder to capture data from any other sensor, and then follow the tutorial to run your impulse. Or, you can modify the example firmware (based on nRF Connect) to interact with other accelerometers or PDM microphones that are supported by Zephyr.
The Edge Impulse firmware for this development board is open source and hosted on GitHub: .
To set this device up in Edge Impulse, you will need to install the following software:
On Linux:
GNU Screen: install for example via sudo apt install screen.
Problems installing the CLI?
With all the software in place it's time to connect the development board to Edge Impulse.
Remove the pin header protectors on the nRF52840 DK and plug the X-NUCLEO-IKS02A1 shield into the development board.
Note: Make sure that the shield does not touch any of the pins in the middle of the development board. This might cause issues when flashing the board or running applications.
Use a micro-USB cable to connect the development board to your computer. There are two USB ports on the development board, use the one on the short side of the board. Then, set the power switch to 'on'.
The development board does not come with the right firmware yet. To update the firmware:
The development board is mounted as a USB mass-storage device (like a USB flash drive), with the name JLINK. Make sure you can see this drive.
Drag the nrf52840-dk.bin file to the JLINK drive.
Wait 20 seconds and press the BOOT/RESET button.
From a command prompt or terminal, run:
This will start a wizard which will ask you to log in and choose an Edge Impulse project. If you want to switch projects run the command with --clean.
With everything set up you can now build your first machine learning model with these tutorials:
If you don't see the JLINK drive show up when you connect your nRF52840 DK you'll have to update the interface firmware.
Set the power switch to 'off'.
Hold BOOT/RESET while you set the power switch to 'on'.
Your development board should be mounted as BOOTLOADER.
After 20 seconds disconnect the USB cable, and plug the cable back in.
The development board should now be mounted as JLINK.
If your board fails to flash new firmware (a FAIL.txt file might appear on the JLINK drive) you can also flash using nrfjprog.
Flash new firmware via:
The Nordic Thingy:53™ is an easy-to-use prototyping platform; it makes it possible to create prototypes and proofs-of-concept without the need to build custom hardware. Thingy:53 is built around the nRF5340 SoC. The capacity of its dual Arm Cortex-M33 processors enables it to do embedded machine learning (ML), both collecting data and running trained ML models on the device. The Bluetooth Low Energy radio allows it to connect to smart phones, tablets, laptops and similar devices, without the need for a wired connection. Other protocols like Thread, Zigbee and proprietary 2.4 GHz protocols are also supported by the radio. It also includes a wealth of different integrated sensors, an NFC antenna, and two buttons and one RGB LED that simplify input and output.
Nordic's Thingy:53 is fully supported by Edge Impulse, and every Thingy:53 is shipped with Edge Impulse firmware already flashed. You'll be able to sample raw data, build models, and deploy trained machine learning models directly out-of-the-box via the Edge Impulse Studio or the Nordic nRF Edge Impulse iOS and Android apps over a BLE connection. The Thingy:53 is available for around 120 USD from a variety of distributors.
The Edge Impulse firmware for this development board is open source and hosted on GitHub: .
To set this device up in Edge Impulse via USB serial or external debug probe, you will need to install the following software:
On Linux:
GNU Screen: install for example via sudo apt install screen.
Problems installing the CLI?
Use a USB cable to connect the development board to your computer. Then, set the power switch to 'on'.
Download the latest Edge Impulse firmware:
*-full.zip contains HEX files to upgrade the device through the external probe.
*-dfu.zip contains the dfu_application.zip package to upgrade the already flashed device through the Serial/USB bootloader.
With all the software in place it's time to connect the development board to Edge Impulse. From a command prompt or terminal, run:
This starts a wizard which asks you to log in and choose an Edge Impulse project. If you want to switch projects run the command with --clean.
If prompted to select a device, choose ZEPHYR:
With everything set up you can now build your first machine learning model with this tutorial:
Now that you have created an Edge Impulse account and trained your first Edge Impulse machine learning model, using the Nordic nRF Edge Impulse app you can deploy your impulse to your Nordic Thingy:53 and acquire/upload new sensor data into your Edge Impulse projects.
Select the Devices tab to connect to your Thingy:53 device to your mobile phone:
To remove your connected Thingy:53 from your project, select the connected device name and scroll to the bottom of the device page to remove it.
To view existing data samples in your Edge Impulse project, select the Data Acquisition tab. To record and upload a new data sample into your project, click on the "+" button at the top right of the app. Select your sensor, type in the sample label, and choose a sample length and frequency, then select Start Sampling.
Build and deploy your Edge Impulse model to your Thingy:53 via the Deployment tab. Select your project from the top drop-down, select your connected Thingy:53 device, and click Build:
The app will start building your project and uploading the firmware to the connected Thingy:53:
Select the Inferencing tab to view the inferencing results of the model flashed to the connected Thingy:53:
Select the Settings tab to view your logged-in account information, BLE scanner settings, and application version. Click on your account name to view your Edge Impulse projects and logout of your account.
Lost BLE connection to device
Reconnect your device by selecting your device name on the Devices tab and click "Reconnect".
Make sure power cables are plugged in properly.
Do not use iPhone/Android app multitasking during data acquisition, firmware deployment, or inferencing tasks, as the BLE streaming connection will be closed.
See the Installation and troubleshooting guide.
- end-to-end tutorial.
- collecting datasets using the OpenMV IDE.
- run your trained impulse on the OpenMV Cam H7 Plus.
See the Installation and troubleshooting guide.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
Install the .
See the Installation and troubleshooting guide.
Install the .
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
- install exactly version 3.7.1; please follow the instructions below to downgrade or newly install v3.7.1:
See the Installation and troubleshooting guide.
The extracted archive contains the following files:
In these cases, you will also need to flash the connectivity-bridge.hex onto the nRF52840 in the Thingy:91. Follow the same steps with the connectivity-bridge.hex file through USB.
If this method doesn't work, you will need to .
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
See the Installation and troubleshooting guide.
If this is not the case, see the troubleshooting section at the bottom of this page.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
Download the latest interface firmware and drag the .bin file onto the BOOTLOADER drive.
Install the .
(only needed to update device firmware through USB or external debug probe).
See the Installation and troubleshooting guide.
Brand new Thingy:53 devices will work out-of-the-box with the Edge Impulse Studio and the Nordic nRF Edge Impulse iOS and Android apps. However, if your device has been flashed with some other firmware, then follow the steps below to update your device to the latest Edge Impulse firmware.
to update the firmware on the Thingy:53 through your choice of debugging connection:
See the section below on .
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
Download and install the Nordic nRF Edge Impulse app for your iOS or Android phone.
Open the app and login with your edgeimpulse.com credentials:
Select your Thingy:53 project from the drop-down menu at the top:
If you encounter connection errors during deployment, please see .
Every Thingy:53 is shipped with a default Edge Impulse model. This model is created from the and it's .
Sensor | Axis names |
---|---|
#define SAMPLE_ACCELEROMETER | accX, accY, accZ |
#define SAMPLE_GYROSCOPE | gyrX, gyrY, gyrZ |
#define SAMPLE_PROXIMITY | cm |
The Renesas CK-RA6M5, Cloud Kit for RA6M5 MCU Group, enables users to experience the cloud connectivity options available from Renesas and Renesas Partners. A broad array of sensors on the CK-RA6M5 provide multiple options for observing user interaction with the Cloud Kit. By selecting from a choice of add-on devices, multiple cloud connectivity options are available.
The Edge Impulse firmware for this development board is open source and hosted on GitHub: edgeimpulse/firmware-renesas-ck-ra6m5.
An earlier prototype version of the Renesas CK-RA6M5 Cloud Kit is also supported. The layout of this earlier prototype version is available here.
To set this device up in Edge Impulse, you will need to install the following software:
On Linux:
GNU Screen: install for example via sudo apt install screen
.
Problems installing the CLI?
See the Installation and troubleshooting guide.
Edge Impulse Studio can collect data directly from your CK-RA6M5 Cloud Kit and also help you trigger in-system inferences to debug your model, but in order to allow Edge Impulse Studio to interact with your CK-RA6M5 Cloud Kit you first need to flash it with our base firmware image.
Download the latest Edge Impulse firmware, and unzip the file, then locate the flash-script
folder included, which we will be using in the following steps.
Check that:
J22 is set to link pins 2-3
J21 link is closed
J16 link is open
Connect J14 and J20 on the CK-RA6M5 board to USB ports on the host PC using the 2 micro USB cables supplied.
Power LED (LED6) on the CK-RA6M5 board lights up white, indicating that the CK-RA6M5 board is powered on.
If the CK-RA6M5 board is not powered through the Debug port (J14) the current available to the board may be limited to 100 mA.
Open the flash script for your operating system (flash_windows.bat
, flash_mac.command
or flash_linux.sh
) to flash the firmware.
An earlier prototype version of the Renesas CK-RA6M5 Cloud Kit required a USB to Serial interface as shown here. This is no longer the case.
From a command prompt or terminal, run:
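For example, using the Edge Impulse CLI installed earlier, this is typically:

edge-impulse-daemon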
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. If you want to switch projects run the command with --clean
.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices on the left sidebar. The device will be listed there:
With everything set up you can now build your first machine learning model with these tutorials:
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
The Silicon Labs Thunderboard Sense 2 is a complete development board with a Cortex-M4 microcontroller, a wide variety of sensors, a microphone, Bluetooth Low Energy and a battery holder - and it's fully supported by Edge Impulse. You'll be able to sample raw data, build models, and deploy trained machine learning models directly from the studio - and even stream your machine learning results over BLE to a phone. It's available for around 20 USD directly from Silicon Labs.
The Edge Impulse firmware for this development board is open source and hosted on GitHub: edgeimpulse/firmware-silabs-thunderboard-sense-2.
To set this device up in Edge Impulse, you will need to install the following software:
On Linux:
GNU Screen: install for example via sudo apt install screen
.
Problems installing the CLI?
See the Installation and troubleshooting guide.
With all the software in place it's time to connect the development board to Edge Impulse.
Use a micro-USB cable to connect the development board to your computer. The development board should mount as a USB mass-storage device (like a USB flash drive), with the name TB004
. Make sure you can see this drive.
The development board does not come with the right firmware yet. To update the firmware:
Drag the silabs-thunderboard-sense2.bin
file to the TB004
drive.
Wait 30 seconds.
From a command prompt or terminal, run:
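For instance, with the Edge Impulse CLI in place this is usually:

edge-impulse-daemon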
This will start a wizard which will ask you to log in and choose an Edge Impulse project. If you want to switch projects run the command with --clean
.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
With everything set up you can now build your first machine learning model with these tutorials:
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
Did you know? You can also stream the results of your impulse over BLE to a nearby phone or gateway: see Streaming results over BLE to your phone.
When dragging and dropping an Edge Impulse pre-built .bin firmware file, the binary seems to flash, but when the device reconnects a FAIL.TXT file appears with the contents "Error while connecting to CPU" and the following errors appear from the Edge Impulse CLI impulse runner:
To fix this error, install the Simplicity Studio 5 IDE and flash the binary through the IDE's built-in "Upload application..." menu under "Debug Adapters", and select your Edge Impulse firmware to flash:
Your Edge Impulse inferencing application should then run successfully with edge-impulse-run-impulse
.
The Silicon Labs xG24 Dev Kit (xG24-DK2601B) is a compact, feature-packed development platform built for the EFR32MG24 Cortex-M33 microcontroller. It provides the fastest path to develop and prototype wireless IoT products. This development platform supports up to +10 dBm output power and includes support for the 20-bit ADC as well as the xG24's AI/ML hardware accelerator. The platform also features a wide variety of sensors, a microphone, Bluetooth Low Energy and a battery holder - and it's fully supported by Edge Impulse! You'll be able to sample raw data as well as build and deploy trained machine learning models directly from the Edge Impulse Studio - and even stream your machine learning results over BLE to a phone.
The Edge Impulse firmware for this development board is open source and hosted on GitHub: edgeimpulse/firmware-silabs-xg24.
To set this device up with Edge Impulse, you will need to install the following software:
Simplicity Commander. A utility program we will use to flash firmware images onto the target.
The Edge Impulse CLI which will enable you to connect your xG24 Dev Kit directly to Edge Impulse Studio, so that you can collect raw data and trigger in-system inferences.
Problems installing the CLI?
See the Installation and troubleshooting guide.
Edge Impulse Studio can collect data directly from your xG24 Dev Kit and also help you trigger in-system inferences to debug your model, but in order to allow Edge Impulse Studio to interact with your xG24 Dev Kit you first need to flash it with our base firmware image.
Download the latest Edge Impulse firmware and unzip it to obtain the firmware-xg24.hex
file, which we will be using in the following steps.
Use a micro-USB cable to connect the xG24 Dev Kit to your development computer (where you downloaded and installed Simplicity Commander).
You can use Simplicity Commander to flash your xG24 Dev Kit with our base firmware image. To do this, first select your board from the dropdown list on the top left corner:
Then go to the "Flash" section on the left sidebar, and select the base firmware image file you downloaded in the first step above (i.e., the file named firmware-xg24.hex
). You can now press the Flash
button to load the base firmware image onto the xG24 Dev Kit.
Keep Simplicity Commander Handy
Simplicity Commander will be needed to upload any other project built on Edge Impulse, but the base firmware image only has to be loaded once.
With all the software in place, it's time to connect the xG24 Dev Kit to Edge Impulse.
Use a micro-USB cable to connect the development board to your computer.
From a command prompt or terminal, run:
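Typically this is the Edge Impulse daemon from the CLI installed above:

edge-impulse-daemon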
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. If you want to switch projects run the command with --clean
.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices on the left sidebar. The device will be listed there:
With everything set up you can now build your first machine learning model with these tutorials:
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
Grove - Vision AI Module is a thumb-sized board based on Himax HX6537-A processor which is equipped with a 2-Megapixel OV2640 camera, microphone, 3-axis accelerometer and 3-axis gyroscope. It offers storage with 32 MB SPI flash, comes pre-installed with ML algorithms for face recognition and people detection and supports customized models as well. It is compatible with the XIAO ecosystem and Arduino, all of which makes it perfect for getting started with AI-powered camera projects!
It is fully supported by Edge Impulse which means you will be able to sample raw data from the camera, build models, and deploy trained machine learning models to the module directly from the studio without any programming required. Grove - Vision AI Module is available for purchase directly from Seeed Studio Bazaar.
Quick links access:
Firmware source code: GitHub repository
Pre-compiled firmware: seeed-grove-vision-ai.zip
To set this board up in Edge Impulse, you will need to install the following software:
On Linux:
GNU Screen: install for example via sudo apt install screen
.
Download the latest Bouffalo Lab Dev Cube
Problems installing the Edge Impulse CLI?
See the Installation and troubleshooting guide.
With all the software in place it's time to connect the board to Edge Impulse.
BL702 is the USB-UART chip which enables the communication between the PC and the Himax chip. You need to update this firmware in order for the Edge Impulse firmware to work properly.
Download BL702-firmware-grove-vision-ai.zip and extract it to obtain tinyuf2-grove_vision_ai.bin file
Connect the board to the PC via a USB Type-C cable while holding down the Boot button on the board
Open previously installed Bouffalo Lab Dev Cube software, select BL702/704/706, and then click Finish
Go to MCU tab. Under Image file, click Browse and select the firmware you just downloaded.
Click Refresh, choose the port for the connected board, set Chip Erase to True, click Open UART, then click Create & Download and wait for the process to complete.
You will see the output as All Success if it went well.
Note: If the flashing throws an error, try to click Create & Download multiple times until you see the All Success message.
The board does not come with the right Edge Impulse firmware yet. To update the firmware:
Download the latest Edge Impulse firmware and extract it to obtain firmware.uf2 file
Connect the board again to the PC via USB Type-C cable and double-click the Boot button on the board to enter mass storage mode
After this you will see a new storage drive shown on your file explorer as GROVEAI. Drag and drop the firmware.uf2 file to GROVEAI drive
Once the copying is finished, the GROVEAI drive will disappear. This indicates that the copy was successful.
From a command prompt or terminal, run:
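For example, using the Edge Impulse CLI:

edge-impulse-daemon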
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. If you want to switch projects run the command with --clean
.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
With everything set up you can now build and run your first machine learning model with these tutorials:
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
After building the machine learning model and downloading the Edge Impulse firmware from Edge Impulse Studio, deploy the model uf2 to Grove - Vision AI by following steps 1 and 2 under Update Edge Impulse firmware section.
If you want to compile the Edge Impulse firmware from the source code, you can visit this GitHub repo and follow the instructions included in the README.
The model used for the official firmware can be found in this public project.
Sony's Spresense is a small, but powerful development board with a 6 core Cortex-M4F microcontroller and integrated GPS, and a wide variety of add-on modules including an extension board with headphone jack, SD card slot and microphone pins, a camera board, a sensor board with accelerometer, pressure, and geomagnetism sensors, and Wi-Fi board - and it's fully supported by Edge Impulse. You'll be able to sample raw data, build models, and deploy trained machine learning models directly from the studio.
To get started with the Sony Spresense and Edge Impulse you'll need:
The Spresense main development board - available for around 55 USD from a wide range of distributors.
The Spresense extension board - to connect external sensors.
A micro-SD card to store samples.
In addition you'll want some sensors; these are fully supported (note that you can collect data from any sensor on the Spresense with the data forwarder):
For image models: the Spresense CXD5602PWBCAM1 camera add-on.
For accelerometer models: the Spresense Sensor EVK-70 add-on.
For audio models: an electret microphone and a 2.2K Ohm resistor, wired to the extension board's audio channel A, following this schema (picture here).
Note: for audio models you must also have a FAT formatted SD card for the extension board, with the Spresense's DSP files included in a BIN
folder on the card, see instructions here and a screenshot of the SD card directory here.
The Edge Impulse firmware for this development board is open source and hosted on GitHub: edgeimpulse/firmware-sony-spresense.
To set this device up in Edge Impulse, you will need to install the following software:
On Linux:
GNU Screen: install for example via sudo apt install screen
.
Problems installing the CLI?
See the Installation and troubleshooting guide.
With all the software in place it's time to connect the development board to Edge Impulse.
Make sure the SD card is formatted as FAT before inserting it into the Spresense.
Use a micro-USB cable to connect the main development board (not the extension board) to your computer.
The development board does not come with the right firmware yet. To update the firmware:
Install Python 3.7 or higher.
Download the latest Edge Impulse firmware, and unzip the file.
Open the flash script for your operating system (flash_windows.bat
, flash_mac.command
or flash_linux.sh
) to flash the firmware.
Wait until flashing is complete. The on-board LEDs should stop blinking to indicate that the new firmware is running.
From a command prompt or terminal, run:
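Typically, with the Edge Impulse CLI installed:

edge-impulse-daemon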
Mac: Device choice
If you have a choice of serial ports and are not sure which one to use, pick /dev/tty.SLAB_USBtoUART or /dev/cu.usbserial-*
This will start a wizard which will ask you to log in and choose an Edge Impulse project. If you want to switch projects run the command with --clean
.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
With everything set up you can now build your first machine learning model with these tutorials:
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
If you see:
Upgrade pyserial:
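One way to do this (assuming pyserial was installed with pip for Python 3):

# Upgrade pyserial (pip-based install assumed)
pip3 install --upgrade pyserial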
If the edge-impulse-daemon
or edge-impulse-run-impulse
commands do not start, it might be because of an error interacting with the SD card or because your board has an old version of the bootloader. To see the debug logs, run:
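One way to view the serial output (a sketch assuming GNU Screen and the serial device name mentioned above; the baud rate is an assumption):

# Replace the device path with your board's serial port; 115200 baud is assumed
screen /dev/tty.SLAB_USBtoUART 115200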
And press the RESET button on the board. If you see Welcome to nash
you'll need to update the bootloader. To do so:
Install and launch the Arduino IDE.
Go to Preferences and under 'Additional Boards Manager URLs' add https://github.com/sonydevworld/spresense-arduino-compatible/releases/download/generic/package_spresense_index.json
(if there's already text in this text box, add a ,
before adding the new URL).
Then go to Tools > Boards > Board manager, search for 'Spresense' and click Install.
Select the right board via: Tools > Boards > Spresense boards > Spresense.
Select your serial port via: Tools > Port and selecting the serial port for the Spresense board.
Select the Spresense programmer via: Tools > Programmer > Spresense firmware updater.
Update the bootloader via Tools > Burn bootloader.
Then update the firmware again (from step 3: Update the bootloader and the firmware).
Seeed SenseCAP A1101 - LoRaWAN Vision AI Sensor is an image recognition AI sensor designed for developers. SenseCAP A1101 - LoRaWAN Vision AI Sensor combines TinyML AI technology and LoRaWAN long-range transmission to enable a low-power, high-performance AI device solution for both indoor and outdoor use.
This sensor features Himax high-performance, low-power AI vision solution which supports the Google TensorFlow Lite framework and multiple TinyML AI platforms.
It is fully supported by Edge Impulse which means you will be able to sample raw data from the camera, build models, and deploy trained machine learning models to the module directly from the studio without any programming required. SenseCAP - Vision AI Module is available for purchase directly from Seeed Studio Bazaar.
To set A1101 up in Edge Impulse, you will need to install the following software:
On Linux:
GNU Screen: install for example via sudo apt install screen.
Download the latest Bouffalo Lab Dev Cube-All-Platform
Problems installing the Edge Impulse CLI?
See the Installation and troubleshooting guide.
With all the software in place, it's time to connect the A1101 to Edge Impulse.
BL702 is the USB-UART chip which enables the communication between the PC and the Himax chip. You need to update this firmware in order for the Edge Impulse firmware to work properly.
Get the latest bootloader firmware (tinyuf2-sensecap_vision_ai_xxx.bin).
Connect the A1101 to the PC via a USB Type-C cable while holding down the Boot button on the A1101.
Open previously installed Bouffalo Lab Dev Cube software, select BL702/704/706, and then click Finish
Go to the MCU tab. Under Image file, click Browse and select the firmware you just downloaded.
Click Refresh, choose the port for the connected A1101, set Chip Erase to True, click Open UART, then click Create & Download and wait for the process to complete.
You will see the output as All Success if it went well.
If the flashing throws an error, click Create & Download multiple times until you see the All Success message.
A1101 does not come with the right Edge Impulse firmware yet. To update the firmware:
Download the latest Edge Impulse firmware and extract it to obtain firmware.uf2 file
Connect the A1101 again to the PC via USB Type-C cable and double-click the Boot button on the A1101 to enter mass storage mode
After this you will see a new storage drive shown on your file explorer as SENSECAP. Drag and drop the firmware.uf2 file to SENSECAP drive
Once the copying is finished, the SENSECAP drive will disappear. This indicates that the copy was successful.
From a command prompt or terminal, run:
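For example, with the Edge Impulse CLI installed:

edge-impulse-daemon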
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. If you want to switch projects run the command with --clean
.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your A1101, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
Device connected to Edge Impulse correctly!
With everything set up, you can now build and run your first machine learning model with these tutorials:
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
Frames from the onboard camera can be directly captured from the studio:
Finally, once a model is trained, it can be easily deployed to the A1101 – Vision AI Module to start inferencing!
After building the machine learning model and downloading the Edge Impulse firmware from Edge Impulse Studio, deploy the model uf2 to SenseCAP - Vision AI by following steps 1 and 2 under Update Edge Impulse firmware
Drag and drop the firmware.uf2 file from EDGE IMPULSE to SENSECAP drive.
When you run this on your local interface:
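This likely refers to running the Edge Impulse impulse runner in debug mode (assumed here), for example:

edge-impulse-run-impulse --debug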
it will ask you to click a URL, then you will see a live preview of the camera on your device.
If you want to compile the Edge Impulse firmware from the source code, you can visit this GitHub repo and follow the instructions included in the README.
The model used for the official firmware can be found in this public project.
In addition to connecting directly to a computer to view real-time detection data, you can also transmit the data over LoRaWAN® and upload it to the SenseCAP cloud platform or a third-party cloud platform. On the SenseCAP cloud platform, you can view the data periodically and display it graphically on your mobile phone or computer. The SenseCAP cloud platform and the SenseCAP Mate App use the same account system.
Since our focus here is on describing the model training process, we won't go into the details of the cloud platform data display. But if you're interested, you can always visit the SenseCAP cloud platform to try adding devices and viewing data. It's a great way to get a better understanding of the platform's capabilities!
You can get more information on how to use SenseCAP A1101 here
LoRaWAN® network coverage is required when using the sensors; there are two options.
Seeed provides:
SenseCAP M2 for Helium network
SenseCAP M2 Multi-Platform for standard LoRaWAN® networks
If you are interested, please click for more details.
Download SenseCAP Mate
Open SenseCAP Mate and log in
Under the Config screen, select Vision AI Sensor
Press and hold the configuration button on the SenseCAP A1101 for 3 seconds to enter Bluetooth pairing mode
Click Setup and it will start scanning for nearby SenseCAP A1101 devices
Go to Settings and make sure Object Detection and User Defined 1 are selected. If not, select them and click Send
Go to General and click Detect; you'll see the actual data here.
Click here to open a preview window of the camera stream
Click the Connect button. You will then see a pop-up in the browser. Select SenseCAP Vision AI - Paired and click Connect
View real-time inference results using the preview window!
The cats are detected with bounding boxes around them. Here "0" corresponds to each detection of the same class. If you have multiple classes, they will be named 0, 1, 2, 3, 4
and so on. The confidence score for each detected object (0.72 in the demo above) is also displayed!
The Synaptics Katana KA10000 board is a low-power AI evaluation kit from Synaptics that has the KA10000 AI Neural Network processor onboard. The evaluation kit is provided with a separate Himax HM01B0 QVGA monochrome camera module and 2 onboard zero-power Vesper microphones. The board has an embedded ST LIS2DW12 accelerometer and an optional TI OPT3001 ambient light sensor. Connectivity is provided by an IEEE 802.11n ultra-low-power WiFi module integrated with Bluetooth 5.x, in addition to 4 Peripheral Module (PMOD) connectors providing I2C, UART, GPIO, and I2S/SPI interfaces.
The package contains several accessories:
The Himax image sensor.
The PMOD-I2C USB firmware configuration board.
The PMOD-UART USB adapter.
2 AAA batteries.
Enclosure.
The Edge Impulse firmware for this board is open source and hosted on GitHub: .
To set this device up in Edge Impulse, you will need to install the following software:
On Linux:
GNU Screen: install for example via sudo apt install screen
.
In order to update the firmware, it is necessary to use the PMOD-I2C USB firmware configuration board. The PMOD-I2C board is connected to the Katana board on the north right PMOD-I2C interface (as shown in the image at the top of this page); then use a USB-C cable to connect the firmware configuration board to the host PC.
In addition to the PMOD-I2C configuration board, you need to connect the PMOD-UART extension to the Katana board, which is located on the left side of the board. Then use a micro-USB cable to connect the board to your computer.
The board is shipped originally with a sound detection firmware by default. You can upload new firmware to the flash memory by following these instructions:
Verify that you have correctly connected the firmware configuration board.
Run the flash script for your operating system (flash_windows.bat
, flash_mac.command
or flash_linux.sh
) to flash the firmware.
Wait until flashing is complete.
From a command prompt or terminal, run:
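Typically, using the Edge Impulse CLI:

edge-impulse-daemon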
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. If you want to switch projects run the command with --clean
.
With everything set up you can now build your first machine learning model with these tutorials, and board-specific public projects:
The ST IoT Discovery Kit (also known as the B-L475E-IOT01A) is a development board with a Cortex-M4 microcontroller, MEMS motion sensors, a microphone and WiFi - and it's fully supported by Edge Impulse. You'll be able to sample raw data, build models, and deploy trained machine learning models directly from the studio. It's available for around 50 USD from a variety of distributors including .
The Edge Impulse firmware for this development board is open source and hosted on GitHub: .
Two variants of this board
There are two variants of this board, the B-L475E-IOT01A1 (US region) and the B-L475E-IOT01A2 (EU region) - the only difference is the sub-GHz radio. Both are usable in Edge Impulse.
To set this device up in Edge Impulse, you will need to install the following software:
On Windows:
On Linux:
GNU Screen: install for example via sudo apt install screen
.
Problems installing the CLI?
With all the software in place it's time to connect the development board to Edge Impulse.
Use a micro-USB cable to connect the development board to your computer. There are two USB ports on the development board; use the one furthest from the buttons.
The development board does not come with the right firmware yet. To update the firmware:
The development board is mounted as a USB mass-storage device (like a USB flash drive), with the name DIS_L4IOT
. Make sure you can see this drive.
Drag the DISCO-L475VG-IOT01A.bin
file to the DIS_L4IOT
drive.
Wait until the LED stops flashing red and green.
From a command prompt or terminal, run:
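For example, using the Edge Impulse CLI installed earlier:

edge-impulse-daemon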
This will start a wizard which will ask you to log in, choose an Edge Impulse project, and set up your WiFi network. If you want to switch projects run the command with --clean
.
With everything set up you can now build your first machine learning model with these tutorials:
If you experience the following error when attempting to connect to a WiFi network:
If the LED does not flash red and green when you copy the .bin
file to the device and instead is a solid red color, and you are unable to connect the device with Edge Impulse, there may be an issue with your device's native firmware.
To restore functionality, use the following tool from ST to update your board to the latest version:
You might need to set up udev rules on Linux before being able to talk to the device. Create a file named /etc/udev/rules.d/50-stlink.rules
and add the following content:
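A minimal sketch of such rules, assuming the board's on-board ST-LINK programmer (the USB vendor/product IDs below are assumptions; verify yours with lsusb):

# ST-LINK/V2 (idProduct assumed 3748) and ST-LINK/V2-1 (idProduct assumed 374b)
SUBSYSTEMS=="usb", ATTRS{idVendor}=="0483", ATTRS{idProduct}=="3748", MODE:="0666"
SUBSYSTEMS=="usb", ATTRS{idVendor}=="0483", ATTRS{idProduct}=="374b", MODE:="0666"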
Then unplug the development board and plug it back in.
The Syntiant TinyML Board is a development board with a microphone and accelerometer, a USB host microcontroller and an always-on Neural Decision Processor™, featuring ultra-low-power consumption and a fully connected neural network architecture - and it's fully supported by Edge Impulse. You'll be able to sample raw data, build models, and deploy trained embedded machine learning models directly from the Edge Impulse studio to create the next generation of low-power, high-performance audio interfaces.
The Edge Impulse firmware for this development board is open source and hosted on .
IMU data acquisition - SD Card
An SD Card is required to use IMU data acquisition as the internal RAM of the MCU is too small. You don't need the SD Card for inferencing only or for audio projects.
To set this device up in Edge Impulse, you will need to install the following software:
Select one of the two firmware images below for audio or IMU projects:
Insert the SD card if you need IMU data acquisition and connect the USB cable to your computer. Double-click on the script for your OS. The script will flash the Arduino firmware and a default model on the NDP101 chip.
Flashing issues
0x000000: read 0x04 != expected 0x01
Some flashing issues can occur on the Serial Flash. In this case, open a Serial Terminal on the TinyML board and send the command: :F. This will erase the Serial Flash and should fix the flashing issue.
Connect the Syntiant TinyML Board directly to your computer's USB port. Linux, Mac OS, and Windows 10 platforms are supported.
Audio - USB microphone (macOS/Linux only)
Check that the Syntiant TinyML enumerates as "TinyML" or "Arduino MKRZero". For example, in Mac OS you'll find it under System Preferences/Sound:
Audio acquisition - Windows OS
Using the Syntiant TinyML board as an external microphone for data collection doesn't currently work on Windows OS.
IMU
From a command prompt or terminal, run:
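For example, with the Edge Impulse CLI installed:

edge-impulse-daemon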
This will start a wizard which will ask you to log in and choose an Edge Impulse project. If you want to switch projects run the command with --clean
.
With everything set up you can now build your first machine learning model and evaluate it using the Syntiant TinyML Board with this tutorial:
How to label my classes? The NDP101 chip expects one and only one negative class, and it should be the last in the list. For instance, if your original dataset looks like: yes, no, unknown, noise
and you only want to detect the keywords 'yes' and 'no', merge the 'unknown' and 'noise' labels into a single class such as z_openset
(we prefix it with 'z' in order to get this class last in the list).
The RP2040 is the debut microcontroller from Raspberry Pi - and it's fully supported by Edge Impulse. You'll be able to sample raw data, build models, and deploy trained machine learning models directly from the studio. It's available for around $4 from the Raspberry Pi Foundation and a wide range of distributors.
To get started with the Raspberry Pi RP2040 and Edge Impulse you'll need:
A . The pre-built firmware and Edge Impulse Studio exported binary are tailored for , but with a few simple steps you can collect the data and run your models with other RP2040-based boards, such as . For more details, check out .
(Optional) If you are using the , the makes it easier to connect external sensors for data collection/inference.
The Edge Impulse firmware for this development board is open source and hosted on GitHub: .
To set this device up in Edge Impulse, you will need to install the following software:
If you'd like to interact with the board using a set of pre-defined AT commands (not necessary for standard ML workflow), you will need to also install a serial communication program, for example minicom
, picocom
or use Serial Monitor from Arduino IDE (if installed).
On Linux:
GNU Screen: install for example via sudo apt install screen
.
Problems installing the CLI?
With all the software in place, it's time to connect the development board to Edge Impulse.
Use a micro-USB cable to connect the development board to your computer while holding down the BOOTSEL button, forcing the Raspberry Pi Pico into USB Mass Storage Mode.
The development board does not come with the right firmware yet. To update the firmware:
Drag the ei_rp2040_firmware.uf2
file from the folder to the USB Mass Storage device.
Wait until flashing is complete, unplug and replug in your board to launch the new firmware.
From a command prompt or terminal, run:
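Typically this is:

edge-impulse-daemon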
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. If you want to switch projects run the command with --clean
.
With everything set up you can now build your first machine learning model. Since the Raspberry Pi Pico does not have any built-in sensors, the pre-built firmware supports the following sensors out of the box:
Analog signal sensor (pin A0).
Once you have the compatible sensors, you can then follow these tutorials:
Support for Arduino RP2040 Connect was added to the official RP2040 firmware for Edge Impulse. That includes data acquisition and model inference support for:
onboard MP34DT05 microphone
onboard ST LSM6DSOX 6-axis IMU
the sensors described above can still be connected
While the RP2040 is a relatively new microcontroller, it has already been used to build several boards:
The official Raspberry Pi Pico RP2040
Arducam Pico4ML (Camera, screen and microphone)
Seeed Studio XIAO RP2040 (extremely small footprint)
Black Adafruit Feather RP2040 (built-in LiPoly charger)
And others. While the pre-built Edge Impulse firmware is mainly tested with the Pico board, it is compatible with other boards, with the exception of I2C sensors and the microphone: different boards use different pins for peripherals, so if you'd like to use the LSM6DS3/LSM6DSOX accelerometer & gyroscope modules or the microphone, you will need to change the pin values in the Edge Impulse RP2040 firmware source code, recompile it, and upload it to the board.
The is a development board equipped with the multiprotocol wireless CC1352P microcontroller. The Launchpad, when paired with the and booster packs, is fully supported by Edge Impulse, and is able to sample accelerometer & microphone data, build models, and deploy directly to the device without any programming required. The , , and boards are available for purchase directly from Texas Instruments.
If you don't have either booster pack or are using different sensing hardware, you can use the to capture data from any other sensor type, and then follow the tutorial to run your impulse. Or, you can clone and modify the open source project on GitHub.
The Edge Impulse firmware for this development board is open source and hosted on GitHub: .
To set this device up in Edge Impulse, you will need to install the following software:
Add the installation directory to your PATH
On Linux:
GNU Screen: install for example via sudo apt install screen
.
Problems installing the Edge Impulse CLI?
With all the software in place it's time to connect the development board to Edge Impulse.
To interface the Launchpad with sensor hardware, you will need to either connect the BOOSTXL-SENSORS to collect accelerometer data, or the CC3200AUDBOOST to collect audio data. Follow the guides below based on what data you want to collect.
Before you start
2. Connect the development board to your computer
Use a micro-USB cable to connect the development board to your computer.
3. Update the firmware
The development board does not come with the right firmware yet. To update the firmware:
Open the flash script for your operating system (flash_windows.bat
, flash_mac.command
or flash_linux.sh
) to flash the firmware.
Wait until flashing is complete, and press the RESET button once to launch the new firmware.
Problems flashing firmware onto the Launchpad?
4. Setting keys
From a command prompt or terminal, run:
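For example, using the Edge Impulse CLI:

edge-impulse-daemon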
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. If you want to switch projects run the command with --clean
.
Which device do you want to connect to?
The Launchpad enumerates two serial ports. The first is the Application/User UART, which the edge-impulse firmware communicates through. The other is an Auxiliary Data Port, which is unused.
When running the edge-impulse-daemon
you will be prompted on which serial port to connect to. On Mac & Linux, this will appear as:
Generally, select the lower numbered serial port. This usually corresponds with the Application/User UART. On Windows, the serial port may also be verified in the Device Manager.
5. Verifying that the device is connected
With everything set up you can now build and run your first machine learning model with these tutorials:
Failed to flash
If the UniFlash CLI is not added to your PATH, the install scripts will fail. To fix this, add the installation directory of UniFlash (example /Applications/ti/uniflash_6.4.0
on macOS) to your PATH on:
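For example, on macOS or Linux (using the example install location above; adjust the path to your own installation):

# Add the UniFlash install directory to PATH for the current shell session
export PATH="$PATH:/Applications/ti/uniflash_6.4.0"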
If during flashing you encounter further issues, ensure:
The device is properly connected and/or the cable is not damaged.
You have the proper permissions to access the USB device and run scripts. On macOS you can manually approve blocked scripts via System Preferences->Security Settings->Unlock Icon
On Linux, you may want to try copying tools/71-ti-permissions.rules to /etc/udev/rules.d/. Then re-attach the USB cable and try again.
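For example (run from the firmware repository root containing the tools/ folder; the udev reload step is an assumption, and re-attaching the cable may be enough):

sudo cp tools/71-ti-permissions.rules /etc/udev/rules.d/
sudo udevadm control --reload-rules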
.
, and unzip the file.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to , and click Devices. The device will be listed here.
Eggs AI:
Tutorial: Adding sight to your sensors (Synaptics KA10000):
Looking to connect different sensors? The lets you easily send data from any sensor into Edge Impulse.
).
- drivers for the development board. Run dpinst_amd64
on 64-bit Windows, or dpinst_x86
on 32-bit Windows.
See the guide.
.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to , and click Devices. The device will be listed here.
.
.
.
Looking to connect different sensors? The lets you easily send data from any sensor into Edge Impulse.
You have hit a known issue with the firmware for this development board's WiFi module that results in a timeout during network scanning if there are more than 20 WiFi access points detected. If you are experiencing this issue, you can work around it by attempting to reduce the number of access points within range of the device, or by skipping WiFi configuration.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to , and click Devices. The device will be listed here.
How to use the Arduino CLI with the macOS M1 chip? You will need to install Rosetta 2 to run the Arduino CLI. See details on .
.
See the guide.
, and unzip the file.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to , and click Devices. The device will be listed here.
(GP16; pin D16 on Grove Shield for Pi Pico).
(GP18; pin D18 on Grove Shield for Pi Pico).
(I2C0).
There is a vast variety of analog signal sensors that can take advantage of the RP2040's ADC (Analog to Digital Converter), from common ones such as light sensors and sound level sensors, to more specialized ones, e.g. , or even an .
.
.
.
.
Looking to connect different sensors? The lets you easily send data from any sensor into Edge Impulse.
.
Install the desktop version for your operating system
See for more details
See the guide.
The Launchpad jumper connections should be in their original configuration out of the box. If you have already modified the jumper connections, see the Launchpad's for the original configuration.
You will need five extra jumper wires to connect the CC3200AUDBOOST to the Launchpad, as described in the .
The CC3200AUDBOOST board requires modifications to interface properly with the CC1352P series of Launchpads. The full documentation regarding these modifications is available from Texas Instruments in their , and a summary of the steps to configure the board are shown below.
The pin connections shown below are required by TI to interface between the two boards. Connect the pins by using jumper wires and following the diagram. For more information see the CC3200AUDBOOST and
Perform all modifications to the Launchpad and audio booster pack described in the
, and unzip the file.
See the section for more information.
If a selected serial port fails to connect, test the other port before checking for other common issues.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to , and click Devices. The device will be listed here.
.
Looking to connect different sensors? The lets you easily send data from any sensor into Edge Impulse, and you can with custom firmware or sensor data.
Alternatively, the gcc/build/edge-impulse-standalone.out
binary file may be flashed to the Launchpad using the UniFlash GUI or web-app. See the for more info.