Community board
This is a community board by Arducam, and it's not maintained by Edge Impulse. For support head to the Arducam support page.
The Arducam Pico4ML TinyML Dev Kit is a development board from Arducam with an RP2040 microcontroller, QVGA camera, Bluetooth module (depending on your version), LCD screen, onboard microphone, accelerometer, gyroscope, and compass. Arducam has created in-depth tutorials on how to get started using the Pico4ML Dev Kit with Edge Impulse, including how to collect new data and how to train and deploy your Edge Impulse models to the Pico4ML. The Arducam Pico4ML TinyML Dev Kit comes in two versions: one with BLE and one without.
To set up your Arducam Pico4ML TinyML Dev Kit, follow this guide: Arducam: How to use Edge Impulse to train machine learning models for Raspberry Pico.
With everything set up you can now build your first machine learning model with the Edge Impulse continuous motion recognition tutorial.
Or you can follow Arducam's tutorial on How to build a Magic Wand with Edge Impulse for Arducam Pico4ML-BLE.
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
With the impulse designed, trained and verified you can deploy this model back to your Arducam Pico4ML TinyML Dev Kit. This makes the model run without an internet connection, minimizes latency, and runs with minimum power consumption. Edge Impulse can package the complete impulse - including the signal processing code, neural network weights, and classification code - up into a single library that you can run on your development board. See the end of Arducam's How to use Edge Impulse to train machine learning models for Raspberry Pico tutorial for more information on deploying your model onto the device.
Ambiq®, the leader in low-power System-on-Chip (SoC) solutions, has once again raised the bar with the Apollo4 Family of SoCs. With the lowest dynamic and sleep mode power on the market, the Apollo4 allows designers of next-generation edge AI devices to take their innovative products to the next level. Built upon Ambiq’s proprietary Subthreshold Power-Optimized Technology (SPOT®) platform, the Apollo4 Family of SoCs is a complete hardware and software solution that enables the battery-powered endpoint devices of tomorrow to achieve a higher level of intelligence without sacrificing battery life. Edge Impulse support is available on the Apollo4 Plus, Apollo4 Blue Plus, Apollo4 Lite, and Apollo4 Blue Lite. See below for how to get started on the Apollo4 Plus and Apollo4 Blue Plus SoCs.
The Apollo4 Plus is built on the 32-bit Arm® Cortex®-M4 core with Floating Point Unit (FPU). With up to 2 MB of NVM and 2.75 MB of SRAM, the Apollo4 Plus has more than enough compute and storage to handle complex algorithms and neural networks while displaying vibrant, crystal clear, and smooth graphics. If additional memory is required, an external memory is supported through Ambiq’s multi-bit SPI and eMMC interfaces. The Apollo4 Plus is purpose-built to serve as both an application processor and a co-processor for battery-powered endpoint devices, including predictive health and maintenance sensors, smart home devices, livestock trackers, wrist-based wearables, smart rings, smart voice devices, and more. The Apollo4 Plus is available now in BGA packaging. The Apollo4 Blue Plus incorporates an optional BLE 5.4 radio.
To set this device up in Edge Impulse, you will need to install the following software:
Problems installing the CLI?
See the Installation and troubleshooting guide.
This step is only needed when using models requiring microphone input, such as the example below. Skip this section if you are testing other models that do not need audio input.
Remove the pin header protectors on the Apollo4 Plus EVB (AMAP4PEVB) or Apollo4 Blue Plus KXR EVB (AMAP4BPXEVB), and carefully plug the Apollo4 Audio Add-on Board (AMA4AUD) into the development board. Pay special attention to the high-speed connector, which is covered with film tape that must be removed before connecting the boards. Place the digital and analog microphone modules onto the shield, as shown in the image below, and connect cables to both type-C connectors on the evaluation board.
This step is only needed when using models requiring camera input. Skip this section if you are testing other models that do not need camera input.
The ArduCam Mega 5MP SPI connects to the Apollo4 Plus EVB and Apollo4 Blue Plus KXR EVB pins as shown in the table below:
The wiring harness provided with the camera can be sensitive, so pin jumpers or another wiring harness may help.

Camera Pin | EVB Pin |
---|---|
GND | Any EVB GND |
5V/VDD | Any EVB 5V |
SCK | Pin 8 |
MISO | Pin 10 |
MOSI | Pin 9 |
CS | Pin 11 |
Pre-built image with only audio support and "Hello World" detector example here
Pre-built image with full video and audio data collection and FOMO Face detector example here
Get started by extracting the archive and choosing the appropriate script for your system architecture to flash the firmware:
From a command prompt or terminal, run:
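The command here is the Edge Impulse serial daemon (a minimal invocation, assuming the CLI from the previous step is installed):

```
edge-impulse-daemon
```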
This will start a wizard which will ask you to log in and choose an Edge Impulse project. If you want to switch projects run the command with --clean.
Alternatively, you can use the project API key: navigate to the Dashboard section in the left pane of your Studio project, select the Keys tab, click the copy icon next to the API Key to copy it to your clipboard, and then run:
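For example (the key shown is a placeholder; paste the value copied from your project's Keys tab):

```
edge-impulse-daemon --api-key ei_your_project_api_key
```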
Run the edge-impulse-daemon and connect to your project; you will be prompted to name your device:
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
Audio
With the device connected to Studio, you can use it to collect audio data up to 5 seconds in length for training and testing your model. Navigate to the Data acquisition tab and start collecting samples:
Daemon output during sampling:
Video
Sampling images:
Three image sizes are supported: 96x96, 128x128, and 160x160:
With everything set up you can now build your first machine learning model with these tutorials:
If you flashed the pre-built binary with the FOMO Face Detection example from above, then just connect your board and run edge-impulse-run-impulse to see an output similar to this:
Start by going to your Studio projects, create a new project, and navigate to the Create impulse section of Impulse design, at which point you will be prompted to select your target; choose the Apollo4 Plus:
Then add the DSP block:
Then the keyword spotting learn block:
And finally save the impulse:
Now select the DSP block:
And go to Generate features:
Click the button and wait for the job to finish; when it does you'll see something like this:
Select the learning block:
Then click Save & train and you'll eventually see an output like this:
Go to the Model testing section and enable int8 testing:
And run the test:
Navigate to the Deployment section and choose the Apollo4 Blue Plus:
Now click Build and wait for the job to finish; when it does, a zip archive will be downloaded to your computer:
See the previous section on flashing the board.
You can run your impulse by using edge-impulse-run-impulse:
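For example:

```
edge-impulse-run-impulse
```

For the keyword-spotting example you can typically append --continuous to classify a continuous audio stream.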
If you have problems with the flashing script, make sure you are using a USB cable that supports data transfer and not a power-only cable.
Reach out to us on the forum and have fun making machine learning models on the Apollo4 Family of SoCs from Ambiq!
The Portenta H7 is a powerful development board from Arduino with both a Cortex-M7 microcontroller and a Cortex-M4 microcontroller, a BLE/WiFi radio, and an extension slot to connect the Portenta vision shield - which adds a camera and dual microphones. The Portenta H7 and the vision shield are available directly from the Arduino Store for ~$150 in total.
There are two versions of the vision shield: one that has an Ethernet connection and one with a LoRa radio. Both of these can be used with Edge Impulse.
The Edge Impulse firmware for this development board is open source and hosted on GitHub: .
To set this device up in Edge Impulse, you will need to install the following software:
On Linux:
GNU Screen: install for example via sudo apt install screen.
Problems installing the CLI?
With all the software in place it's time to connect the development board to Edge Impulse.
Attach the vision shield using the two edge connectors on the back of the Portenta H7.
Use a USB-C cable to connect the development board to your computer. Then, double-tap the RESET button to put the device into bootloader mode. You should see the green LED on the front pulsating.
The development board does not come with the right firmware yet. To update the firmware:
Double press on the RESET button on your board to put it in the bootloader mode.
Open the flash script for your operating system (flash_windows.bat, flash_mac.command or flash_linux.sh) to flash the firmware.
Wait until flashing is complete, and press the RESET button once to launch the new firmware.
From a command prompt or terminal, run:
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. If you want to switch projects run the command with --clean.
With everything set up you can now build your first machine learning model with these tutorials:
Download your custom firmware from the Deployment tab in the Studio, install the firmware with the same method as in the "Update the firmware" section, and run the edge-impulse-run-impulse command:
Note that it may take up to 10 minutes to compile the firmware for the Arduino Portenta H7.
If you come across this issue:
You probably forgot to double press the RESET button before running the flash script.
Here's an instruction video for Windows.
The Arduino website has instructions for macOS and Linux.
See the Installation and troubleshooting guide.
Download the latest Edge Impulse firmware, and unzip the file.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
Use the Running your impulse locally: On your Arduino tutorial and select one of the Portenta examples:
For an end-to-end example that classifies data and then sends the result over LoRaWAN, please see the example.
The Arduino Nano 33 BLE Sense is a tiny development board with a Cortex-M4 microcontroller, motion sensors, a microphone and BLE - and it's fully supported by Edge Impulse. You'll be able to sample raw data, build models, and deploy trained machine learning models directly from the studio.
You can also use the Arduino Tiny Machine Learning Kit to run image classification models on the edge with the Arduino Nano and attached OV7675 camera module (or connect the hardware together via jumper wire and a breadboard if purchased separately).
Different Arduino Nano 33 BLE Sense Versions
Arduino has two different versions (known as "revisions") of the Arduino Nano 33 BLE Sense. Both use the nRF52840 as the processor, but the sensors are different. While the Edge Impulse firmware works with both versions, you need to be careful about choosing the correct version when working with the Arduino IDE.
You can tell which version of the Arduino Nano 33 BLE Sense you have by looking at the underside of the board. The first version will simply have NANO 33 BLE SENSE written in the silkscreen. The second version will have NANO 33 BLE SENSE REV2.
The Edge Impulse firmware for this development board is open source and hosted on GitHub: edgeimpulse/firmware-arduino-nano-33-ble-sense.
To set this device up in Edge Impulse, you will need to install the following software:
Here's an instruction video for Windows.
The Arduino website has instructions for macOS and Linux.
On Linux:
GNU Screen: install for example via sudo apt install screen.
Problems installing the CLI?
See the Installation and troubleshooting guide.
With all the software in place it's time to connect the development board to Edge Impulse.
Use a micro-USB cable to connect the development board to your computer. Then press RESET twice to launch into the bootloader. The on-board LED should start pulsating to indicate this.
The development board does not come with the right firmware yet. To update the firmware:
Download the latest Edge Impulse firmware, and unzip the file.
Open the flash script for your operating system (flash_windows.bat, flash_mac.command or flash_linux.sh) to flash the firmware.
Wait until flashing is complete, and press the RESET button once to launch the new firmware.
From a command prompt or terminal, run:
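As with the other boards, the command to run here is the serial daemon:

```
edge-impulse-daemon
```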
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. If you want to switch projects run the command with --clean.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
With everything set up you can now build your first machine learning model with these tutorials:
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
One option to deploy your model is to use the Arduino library option on the Deployment page in your Edge Impulse Studio project. This will combine your model with your chosen processing block and automatically download an Arduino library as a .zip file. In the Arduino IDE, select Sketch > Include Library > Add .ZIP Library... and select your downloaded .zip file.
Once the library finishes installing, you can select File > Examples > <name_of_your_project>_inferencing to see a list of available Arduino examples for the various supported boards. Notice that you have both Nano 33 Sense and Nano 33 Sense Rev2 options available.
The examples for camera, microphone, and microphone_fusion under nano_ble33_sense will work for both boards. You must choose the correct board revision (nano_ble33_sense or nano_ble33_sense_rev2) for the accelerometer, accelerometer_continuous, or fusion examples, as the accelerometer and environmental sensors are different between the board revisions.
These examples should give you a good starting place for developing your own edge ML applications on the Arduino. For example, if you train a keyword spotting model to identify the words "yes" and "no," you would deploy the model as an Arduino library and upload the nano_ble33_sense_microphone_continuous example to your Nano 33 BLE. Once uploaded, open the Serial Monitor to see the inference results printed out.
It probably means you don't have Rosetta 2 installed yet (which allows Intel-based apps to run on M1 chips).
The error looks like the following:
To install Rosetta 2 you can run this command:
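A minimal example (run from a macOS terminal; the --agree-to-license flag skips the interactive prompt):

```
softwareupdate --install-rosetta --agree-to-license
```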
You will need the following hardware:
Arduino Nano 33 BLE Sense board with headers.
OV7675 camera module.
Micro-USB cable.
Solderless breadboard and female-to-male jumper wires.
First, slot the Arduino Nano 33 BLE Sense board into a solderless breadboard:
With female-to-male jumper wire, use the following wiring diagram, pinout diagrams, and connection table to link the OV7675 camera module to the microcontroller board via the solderless breadboard:
Download the full pinout diagram of the Arduino Nano 33 BLE Sense here.
Finally, use a micro-USB cable to connect the Arduino Nano 33 BLE Sense development board to your computer.
Now build & train your own image classification model and deploy to the Arduino Nano 33 BLE Sense with Edge Impulse!
The Nicla Sense ME is a tiny, low-power tool that sets a new standard for intelligent sensing solutions. With the simplicity of integration and scalability of the Arduino ecosystem, the board combines four state-of-the-art sensors from Bosch Sensortec:
BHI260AP motion sensor system with integrated AI.
BMM150 magnetometer.
BMP390 pressure sensor.
BME688 4-in-1 gas sensor with AI and integrated high-linearity, as well as high-accuracy pressure, humidity and temperature sensors.
Designed to easily analyze motion and the surrounding environment – hence the “M” and “E” in the name – it measures rotation, acceleration, pressure, humidity, temperature, air quality and CO2 levels by introducing completely new Bosch Sensortec sensors on the market.
Its tiny size and robust design make it suitable for projects that need to combine sensor fusion and AI capabilities on the edge, thanks to a strong computational power and low-consumption combination that can even lead to standalone applications when battery-operated.
The Arduino Nicla Sense ME is available on the Arduino Store.
To set this device up in Edge Impulse, you will need to install the following software:
Here's an instruction video for Windows.
The Arduino website has instructions for macOS and Linux.
On Linux:
GNU Screen: install for example via sudo apt install screen.
Problems installing the CLI?
See the Installation and troubleshooting guide.
With all the software in place it's time to connect the development board to Edge Impulse.
Use a micro-USB cable to connect the development board to your computer.
The development board does not come with the right firmware yet. To update the firmware:
Open the nicla_sense_ingestion.ino sketch in a text editor or the Arduino IDE.
For data ingestion into your Edge Impulse project, at the top of the file, select one or multiple sensors by un-commenting the defines and set the desired sample frequency (in Hz). For example, for the Environmental sensors:
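The defines below are illustrative only - the exact macro names live at the top of nicla_sense_ingestion.ino and may differ in your copy of the sketch:

```cpp
// Hypothetical names for illustration; check nicla_sense_ingestion.ino for the real #defines.
#define SAMPLE_TEMPERATURE        // BME688 temperature
#define SAMPLE_HUMIDITY           // BME688 relative humidity
#define SAMPLE_PRESSURE           // BMP390 barometric pressure
// #define SAMPLE_ACCELEROMETER   // BHI260AP accelerometer (left disabled here)

#define SAMPLE_FREQUENCY_HZ 10    // desired sampling frequency in Hz
```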
Then, from your sketch's directory, run the Arduino CLI to compile:
Then flash to your Nicla Sense using the Arduino CLI:
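A minimal sketch of both steps, assuming the Arduino mbed core for Nicla boards is installed and the board enumerates as /dev/ttyACM0 (adjust the FQBN and port for your setup):

```
# compile the ingestion sketch from its directory
arduino-cli compile --fqbn arduino:mbed_nicla:nicla_sense .

# flash the compiled sketch to the Nicla Sense ME
arduino-cli upload --fqbn arduino:mbed_nicla:nicla_sense -p /dev/ttyACM0 .
```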
Wait until flashing is complete, and press the RESET button once to launch the new firmware.
From a command prompt or terminal, run:
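Because the ingestion sketch streams raw sensor values over serial, the tool to run here is the Data forwarder:

```
edge-impulse-data-forwarder
```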
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. You will also name your sensor's axes (depending on which sensor you selected in your compiled nicla_sense_ingestion.ino sketch). If you want to switch projects/sensors run the command with --clean. Please refer to the table below for the names used for each axis corresponding to the type of sensor:
Note: These exact axis names are required to run the Edge Impulse Arduino library deployment example applications for the Nicla Sense without any changes.
Else, when deploying the model, you will see an error like the following:
If your axis names are different, you can modify the eiSensors nicla_sensors[] array (near line 70) in the generated Arduino library sketch example to use your custom names, e.g.:
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
With everything set up you can now build your first machine learning model with the Edge Impulse continuous motion recognition tutorial.
Looking to connect different sensors? Use the nicla_sense_ingestion
sketch and the Edge Impulse Data forwarder to easily send data from any sensor on the Nicla Sense into your Edge Impulse project.
With the impulse designed, trained and verified you can deploy this model back to your Arduino Nicla Sense ME. This makes the model run without an internet connection, minimizes latency, and runs with minimum power consumption. Edge Impulse can package the complete impulse - including the signal processing code, neural network weights, and classification code - up into a single library that you can run on your development board.
Use the Running your impulse locally: On your Arduino tutorial and select one of the Nicla Sense examples.
The Nicla Vision is a ready-to-use, standalone camera for analyzing and processing images on the Edge. Thanks to its 2MP color camera, smart 6-axis motion sensor, integrated microphone, and distance sensor, it is suitable for asset tracking, object recognition, and predictive maintenance. Some of its key features include:
Powerful microcontroller equipped with a 2MP color camera
Tiny form factor of 22.86 x 22.86 mm
Integrated microphone, distance sensor, and intelligent 6-axis motion sensor
Onboard Wi-Fi and Bluetooth® Low Energy connectivity
Standalone when battery-powered
Expand existing projects with sensing capabilities
Enable fast Machine Vision prototyping
Compatible with Nicla, Portenta, and MKR products
Its exceptional capabilities are supported by a powerful STMicroelectronics STM32H747AII6 Dual ARM® Cortex® processor, combining an M7 core up to 480 MHz and an M4 core up to 240 MHz. Despite its industrial strength, it keeps energy consumption low for battery-powered standalone applications.
The Arduino Nicla Vision is available for around 95 EUR from the Arduino Store.
To set this device up in Edge Impulse, you will need to install the following software:
Here's an instruction video for Windows.
The Arduino website has instructions for macOS and Linux.
On Linux:
GNU Screen: install for example via sudo apt install screen.
Problems installing the CLI?
See the Installation and troubleshooting guide.
There are two ways to connect the Nicla Vision to Edge Impulse:
Using the official Edge Impulse firmware - it supports all onboard sensors, including camera.
Using an ingestion script. This supports analog, IMU, proximity sensors and microphone (limited to 8 kHz), but not the camera. It is only recommended if you want to modify the ingestion flow for third-party sensors.
Use a micro-USB cable to connect the development board to your computer. Under normal circumstances, the flash process should work without entering the bootloader manually. However, if you run into difficulties flashing the board, you can enter the bootloader by pressing RESET twice. The onboard LED should start pulsating to indicate this.
The development board does not come with the right firmware yet. To update the firmware:
Download the latest Edge Impulse firmware, and unzip the file.
Open the flash script for your operating system (flash_windows.bat, flash_mac.command or flash_linux.sh) to flash the firmware.
Wait until flashing is complete, and press the RESET button once to launch the new firmware.
From a command prompt or terminal, run:
This will start a wizard which will ask you to log in and choose an Edge Impulse project. If you want to switch projects run the command with --clean.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
Use a micro-USB cable to connect the development board to your computer.
The development board does not come with the right firmware yet. To update the firmware:
Download the latest Edge Impulse ingestion sketches and unzip the file.
Open the nicla_vision_ingestion.ino (for IMU/proximity sensor) or nicla_vision_ingestion_mic.ino (for microphone) sketch in a text editor or the Arduino IDE.
For IMU/proximity sensor data ingestion into your Edge Impulse project, at the top of the file, select one or multiple sensors by un-commenting the defines and set the desired sample frequency (in Hz). For example, for the accelerometer sensor:
For microphone data ingestion, you do not need to change the default parameters in the nicla_vision_ingestion_mic.ino sketch.
Then, from your sketch's directory, run the Arduino CLI to compile:
Then flash to your Nicla Vision using the Arduino CLI:
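A minimal sketch of the compile and flash steps, assuming the Arduino mbed core for Nicla boards is installed and the board enumerates as /dev/ttyACM0 (adjust the FQBN and port for your setup):

```
# compile the ingestion sketch from its directory
arduino-cli compile --fqbn arduino:mbed_nicla:nicla_vision .

# flash the compiled sketch to the Nicla Vision
arduino-cli upload --fqbn arduino:mbed_nicla:nicla_vision -p /dev/ttyACM0 .
```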
Alternatively if you open the sketch in the Arduino IDE, you can compile and upload the sketch from there.
Wait until flashing is complete, and press the RESET button once to launch the new firmware.
From a command prompt or terminal, run:
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. You will also name your sensor's axes (depending on which sensor you selected in your compiled nicla_vision_ingestion.ino sketch). If you want to switch projects/sensors run the command with --clean. Please refer to the table below for the names used for each axis corresponding to the type of sensor:
Note: These exact axis names are required for the Edge Impulse Arduino library deployment example applications for the Nicla Vision.
From a command prompt or terminal, run:
This will start a wizard which will ask you to log in and choose an Edge Impulse project. You will also name your sensor axes - in the case of the microphone, you need to enter audio. If you want to switch projects/sensors run the command with --clean.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
The above screenshots are for Edge Impulse Ingestion scripts and Data forwarder. If you use the official Edge Impulse firmware for the Nicla Vision, the content will be slightly different.
With everything set up you can now build your first machine learning model with these tutorials:
Looking to connect different sensors? Use the nicla_vision_ingestion.ino
sketch and the Edge Impulse data forwarder to easily send data from any sensor on the Nicla Vision into your Edge Impulse project.
With the impulse designed, trained and verified you can deploy this model back to your Arduino Nicla Vision. This makes the model run without an internet connection, minimizes latency, and runs with minimum power consumption. Edge Impulse can package the complete impulse - including the signal processing code, neural network weights, and classification code - up into a single library that you can run on your development board.
Use the Running your impulse locally: On your Arduino tutorial and select one of the Nicla Vision examples.
CY8CKIT-062S2 Pioneer Kit and CY8CKIT-028-SENSE expansion kit required
This guide assumes you have the CY8CKIT-028-SENSE expansion kit attached to a CY8CKIT-062S2 Pioneer Kit.
The Infineon CY8CKIT-062S2 Pioneer Kit enables the evaluation and development of applications using the PSoC 62 Series MCU. This low-cost hardware platform enables the design and debug of the PSoC 62 MCU and the Murata 1LV Module (CYW43012 Wi-Fi + Bluetooth Combo Chip). The PSoC 6 MCU is Infineon's latest, ultra-low-power PSoC specifically designed for wearables and IoT products. The board features a PSoC 6 MCU and a CYW43012 Wi-Fi/Bluetooth combo module. Infineon CYW43012 is a 28nm, ultra-low-power device that supports single-stream, dual-band IEEE 802.11n-compliant Wi-Fi MAC/baseband/radio and Bluetooth 5.0 BR/EDR/LE. When paired with the CY8CKIT-028-SENSE expansion kit, the PSoC® 62S2 Wi-Fi® BLUETOOTH® Pioneer Kit can be used to easily interface a variety of sensors with the PSoC™ 6 MCU platform, specifically targeted for audio and machine learning applications which are fully supported by Edge Impulse! You'll be able to sample raw data as well as build and deploy trained machine learning models to your PSoC® 62S2 Wi-Fi® BLUETOOTH® Pioneer Kit, directly from the Edge Impulse Studio.
The Edge Impulse firmware for this development board is open source and hosted on GitHub: .
To set this device up with Edge Impulse, you will need to install the following software:
Problems installing the CLI?
Then select the base firmware image file you downloaded in the first step above (i.e., the file named firmware-infineon-cy8ckit-062s2.hex). You can now press the Connect button to connect to the board, and finally the Program button to load the base firmware image onto the CY8CKIT-062S2 Pioneer Kit.
With all the software in place, it's time to connect the CY8CKIT-062S2 Pioneer Kit to Edge Impulse.
Use a micro-USB cable to connect the development board to your computer.
From a command prompt or terminal, run:
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. If you want to switch projects run the command with --clean.
With everything set up you can now build your first machine learning model with these tutorials:
CY8CKIT-062-BLE PSoC™ 6-BLE Pioneer Kit and CY8CKIT-028-EPD expansion kit required
This guide assumes you have the CY8CKIT-028-EPD expansion kit attached to a CY8CKIT-062-BLE Pioneer Kit.
The CY8CKIT-062-BLE PSoC™ 6-BLE Pioneer Kit is a hardware platform that enables the evaluation and development of applications using the PSoC™ 63 MCU with AIROC™ Bluetooth® LE. The PSoC 6 BLE Pioneer Kit, when paired with the E-ink display shield board (CY8CKIT-028-EPD), forms a powerful combination with its onboard sensors. The kit comes with an onboard thermistor, a 6-axis motion sensor, and a digital microphone. The PSoC 6 BLE Pioneer Kit baseboard also comes with 2 buttons, a 5-segment slider, and a proximity sensor based on CAPSENSE™ technology.
The Edge Impulse firmware for this development board is open source and hosted on GitHub: .
To set this device up with Edge Impulse, you will need to install the following software:
Problems installing the CLI?
Then select the base firmware image file you downloaded in the first step above (i.e., the file named firmware-infineon-cy8ckit-062-ble.hex). You can now press the Connect button to connect to the board, and finally the Program button to load the base firmware image onto the CY8CKIT-062-BLE Pioneer Kit.
With all the software in place, it's time to connect the CY8CKIT-062-BLE Pioneer Kit to Edge Impulse.
Use a micro-USB cable to connect the development board to your computer.
From a command prompt or terminal, run:
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. If you want to switch projects run the command with --clean.
To create an example project you must first open a new ModusToolBox application from the File menu
Then, you must choose which board support package (BSP) you wish to run your application on. Boards that are officially supported will have Edge Impulse examples.
Lastly, in the Project Creator window, you may select any Edge Impulse listings available for that product and click on Create. Please refer to the ModusToolBox help and tutorials for more information on running applications on your device.
With everything set up you can now build your first machine learning model with these tutorials:
Espressif ESP-EYE (ESP32) is a compact development board based on Espressif's ESP32 chip, equipped with a 2-Megapixel camera and a microphone. ESP-EYE also offers plenty of storage, with 8 MB PSRAM and 4 MB SPI flash - and it's fully supported by Edge Impulse. You'll be able to sample raw data, build models, and deploy trained machine learning models directly from the studio.
There are plenty of other boards built with the ESP32 chip - and of course there are custom designs utilizing an ESP32 SoM. The Edge Impulse firmware was tested with the ESP-EYE and ESP FireBeetle boards, but it is possible to modify the firmware to use it with other ESP32 designs. Read more on that in the notes further down this section.
The Edge Impulse firmware for this development board is open source and hosted on GitHub: .
To set this device up in Edge Impulse, you will need to install the following software:
Python 3.
On Linux:
GNU Screen: install for example via sudo apt install screen.
Problems installing the CLI?
With all the software in place it's time to connect the development board to Edge Impulse.
Use a micro-USB cable to connect the development board to your computer.
The development board does not come with the right firmware yet. To update the firmware:
Open the flash script for your operating system (flash_windows.bat, flash_mac.command or flash_linux.sh) to flash the firmware.
Wait until flashing is complete.
From a command prompt or terminal, run:
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. If you want to switch projects run the command with --clean.
With everything set up you can now build your first machine learning model with these tutorials:
The standard firmware supports the following sensors:
Camera: OV2640, OV3660, OV5640 modules from Omnivision
Microphone: I2S microphone on ESP-EYE (MIC8-4X3-1P0)
LIS3DHTR module connected to I2C (SCL pin 22, SDA pin 21)
Any analog sensor, connected to A0
ESP32 is a very popular chip both in community projects and in industry, due to its high performance, low price and large amount of documentation/support available. There are other camera-enabled development boards based on the ESP32 which can use the Edge Impulse firmware after applying certain changes, e.g.
AI-Thinker ESP-CAM
M5STACK ESP32 PSRAM Timer Camera X (OV3660)
M5STACK ESP32 Camera Module Development Board (OV2640)
Additionally, since the Edge Impulse firmware is open source and available to the public, if you have made modifications or added new sensor capabilities, we encourage you to make a PR in the firmware repository!
To deploy your impulse on your ESP32 board, please see:
The Himax WE-I Plus is a tiny development board with a camera, a microphone, an accelerometer and a very fast DSP - and it's fully supported by Edge Impulse. You'll be able to sample raw data, build models, and deploy trained machine learning models directly from the studio. It's available on .
The Edge Impulse firmware for this development board is open source and hosted on GitHub: .
To set this device up in Edge Impulse, you will need to install the following software:
On Linux:
GNU Screen: install for example via sudo apt install screen.
Problems installing the CLI?
With all the software in place it's time to connect the development board to Edge Impulse.
Use a micro-USB cable to connect the development board to your computer.
The development board does not come with the right firmware yet. To update the firmware:
Open the flash script for your operating system (flash_windows.bat, flash_mac.command or flash_linux.sh) to flash the firmware.
Wait until flashing is complete, and press the RESET button once to launch the new firmware.
From a command prompt or terminal, run:
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. If you want to switch projects run the command with --clean.
With everything set up you can now build your first machine learning model with these tutorials:
Community board
This is a community board by Blues Wireless, and is not maintained by Edge Impulse. For support head to the Blues Wireless support page.
The Blues Wireless Swan is a development board featuring a 120MHz ARM Cortex-M4 from STMicroelectronics with 2MB of flash and 640KB of RAM. Blues Wireless has created a guide on how to get started using the Swan with Edge Impulse, including how to collect new data from a triple-axis accelerometer and how to train and deploy your Edge Impulse models to the Swan. For more details and ordering information, visit the Blues Wireless Swan product page.
The guide walks you through creating a simple classification model with an accelerometer, designed to analyze movement over a brief period of time (2 seconds) and infer which of the following four states the motion corresponds to:
Idle (no motion)
Circle
Slash
An up-and-down motion in the shape of the letter "W"
With the impulse designed, trained and verified you can deploy this model back to your Blues Wireless Swan. This makes the model run without an internet connection, minimizes latency, and runs with minimum power consumption. Edge Impulse can package the complete impulse - including the signal processing code, neural network weights, and classification code - up into a single library that you can run on your development board. See the end of Blues Wireless' [Using Swan with Edge Impulse](https://dev.blues.io/swan/using-swan-with-edge-impulse) tutorial for more information on deploying your model onto the device.
The Nordic Semiconductor nRF52840 DK is a development board with a Cortex-M4 microcontroller, QSPI flash, and an integrated BLE radio - and it's fully supported by Edge Impulse. You'll be able to sample raw data, build models, and deploy trained machine learning models directly from the studio. As the nRF52840 DK does not have any built-in sensors, we recommend pairing this development board with the X-NUCLEO-IKS02A1 shield (with a MEMS accelerometer and a MEMS microphone).
If you don't have the X-NUCLEO-IKS02A1 shield you can use the Data forwarder to capture data from any other sensor, and then follow the Running your impulse locally tutorial to run your impulse. Or, you can modify the example firmware (based on nRF Connect) to interact with other accelerometers or PDM microphones that are supported by Zephyr.
The Edge Impulse firmware for this development board is open source and hosted on GitHub: .
To set this device up in Edge Impulse, you will need to install the following software:
On Linux:
GNU Screen: install for example via sudo apt install screen.
Problems installing the CLI?
With all the software in place it's time to connect the development board to Edge Impulse.
Remove the pin header protectors on the nRF52840 DK and plug the X-NUCLEO-IKS02A1 shield into the development board.
Note: Make sure that the shield does not touch any of the pins in the middle of the development board. This might cause issues when flashing the board or running applications.
Use a micro-USB cable to connect the development board to your computer. There are two USB ports on the development board, use the one on the short side of the board. Then, set the power switch to 'on'.
The development board does not come with the right firmware yet. To update the firmware:
The development board is mounted as a USB mass-storage device (like a USB flash drive), with the name JLINK. Make sure you can see this drive.
Drag the nrf52840-dk.bin file to the JLINK drive.
Wait 20 seconds and press the BOOT/RESET button.
From a command prompt or terminal, run:
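The command here is again the serial daemon:

```
edge-impulse-daemon
```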
This will start a wizard which will ask you to log in and choose an Edge Impulse project. If you want to switch projects run the command with --clean.
With everything set up you can now build your first machine learning model with these tutorials:
If you don't see the JLINK drive show up when you connect your nRF52840 DK you'll have to update the interface firmware.
Set the power switch to 'off'.
Hold BOOT/RESET while you set the power switch to 'on'.
Your development board should be mounted as BOOTLOADER.
After 20 seconds disconnect the USB cable, and plug the cable back in.
The development board should now be mounted as JLINK.
If your board fails to flash new firmware (a FAIL.txt file might appear on the JLINK drive) you can also flash using nrfjprog.
Flash new firmware via:
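A minimal sketch, assuming nrfjprog from the nRF Command Line Tools is on your PATH and the nrf52840-dk.bin image sits in the current directory (older nrfjprog releases may only accept .hex/.elf images, in which case convert the file first):

```
# erase the chip, program the Edge Impulse firmware, and reset the board
nrfjprog -f nrf52 --program nrf52840-dk.bin --chiperase --reset
```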
The Nordic Semiconductor nRF5340 DK is a development board with dual Cortex-M33 microcontrollers, QSPI flash, and an integrated BLE radio - and it's fully supported by Edge Impulse. You'll be able to sample raw data, build models, and deploy trained machine learning models directly from the studio. As the nRF5340 DK does not have any built-in sensors, we recommend pairing this development board with the X-NUCLEO-IKS02A1 shield (with a MEMS accelerometer and a MEMS microphone).
If you don't have the X-NUCLEO-IKS02A1 shield you can use the Data forwarder to capture data from any other sensor, and then follow the Running your impulse locally tutorial to run your impulse. Or, you can modify the example firmware (based on nRF Connect) to interact with other accelerometers or PDM microphones that are supported by Zephyr.
The Edge Impulse firmware for this development board is open source and hosted on GitHub: .
To set this device up in Edge Impulse, you will need to install the following software:
On Linux:
GNU Screen: install for example via sudo apt install screen.
Problems installing the CLI?
With all the software in place it's time to connect the development board to Edge Impulse.
Remove the pin header protectors on the nRF5340 DK and plug the X-NUCLEO-IKS02A1 shield into the development board.
Note: Make sure that the shield does not touch any of the pins in the middle of the development board. This might cause issues when flashing the board or running applications.
Use a micro-USB cable to connect the development board to your computer. There are two USB ports on the development board, use the one on the short side of the board. Then, set the power switch to 'on'.
The development board does not come with the right firmware yet. To update the Firmware + Networking Component:
Firmware + Networking Component - this firmware contains both the application and the networking core firmware component.
The development board is mounted as a USB mass-storage device (like a USB flash drive), with the name JLINK. Make sure you can see this drive.
Drag and drop the nrf5340-dk-full.hex firmware from the downloaded zip into the Programmer application (this firmware contains both application and networking core firmware).
Click “Erase & Write” and wait for the device to boot up.
From a command prompt or terminal, run:
This starts a wizard which asks you to log in and choose an Edge Impulse project. If you want to switch projects run the command with --clean.
The nRF5340 DK exposes multiple UARTs. If prompted, choose the bottom one:
With everything set up you can now build your first machine learning model with these tutorials:
If your board fails to flash new firmware (a FAIL.txt file might appear on the JLINK drive) you can also flash using nrfjprog.
Flash new firmware via:
The Nordic Semiconductor nRF9160 DK is a development board with an nRF9160 SIP incorporating a Cortex-M33 for your application, a full LTE-M/NB-IoT modem with GPS along with 1 MB of flash and 256 KB RAM. It also includes an nRF52840 board controller with Bluetooth Low Energy connectivity. The Development Kit is fully supported by Edge Impulse. You'll be able to sample raw data, build models, and deploy trained machine learning models directly from the studio. As the nRF9160 DK does not have any built-in sensors, we recommend pairing this development board with the X-NUCLEO-IKS02A1 shield (with a MEMS accelerometer and a MEMS microphone).
If you don't have the X-NUCLEO-IKS02A1 shield you can use the Data forwarder to capture data from any other sensor, and then follow the Running your impulse locally tutorial to run your impulse. Or, you can modify the example firmware (based on nRF Connect) to interact with other accelerometers or PDM microphones that are supported by Zephyr.
The Edge Impulse firmware for this development board is open source and hosted on GitHub: .
To set this device up in Edge Impulse, you will need to install the following software:
On Linux:
GNU Screen: install for example via sudo apt install screen.
Problems installing the CLI?
With all the software in place it's time to connect the development board to Edge Impulse.
Remove the pin header protectors on the nRF9160 DK and plug the X-NUCLEO-IKS02A1 shield into the development board.
Note: Make sure that the shield does not touch any of the pins in the middle of the development board. This might cause issues when flashing the board or running applications. You can also remove the shield before flashing the board.
Use a micro-USB cable to connect the development board to your computer. There are two USB ports on the development board, use the one on the short side of the board. Then, set the power switch to 'on'.
The development board does not come with the right firmware yet. To update the firmware:
The development board is mounted as a USB mass-storage device (like a USB flash drive), with the name JLINK. Make sure you can see this drive.
Flash the board controller, you only need to do this once. Go to step 4 if you've performed this step before.
Ensure that the PROG/DEBUG switch is in the nRF52 position.
Copy board-controller.bin to the JLINK mass storage device.
Flash the application:
Ensure that the PROG/DEBUG switch is in the nRF91 position.
Run the flash script for your Operating System.
Wait 20 seconds and press the BOOT/RESET button.
From a command prompt or terminal, run:
This starts a wizard which asks you to log in and choose an Edge Impulse project. If you want to switch projects run the command with --clean.
The nRF9160 DK exposes multiple UARTs. If prompted, choose the top one:
With everything set up you can now build your first machine learning model with these tutorials:
The Nordic Semiconductor nRF9151 DK is a development board with an nRF9151 SIP incorporating a Cortex-M33 for your application, a full LTE-M/NB-IoT and DECT NR+ modem with GPS along with 1 MB of flash and 256 KB RAM. The Development Kit is fully supported by Edge Impulse. You'll be able to sample raw data, build models, and deploy trained machine learning models directly from the studio. As the nRF9151 DK does not have any built-in sensors, we recommend pairing this development board with the X-NUCLEO-IKS02A1 shield (with a MEMS accelerometer and a MEMS microphone).
If you don't have the X-NUCLEO-IKS02A1 shield you can use the Data forwarder to capture data from any other sensor, and then follow the Running your impulse locally tutorial to run your impulse. Or, you can modify the example firmware (based on nRF Connect) to interact with other accelerometers or PDM microphones that are supported by Zephyr.
The Edge Impulse firmware for this development board is open source and hosted on GitHub: .
To set this device up in Edge Impulse, you will need to install the following software:
On Linux:
GNU Screen: install for example via sudo apt install screen.
Problems installing the CLI?
With all the software in place it's time to connect the development board to Edge Impulse.
Remove the pin header protectors on the nRF9151 DK and plug the X-NUCLEO-IKS02A1 shield into the development board.
Note: Make sure that the shield does not touch any of the pins in the middle of the development board. This might cause issues when flashing the board or running applications. You can also remove the shield before flashing the board.
Use a USB-C cable to connect the development board to your computer. Then, set the power switch to 'on'.
The development board does not come with the right firmware yet. To update the firmware:
The development board is mounted as a USB mass-storage device (like a USB flash drive), with the name JLINK. Make sure you can see this drive.
Flash the application by running the flash script for your Operating System.
Wait 20 seconds and press the BOOT/RESET button.
From a command prompt or terminal, run:
This starts a wizard which asks you to log in and choose an Edge Impulse project. If you want to switch projects run the command with --clean.
The nRF9151 DK exposes multiple UARTs. If prompted, choose the top one:
With everything set up you can now build your first machine learning model with these tutorials:
The Nordic Semiconductor nRF7002 DK is the development kit for the nRF7002 WiFi 6 companion IC. The kit contains everything you need to get started with your development on a single board. It features an nRF5340 multiprotocol System-on-a-Chip (SoC) as a host processor for the nRF7002 - and it is now supported by Edge Impulse.
The nRF7002 is a Wi-Fi 6 companion IC, providing seamless connectivity and Wi-Fi-based locationing (SSID sniffing of local Wi-Fi hubs). It is designed to be used alongside Nordic’s existing nRF52 and nRF53 Series Bluetooth SoCs, and nRF91 Series cellular IoT Systems-in-Package (SiPs). The nRF7002 can also be used in conjunction with non-Nordic host devices.
With its integration with Edge Impulse, you will be able to sample raw data, build models, and deploy trained machine learning models directly from the studio. As the nRF7002 DK does not have any built-in sensors, we recommend pairing this development board with the X-NUCLEO-IKS02A1 shield (with a MEMS accelerometer).
If you don't have access to the X-NUCLEO-IKS02A1 shield, you can use our Data forwarder to capture data from any other sensor, and then follow the Running your impulse locally tutorial to run your impulse. Or, you can modify the example firmware (based on nRF Connect) to interact with other accelerometers or PDM microphones that are supported by Zephyr.
The Edge Impulse firmware for this development board is open source and hosted on GitHub: .
To set this device up in Edge Impulse, you will need to install the following software:
On Linux:
GNU Screen: install for example via sudo apt install screen.
Note that for the nRF7002 DK, the required J-Link version is V7.94e
Problems installing the CLI?
With all the software in place it's time to connect the development board to Edge Impulse.
Remove the pin header protectors on the nRF7002 DK and plug the X-NUCLEO-IKS02A1 shield into the development board.
Note: Make sure that the shield does not touch any of the pins in the middle of the development board. This might cause issues when flashing the board or running applications.
Use a micro-USB cable to connect the development board to your computer. There are two USB ports on the development board, use the one on the short side of the board. Then, set the power switch on the bottom left to 'on'.
The development board does not come with the right firmware yet. To update the Firmware + Networking Component:
Firmware + Networking Component - this firmware contains both the application and the networking core firmware component.
The development board is mounted as a USB mass-storage device (like a USB flash drive), with the name JLINK. Make sure you can see this drive.
Drag and drop the nrf7002-dk-full.hex firmware from the downloaded zip into the Programmer application (this firmware contains both application and networking core firmware).
Click “Erase & Write” and wait for the device to boot up.
From a command prompt or terminal, run:
This starts a wizard which asks you to log in and choose an Edge Impulse project. If you want to switch projects run the command with --clean.
The nRF5340 DK exposes multiple UARTs. If prompted, choose the bottom one:
Once the nRF7002 DK is connected to a project in your profile, it will prompt you to set up a WiFi connection:
Select 'Yes' at this step to connect your device to your local WiFi network. After selecting 'Yes', the daemon will scan for WiFi networks in the vicinity and print out a list for you to choose from:
Navigate to your WiFi network to select it and enter the password when prompted. Your kit will then connect to the WiFi network and then to the project you selected in step 4.
Since your device is now connected via WiFi, you should be able to disconnect the daemon and collect your sensor data over the WiFi connection.
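To view inference results on the device, flash a deployment build that includes your model and run the impulse over the serial connection, typically with:

```
edge-impulse-run-impulse
```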
This will then display the inference results - in this case classifying the motion of the nRF7002 DK board - in the terminal:
Note: In order to receive and view these inference results, you will need to have the X-NUCLEO-IKS02A1 shield connected to the DK since there are no sensors on the DK board itself.
With everything set up you can now build your first machine learning model with this tutorial:
If your board fails to flash new firmware (a FAIL.txt
file might appear on the JLINK
drive) you can also flash using nrfjprog
.
Flash new firmware via:
A utility program we will use to flash firmware images onto the target.
The Edge Impulse CLI, which will enable you to connect your CY8CKIT-062S2 Pioneer Kit directly to Edge Impulse Studio, so that you can collect raw data and trigger in-system inferences.
See the Installation and troubleshooting guide.
Edge Impulse Studio can collect data directly from your CY8CKIT-062S2 Pioneer Kit and also help you trigger in-system inferences to debug your model, but in order to allow Edge Impulse Studio to interact with your CY8CKIT-062S2 Pioneer Kit you first need to flash it with our base firmware image.
Download the latest Edge Impulse firmware and unzip it to obtain the firmware-infineon-cy8ckit-062s2.hex file, which we will be using in the following steps.
Use a micro-USB cable to connect the CY8CKIT-062S2 Pioneer Kit to your development computer (where you downloaded and installed the programmer tool).
You can use the programmer tool to flash your CY8CKIT-062S2 Pioneer Kit with our base firmware image. To do this, first select your board from the dropdown list in the top left corner. Make sure to select the item that starts with CY8CKIT-062S2-43012:
Keep the programmer tool handy: it will be needed to upload any other project built on Edge Impulse, but the base firmware image only has to be loaded once.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices on the left sidebar. The device will be listed there:
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
A utility program we will use to flash firmware images onto the target.
The Edge Impulse CLI, which will enable you to connect your CY8CKIT-062-BLE Pioneer Kit directly to Edge Impulse Studio, so that you can collect raw data and trigger in-system inferences.
See the Installation and troubleshooting guide.
Edge Impulse Studio can collect data directly from your CY8CKIT-062-BLE Pioneer Kit and also help you trigger in-system inferences to debug your model, but in order to allow Edge Impulse Studio to interact with your CY8CKIT-062-BLE Pioneer Kit you first need to flash it with our base firmware image.
Download the latest Edge Impulse firmware and unzip it to obtain the firmware-infineon-cy8ckit-062-ble.hex file, which we will be using in the following steps.
Use a micro-USB cable to connect the CY8CKIT-062-BLE Pioneer Kit to your development computer (where you downloaded and installed the programmer tool).
You can use the programmer tool to flash your CY8CKIT-062-BLE Pioneer Kit with our base firmware image. To do this, first select your board from the dropdown list in the top left corner. Make sure to select the item that starts with CY8CKIT-062-BLE-XXXX:
Keep the programmer tool handy: it will be needed to upload any other project built on Edge Impulse, but the base firmware image only has to be loaded once.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices on the left sidebar. The device will be listed there:
Edge Impulse example projects may be found in the Project Creator. These examples allow you to quickly develop applications around machine learning models and the Edge Impulse SDK. If you need to update the model, you can download a new deployment from your project and unzip the resulting downloaded folder into your ModusToolBox application.
Firmware that is deployed via the Infineon PSoC 63 BLE Pioneer Kit option in the Deployment section of an Edge Impulse project comes with BLE connectivity. You can download the Infineon app for your device and connect. Please watch this short video as a demonstration.
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
The has instructions for macOS and Linux.
See the Installation and troubleshooting guide.
Download the latest Edge Impulse firmware, and unzip the file.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
The analog sensor and LIS3DHTR module were tested on ESP32 FireBeetle board and .
The pins used for the camera connection on different development boards are not the same; therefore you will need to change the #defines to fit your development board, then compile and flash the firmware. Specifically for the AI-Thinker ESP-CAM, since this board needs an external USB-to-TTL serial cable to upload the code and communicate with the board, the data transfer baud rate must be changed to 115200.
The analog sensor and LIS3DH accelerometer can be used on any other development board without changes, as long as the interface pins are not changed. If the I2C/ADC pins that the accelerometer/analog sensor are connected to are different from those described in the Sensors available section, you will need to change them in the LIS3DHTR component for ESP32, then compile and flash the firmware to your board.
Generate an (ESP32-EYE only)
Download a (using ESP-IDF)
Download an
See the Installation and troubleshooting guide.
Download the latest Edge Impulse firmware, and unzip the file.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
If you export to the Himax WE-I Plus you could receive the error: "All licenses are in use by other developers.". Unfortunately we have a limited number of licenses for the MetaWare compiler and these are shared between all Studio users. Try again in a little bit, or export your project as a C++ Library, add it to the project and compile locally.
If no device shows up in your OS (i.e. COMxx, /dev/tty.usbxx) after connecting the board, and your USB cable supports data transfer, you may need to install the appropriate USB-to-serial drivers.
To set up your Blues Wireless Swan, follow this complete guide: .
The Blues Wireless Swan guide will walk you through how to create a simple classification model with an accelerometer, designed to analyze movement over a brief period of time (2 seconds) and infer which of the following four states the motion corresponds to:
For more insight into using a triple axis accelerometer to build an embedded machine learning model visit the .
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
.
See the Installation and troubleshooting guide.
If this is not the case, see the troubleshooting section at the bottom of this page.
.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
.
.
.
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
Download the latest Edge Impulse firmware and drag the .bin file onto the BOOTLOADER drive.
Install the .
.
See the Installation and troubleshooting guide.
.
Install and open nRF Connect for Desktop and go to the Programmer application.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
.
.
.
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
Install the .
.
See the Installation and troubleshooting guide.
Install the .
.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
.
.
.
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
.
See the Installation and troubleshooting guide.
The nRF9151 DK can be configured with the Board Configurator tool that is part of nRF Connect for Desktop. All information on how this tool works and how to install it can be found in its documentation. For our application, the board needs to have the following configuration:
Install the .
.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
.
.
.
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
.
Install for your device.
See the Installation and troubleshooting guide.
.
Install and open nRF Connect for Desktop and go to the Programmer application.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
When using the nRF7002 DK, you will not be able to connect to the Nordic nRF Edge Impulse app on your phone. The best way to flash your model is by navigating to the Deployment tab of your project in the studio on your PC and downloading the built firmware from there. You can follow the instructions in the Nordic docs to flash the model onto your device.
The default firmware for the nRF7002 DK provided above ships with a default motion detection model. This model is created from the continuous motion recognition tutorial and its corresponding Edge Impulse project. To see the inferencing results of this model, reconnect the device to your computer with a USB cable and run:
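This is the impulse runner from the Edge Impulse CLI, installed alongside the daemon:

    edge-impulse-run-impulse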
.
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
Install the .
Sensor | Axis names
---|---
#define SAMPLE_ACCELEROMETER | accX, accY, accZ
#define SAMPLE_GYROSCOPE | gyrX, gyrY, gyrZ
#define SAMPLE_ORIENTATION | heading, pitch, roll
#define SAMPLE_ENVIRONMENTAL | temperature, barometer, humidity, gas
#define SAMPLE_ROTATION_VECTOR | rotX, rotY, rotZ, rotW
Sensor | Axis names
---|---
#define SAMPLE_ACCELEROMETER | accX, accY, accZ
#define SAMPLE_GYROSCOPE | gyrX, gyrY, gyrZ
#define SAMPLE_PROXIMITY | cm
The Nordic Semiconductor Thingy:91 is an easy-to-use battery-operated prototyping platform for cellular IoT using LTE-M, NB-IoT and GPS. It is ideal for creating Proof-of-Concept (PoC), demos and initial prototypes in your cIoT development phase. Thingy:91 is built around the nRF9160 SiP and is certified for a broad range of LTE bands globally, meaning the Nordic Thingy:91 can be used just about anywhere in the world. There is an nRF52840 multiprotocol SoC on the Thingy:91. This offers the option of adding Bluetooth Low Energy connectivity to your project.
Nordic's Thingy:91 is fully supported by Edge Impulse. You'll be able to sample raw data, build models, and deploy trained machine learning models directly from the studio.
The Edge Impulse firmware for this development board is open source and hosted on GitHub: edgeimpulse/firmware-nordic-thingy91.
To set this device up in Edge Impulse, you will need to install the following software:
On Linux:
GNU Screen: install for example via sudo apt install screen
.
Problems installing the CLI?
See the Installation and troubleshooting guide.
Before you start a new project, you need to update the Thingy:91 firmware to our latest build.
Use a micro-USB cable to connect the development board to your computer. Then, set the power switch to 'on'.
Download the latest Edge Impulse firmware. The extracted archive contains the following files:
firmware.hex
: the Edge Impulse firmware image for the nRF9160 SoC, and
connectivity-bridge.hex
: a connectivity application for the nRF52840 that you only need on older boards (hardware version < 1.4)
Open nRF Connect for Desktop and launch the Programmer application.
Scroll down in the menu on the right and make sure Enable MCUboot is selected.
Switch off the Nordic Thingy:91.
Press the multi-function button (SW3) while switching SW1 to the ON position.
In the Programmer navigation bar, click Select device.
In the menu on the right, click Add HEX file > Browse, and select the firmware.hex file from the firmware previously downloaded.
Scroll down in the menu on the right to Device and click Write:
In the MCUboot DFU window, click Write. When the update is complete, a Completed successfully message appears.
You can now disconnect the board.
Thingy:91 hardware version < 1.4.0
Updating the firmware with older hardware versions may fail. Moreover, even if the update works, the device may later fail to connect to Edge Impulse Studio:
In these cases, you will also need to flash the connectivity-bridge.hex
onto the nRF52840 in the Thingy:91. Follow the steps here to update the nRF52840 SOC application with the connectivity-bridge.hex
file through USB.
If this method doesn't work, you will need to flash both hex files using an external probe.
With all the software in place it's time to connect the development board to Edge Impulse. From a command prompt or terminal, run:
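The command is the Edge Impulse CLI device daemon, installed in the software setup above:

    edge-impulse-daemon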
This starts a wizard which asks you to log in and choose an Edge Impulse project. If you want to switch projects run the command with --clean
.
The Thingy:91 exposes multiple UARTs. If prompted, choose the first one:
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
With everything set up you can now build your first machine learning model with this tutorial:
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
The Nordic Semiconductor nRF9161 DK is a development board with an nRF9161 SIP incorporating a Cortex M33 for your application, a full LTE-M/NB-IoT and DECT NR+ modem with GPS along with 1 MB of flash and 256 KB RAM. The Development Kit is fully supported by Edge Impulse. You'll be able to sample raw data, build models, and deploy trained machine learning models directly from the studio. As the nRF9161 DK does not have any built-in sensors we recommend you to pair this development board with the X-NUCLEO-IKS02A1 shield (with a MEMS accelerometer and a MEMS microphone).
If you don't have the X-NUCLEO-IKS02A1 shield you can use the Data forwarder to capture data from any other sensor, and then follow the Running your impulse locally: On your Zephyr-based Nordic Semiconductor development board tutorial to run your impulse. Or, you can modify the example firmware (based on nRF Connect) to interact with other accelerometers or PDM microphones that are supported by Zephyr.
The Edge Impulse firmware for this development board is open source and hosted on GitHub: edgeimpulse/firmware-nrf-9161.
To set this device up in Edge Impulse, you will need to install the following software:
On Linux:
GNU Screen: install for example via sudo apt install screen
.
Problems installing the CLI?
See the Installation and troubleshooting guide.
With all the software in place it's time to connect the development board to Edge Impulse.
Remove the pin header protectors on the nRF9161 DK and plug the X-NUCLEO-IKS02A1 shield into the development board.
Note: Make sure that the shield does not touch any of the pins in the middle of the development board. This might cause issues when flashing the board or running applications. You can also remove the shield before flashing the board.
Use a USB-C cable to connect the development board to your computer. Then, set the power switch to 'on'.
The nRF9161 DK can be configured with the Board Configurator tool that is part of nRF Connect for Desktop. All information on how this tool works and how to install it can be found on the documentation page. For our application, the board needs to have the following configuration:
The development board does not come with the right firmware yet. To update the firmware:
The development board is mounted as a USB mass-storage device (like a USB flash drive), with the name JLINK
. Make sure you can see this drive.
Install the nRF Command Line Tools.
Flash the application by running the flash script for your Operating System.
Wait 20 seconds and press the BOOT/RESET button.
From a command prompt or terminal, run:
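As with the other boards in this guide, the command is the Edge Impulse CLI device daemon:

    edge-impulse-daemon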
This starts a wizard which asks you to log in and choose an Edge Impulse project. If you want to switch projects run the command with --clean
.
The nRF9161 DK exposes multiple UARTs. If prompted, choose the top one:
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
With everything set up you can now build your first machine learning model with these tutorials:
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
The Nordic Thingy:53™ is an easy-to-use prototyping platform; it makes it possible to create prototypes and proofs-of-concept without the need to build custom hardware. Thingy:53 is built around the nRF5340 SoC. The capacity of its dual Arm Cortex-M33 processors enables it to do embedded machine learning (ML), both collecting data and running trained ML models on the device. The Bluetooth Low Energy radio allows it to connect to smart phones, tablets, laptops and similar devices, without the need for a wired connection. Other protocols like Thread, Zigbee and proprietary 2.4 GHz protocols are also supported by the radio. It also includes a wide range of integrated sensors, an NFC antenna, and two buttons and one RGB LED that simplify input and output.
Nordic's Thingy:53 is fully supported by Edge Impulse and every Thingy:53 is shipped with Edge Impulse firmware already flashed. You'll be able to sample raw data, build models, and deploy trained machine learning models directly out-of-the-box via the Edge Impulse Studio or the Nordic nRF Edge Impulse iPhone and Android apps over BLE connection.
The Edge Impulse firmware for this development board is open source and hosted on GitHub: edgeimpulse/firmware-nordic-thingy53.
To set this device up in Edge Impulse via USB serial or external debug probe, you will need to install the following software:
nRF Connect for Desktop v3.11.1 (only needed to update device firmware through USB or external debug probe).
On Linux:
GNU Screen: install for example via sudo apt install screen
.
Problems installing the CLI?
See the Installation and troubleshooting guide.
Brand new Thingy:53 devices work out-of-the-box with the Edge Impulse Studio and the Nordic nRF Edge Impulse iPhone and Android apps. However, if your device has been flashed with some other firmware, follow the steps below to update your device to the latest Edge Impulse firmware.
Use a USB cable to connect the development board to your computer. Then, set the power switch to 'on'.
Download the latest Edge Impulse firmware:
Edge Impulse firmware: nordic-thingy53-full.zip
*-full.zip
contains HEX files to upgrade the device through the external probe.
Edge Impulse firmware: nordic-thingy53-dfu.zip
*-dfu.zip
contains dfu_application.zip
package to upgrade the already flashed device through the Serial/USB bootloader.
Follow Nordic's instructions to update the firmware on the Thingy:53 through your choice of debugging connection:
See the section below on Connecting to the nRF Edge Impulse mobile application.
With all the software in place it's time to connect the development board to Edge Impulse. From a command prompt or terminal, run:
This starts a wizard which asks you to log in and choose an Edge Impulse project. If you want to switch projects run the command with --clean
.
If prompted to select a device, choose ZEPHYR
:
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
With everything set up you can now build your first machine learning model with this tutorial:
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
Now that you have created an Edge Impulse account and trained your first Edge Impulse machine learning model, using the Nordic nRF Edge Impulse app you can deploy your impulse to your Nordic Thingy:53 and acquire/upload new sensor data into your Edge Impulse projects.
Select the Devices tab to connect to your Thingy:53 device to your mobile phone:
To remove your connected Thingy:53 from your project, select the connected device name and scroll to the bottom of the device page to remove it.
To view existing data samples in your Edge Impulse project, select the Data Acquisition tab. To record and upload a new data sample into your project, click on the "+" button at the top right of the app. Select your sensor, type in the sample label, and choose a sample length and frequency, then select Start Sampling.
Build and deploy your Edge Impulse model to your Thingy:53 via the Deployment tab. Select your project from the top drop-down, select your connected Thingy:53 device, and click Build:
The app will start building your project and uploading the firmware to the connected Thingy:53:
If you encounter connection errors during deployment, please see Troubleshooting.
Every Thingy:53 is shipped with a default Edge Impulse model. This model is created from the Tutorial: Continuous motion recognition and its corresponding Edge Impulse project.
Select the Inferencing tab to view the inference results of the model flashed to the connected Thingy:53:
The Nordic Thingy53 can also be used with the nRF7002eb expansion board as shown below.
The nRF7002eb is a companion IC, providing seamless WiFi connectivity and WiFi-based locationing (SSID sniffing of local WiFi hubs). With WiFi 6 the nRF7002eb brings added benefits to IoT applications including further efficiency gains that support long-life, battery-powered WiFi operation.
With this expansion board, you will be able to collect and upload data from your Thingy53 to your application over a WiFi connection.
The WiFi capabilities of the Thingy53 are sandboxed in a different firmware. This helps users to choose whether they want to use the WiFi module or not and prevents consumption of extra memory if they choose not to. Therefore, to enable the WiFi capabilities of Thingy53, download the following Edge Impulse firmware:
Edge Impulse firmware: nordic-thingy53-WiFi-full.zip
*-full.zip
contains HEX files to upgrade the device through the external probe.
Edge Impulse firmware: nordic-thingy53-WiFi-dfu.zip
*-dfu.zip
contains dfu_application.zip
package to upgrade the already flashed device through the Serial/USB.
Connect the Thingy53 to your computer with a USB-C cable and update the firmware following instructions described in section 3 of updating the firmware.
Once the firmware has been updated, you will need to set up the WiFi connection between the Thingy53 and your WiFi network. Make sure that the nRF7002eb WiFi module is plugged into the Thingy53 as shown in the image above. To set up the WiFi connection, simply run:
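This is the same Edge Impulse CLI daemon used for a serial connection:

    edge-impulse-daemon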
This starts a wizard that helps you login to your Edge Impulse account and choose a project you want to connect your device to.
Note: If you want to switch accounts, projects or WiFi network, run the command with
--clean
When prompted to select a device, choose the option with the higher USB modem number:
The wizard will now proceed to read the configuration of the device. If no WiFi connection is found, you will be prompted to connect to one. After you select Yes
, it will proceed to scan for available WiFi networks:
Once scanning is complete, it will show you a list of available networks. You can use the arrow keys to select your network and proceed to enter the password. Once the Thingy53 is connected to the WiFi, the daemon will automatically disconnect as there's no need to keep a serial connection open.
You can now disconnect the USB-C cable and remove the physical connection of the Thingy53 to your computer.
Now your device should be connected to the project you chose during the initial setup. To verify that the device is connected, navigate to the Devices tab of your project. The connected device should be listed here:
You can now move to the Data acquisition tab of your project and start collecting data without being restricted to where your computer is.
When using the nRF7002eb expansion board, you will not be able to connect to the Nordic nRF Edge Impulse app on your phone. The best way to flash your model is by navigating to the Deployment tab of your project and downloading the built firmware from there. You can follow the instructions in section 3 of updating the firmware to flash your model onto your device.
The firmware for the Thingy53 provided above ships with a default motion detection model. This model is created from the Tutorial: Continuous motion recognition and its corresponding Edge Impulse project. To see the inferencing results of this model, reconnect the device to your computer with a USB-C cable and run:
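The impulse runner from the Edge Impulse CLI is used here:

    edge-impulse-run-impulse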
This will then display the inference results in the terminal, in this case the classified motion of the Thingy53:
The integration of the nRF7002eb with Edge Impulse allows users to integrate advanced machine learning models, enabling smarter and more responsive IoT applications with even more ease. The synergy between the nRF7002eb EK and Edge Impulse paves the way for innovative applications in areas such as predictive maintenance, anomaly detection, and real-time data analysis.
Select the Settings tab to view your logged-in account information, BLE scanner settings, and application version. Click on your account name to view your Edge Impulse projects and logout of your account.
Lost BLE connection to device
Reconnect your device by selecting your device name on the Devices tab and click "Reconnect".
Make sure power cables are plugged in properly.
Do not use iPhone/Android app multitasking during data acquisition, firmware deployment, or inferencing tasks, as the BLE streaming connection will be closed.
Switch WiFi network or project
Reconnect your device to your computer using a USB-C cable.
Run edge-impulse-daemon --clean
. End the process by pressing CTRL+c
on your keyboard, do not login at this step.
Disconnect the Thingy53 and restart using the switch on the side.
Reconnect to your computer and run edge-impulse-daemon
. Follow the instructions above to choose a different project or WiFi network.
Community board
This is a community board by RAKwireless and is not maintained by Edge Impulse. For support, head to the RAKwireless homepage or the RAKwireless forums.
The RAKwireless WisBlock is a modular development system that lets you combine different cores and sensors to easily construct your next Internet of Things (IoT) device. The following WisBlock cores work with Edge Impulse:
RAK11200 (ESP32)
RAK4631 (nRF52840)
RAK11310 (RP2040)
RAKwireless has created an in-depth tutorial on how to get started using the WisBlock with Edge Impulse, including collecting raw data from a 3-axis accelerometer or a microphone, training a machine learning model, and deploying the model to the WisBlock core.
A WisBlock starter kit can be found in the RAKwireless store.
Install the following software:
Follow the guide for your particular core to collect data, train a machine learning model, and deploy it to your WisBlock:
By the end of the guide, you should have machine learning inference running locally on your WisBlock!
The Boron is a powerful cellular enabled development kit.
Equipped with the Nordic nRF52840 and a u-blox SARA U201 (2G/3G) or R410M/R510S LTE Cat M1 module, the Boron has built-in battery charging circuitry that makes it easy to connect a Li-Po battery, plus 20 mixed-signal GPIOs to interface with sensors, actuators, and other electronics.
The Boron is great for connecting existing projects to the Particle Device Cloud or as a gateway to connect an entire group of local endpoints where Wi-Fi is missing or unreliable.
To set this device up in Edge Impulse, you will need to install the following software:
Particle Workbench (Optional, only required if deploying to Particle Library)
Problems installing the CLI?
See the Installation and troubleshooting guide.
Connect the ADXL345 to the Boron as follows:
Create an Edge Impulse account if you haven't already. Also, be sure to setup the device per the instructions above.
Build your first machine learning model with this tutorial:
If you choose to deploy your project as a Particle Library rather than a binary, follow these steps to flash your firmware from Particle Workbench:
Open a new VS Code window, ensure that Particle Workbench has been installed (see above)
Use VS Code Command Palette and type in Particle: Import Project
Select the project.properties
file in the directory that you just downloaded and extracted from the section above.
Use VS Code Command Palette and type in Particle: Configure Project for Device
Select deviceOS@5.5.0
Choose a target (e.g. P2; this option is also used for the Boron).
You may sometimes need to manually put your device into DFU mode. You may proceed to the next step, but if you get an error indicating "No DFU capable USB device available", then follow these steps:
Hold down both the RESET and MODE buttons.
Release only the RESET button, while holding the MODE button.
Wait for the LED to start flashing yellow.
Release the MODE button.
Compile and Flash in one command with: Particle: Flash application & DeviceOS (local)
Local Compile Only! At this time you cannot use the Particle: Cloud Compile or Particle: Cloud Flash options; local compilation is required.
The following video demonstrates how to collect raw data from an accelerometer and develop an application around the Edge Impulse inferencing library with the Boron.
Flashing your Particle device requires the Particle command line tool. Follow these instructions to install the tools.
Navigate to the directory where your Boron firmware downloaded and decompress the zip file. Open a terminal and use the following command to flash your device:
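A typical invocation with the Particle CLI looks like this; the .bin filename below is only an example, so substitute the binary name from your downloaded archive (the device may need to be in DFU mode, see above):

    particle flash --usb firmware.bin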
The OpenMV Cam is a small and low-power development board with a Cortex-M7 microcontroller supporting MicroPython, a μSD card socket and a camera module capable of taking 5MP images - and it's fully supported by Edge Impulse. You'll be able to sample raw data, build models, and deploy trained machine learning models through the studio and the OpenMV IDE.
To set this device up in Edge Impulse, you will need to install the following software:
Problems installing the CLI?
See the installation and troubleshooting guide.
With all the software in place it's time to connect the development board to Edge Impulse. To make this easy we've put some tutorials together which take you through all the steps to acquire data, train a model, and deploy this model back to your device.
Adding sight to your sensors - end-to-end tutorial.
Collecting image data with the OpenMV Cam H7 Plus - collecting datasets using the OpenMV IDE.
Running your impulse on your OpenMV camera - run your trained impulse on the OpenMV Cam H7 Plus.
The Photon 2 with Edge ML Kit is a development system with Wi-Fi networking, built around the Realtek RTL8721DM MCU with an Arm Cortex-M33 core. The form factor is similar to the Argon (Adafruit Feather), but the Photon 2 supports 2.4 GHz and 5 GHz Wi-Fi and BLE, and has much larger RAM and flash that can support larger applications. Included in the kit are sensors used for embedded machine learning inferencing.
It is intended to replace both the Photon and Argon modules. It contains the same module as the P2, making it easier to migrate from a pin-based development module to a SMD mass-production module if desired.
To set this device up in Edge Impulse, you will need to install the following software:
Particle Workbench (Optional, only required if deploying to Particle Library)
Problems installing the CLI?
See the Installation and troubleshooting guide.
Connect the ADXL362 to the Photon 2 as follows:
Connect the microphone to the Photon 2 as follows:
Plug in the USB Cable to the device
Working directly with the device through the Particle Library deployment option involves the use of the Particle Workbench in VS Code, but if you simply want to start gathering data for a project you only need to install the Edge Impulse CLI and flash the following firmware to your device with your sensor(s) connected as described in the section below.
Alternatively you can clone the Particle Firmware repo and build the firmware locally.
Flashing your Particle device requires the Particle command line tool. Follow these instructions to install the tools.
Navigate to the directory where your Photon2 firmware downloaded and decompress the zip file. Open a terminal and use the following command to flash your device:
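As with the Boron, a typical Particle CLI invocation looks like this (the .bin filename is an example, use the binary from your downloaded archive):

    particle flash --usb firmware.bin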
Before starting ingestion create an Edge Impulse account if you haven't already. Also, be sure to setup the device per the instructions above.
To collect data from the Photon 2 please follow these steps:
Create a new Edge Impulse Studio project, remember the name you create for it.
Connect your device to the Edge Impulse Studio by running the following command in a terminal:
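The Edge Impulse CLI daemon handles the connection; the --api-key alternative is described below:

    edge-impulse-daemon --clean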
After connecting, the Edge Impulse daemon will ask you to log in to your account and select a project. Alternatively, you can copy the API key from the Keys section of your project and use the --api-key flag instead of --clean.
Start gathering data by clicking on Data acquisition
With everything set up you can now build your first machine learning model with these tutorials:
Looking to connect different devices or sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
If you choose to deploy your project as a Particle Library rather than a binary, follow these steps to flash your firmware from Particle Workbench:
Open a new VS Code window, ensure that Particle Workbench has been installed (see above)
Use VS Code Command Palette and type in Particle: Import Project
Select the project.properties
file in the directory that you just downloaded and extracted from the section above.
Use VS Code Command Palette and type in Particle: Configure Project for Device
Select deviceOS@5.5.0
Choose a target (e.g. P2; this option is also used for the Photon 2).
You may sometimes need to manually put your device into DFU mode. You may proceed to the next step, but if you get an error indicating "No DFU capable USB device available", then follow these steps:
Hold down both the RESET and MODE buttons.
Release only the RESET button, while holding the MODE button.
Wait for the LED to start flashing yellow.
Release the MODE button.
Compile and Flash in one command with: Particle: Flash application & DeviceOS (local)
Local Compile Only! At this time you cannot use the Particle: Cloud Compile or Particle: Cloud Flash options; local compilation is required.
The following video demonstrates how to collect raw data from an accelerometer and develop an application around the Edge Impulse inferencing library with the Photon 2.
If you would like to use a Particle webhook to send training data from your Particle board directly to Edge Impulse, or indeed any of our other APIs, follow these steps:
Access Particle Console:
Visit Particle Console.
Log in with your Particle account credentials.
Navigate to Integrations:
Click on the "Integrations" tab in the left-hand menu.
Select "Webhooks" from the available options.
Create a New Webhook:
Click "New Integration".
Choose "Webhook".
Webhook Configuration:
Name: Assign a descriptive name to your webhook.
Event Name: Specify the event name that triggers the webhook (e.g., "edge/ingest").
URL: Set this to the Edge Impulse ingestion API URL, typically something like https://ingestion.edgeimpulse.com/api/training/data
.
Request Type: Choose "POST".
Request Format: Select "Custom".
Custom Request Body:
Input the JSON structure required by Edge Impulse. This will vary based on your project's data schema.
HTTP Headers:
Add necessary headers:
x-api-key
: Your Edge Impulse API key.
Content-Type
: "application/json".
x-file-name
: Use a dynamic data field like {{PARTICLE_EVENT_NAME}}
.
Advanced Settings:
Response Topic: Create a custom topic for webhook responses, e.g., {{PARTICLE_DEVICE_ID}}/hook-response/{{PARTICLE_EVENT_NAME}}
.
Enforce SSL: Choose "Yes" for secure transmission.
Save the Webhook:
After entering all details, click "Save".
Test the Webhook:
Use example device firmware to trigger the webhook.
Observe the responses in the Particle Console.
Debugging:
If errors occur, review the logs for detailed information.
Ensure payload format aligns with Edge Impulse requirements.
Verify the accuracy of your API key and other details.
If you have any trouble with the config you can copy and paste the following into the Custom Template section of the webhook:
The Photon 2 is capable of OTA and updating to the latest model in your Edge Impulse project. Follow this example that shows how to deploy updated impulses over-the-air (OTA) using the Particle Workbench.
Should you have any issues with your Particle device please review Particle's Support & Troubleshooting page.
If you have issues with Edge Impulse please reach out!
Community board
This is a community board by Seeed Studio, and it's not maintained by Edge Impulse. For support, head to the Seeed Studio forum.
The Seeed Wio Terminal is a development board from Seeed Studio with a Cortex-M4 microcontroller, motion sensors, an LCD display, and Grove connectors to easily connect external sensors. Seeed Studio has added support for this development board to Edge Impulse, so you can sample raw data and build machine learning models from the studio.
With the impulse designed, trained and verified you can deploy this model back to your Wio Terminal. This makes the model run without an internet connection, minimizes latency, and runs with minimum power consumption. Edge Impulse can package the complete impulse - including the signal processing code, neural network weights, and classification code - up into a single library that you can run on your development board.
ESP-NN Conflict Workaround
With the recent addition of ESP-NN acceleration, the Wio Terminal will attempt to build the ESP-NN files in the Arduino library, which results in several errors during linking. While we work on a permanent solution, remove the ESP-NN folder to compile Wio Terminal applications with the Edge Impulse SDK.
If you see an error like the following in Arduino, it means the Wio Terminal build process is attempting to link the ESP-NN library (which is not supported):
While we work on a permanent solution, the workaround is to remove the ESP-NN/ folder found in Arduino/libraries/<ei-project-name>/src/edge-impulse-sdk/porting/espressif/
Community board
This is a community board by Seeed Studio, and it's not maintained by Edge Impulse. For support, head to the Seeed Studio forum.
The Seeed Studio XIAO nRF52840 Sense incorporates the Nordic nRF52840 chip with an FPU, operating at up to 64 MHz, offers multiple development ports, carries Bluetooth 5.0 wireless capability, and can operate with low power consumption. Featuring an onboard IMU and PDM microphone, it can be your best tool for embedded machine learning projects. Seeed Studio has added support for this development board to Edge Impulse, so you can sample raw data and build machine learning models from the studio.
With the impulse designed, trained and verified you can deploy this model back to your XIAO nRF52840 Sense. This makes the model run without an internet connection, minimizes latency, and runs with minimum power consumption. Edge Impulse can package the complete impulse - including the signal processing code, neural network weights, and classification code - up into a single library that you can run on your development board.
Community board
This is a community board by Seeed Studio, and it's not maintained by Edge Impulse. For support, head to the Seeed Studio forum.
The Seeed Studio XIAO ESP32S3 Sense is a powerful development board that utilizes the dual-core ESP32S3 chip, featuring an Xtensa processor running at speeds of up to 240 MHz. This board offers support for both 2.4GHz Wi-Fi and Bluetooth Low Energy (BLE) connectivity. Additionally, it is equipped with a detachable OV2640 camera sensor, boasting an impressive resolution of 1600*1200, and a digital microphone.
With 8MB of PSRAM and 8MB of FLASH, as well as an external SD card slot, the XIAO ESP32S3 Sense provides ample memory and storage capacity, making it well-suited for embedded machine learning (ML) applications.
Seeed Studio has integrated support for this development board into Edge Impulse, allowing users to sample raw data and build machine learning models directly from the studio. This integration simplifies the process of leveraging the XIAO ESP32S3 Sense for ML projects.
With its impressive specifications and Edge Impulse compatibility, the XIAO ESP32S3 Sense is an excellent choice for developers seeking to explore embedded machine learning applications.
With the impulse designed, trained and verified you can deploy this model back to your XIAO ESP32S3 Sense. This makes the model run without an internet connection, minimizes latency, and runs with minimum power consumption. Edge Impulse can package the complete impulse - including the signal processing code, neural network weights, and classification code - up into a single library that you can run on your development board.
The Renesas CK-RA6M5, Cloud Kit for RA6M5 MCU Group, enables users to experience the cloud connectivity options available from Renesas and Renesas Partners. A broad array of sensors on the CK-RA6M5 provide multiple options for observing user interaction with the Cloud Kit. By selecting from a choice of add-on devices, multiple cloud connectivity options are available.
The Edge Impulse firmware for this development board is open source and hosted on GitHub: .
An earlier prototype version of the Renesas CK-RA6M5 Cloud Kit is also supported. The layout of this earlier prototype version is available .
To set this device up in Edge Impulse, you will need to install the following software:
On Linux:
GNU Screen: install for example via sudo apt install screen
.
Problems installing the CLI?
Edge Impulse Studio can collect data directly from your CK-RA6M5 Cloud Kit and also help you trigger in-system inferences to debug your model, but in order to allow Edge Impulse Studio to interact with your CK-RA6M5 Cloud Kit you first need to flash it with our base firmware image.
Check that:
J22 is set to link pins 2-3
J21 link is closed
J16 Link is open
Connect J14 and J20 on the CK-RA6M5 board to USB ports on the host PC using the 2 micro USB cables supplied.
Power LED (LED6) on the CK-RA6M5 board lights up white, indicating that the CK-RA6M5 board is powered on.
If the CK-RA6M5 board is not powered through the Debug port (J14) the current available to the board may be limited to 100 mA.
Open the flash script for your operating system (flash_windows.bat
, flash_mac.command
or flash_linux.sh
) to flash the firmware.
From a command prompt or terminal, run:
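Here, as elsewhere in this guide, the command is the Edge Impulse CLI device daemon:

    edge-impulse-daemon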
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. If you want to switch projects run the command with --clean
.
With everything set up you can now build your first machine learning model with these tutorials:
The RA8 is the first Cortex-M85 microcontroller on the market. The EK-RA8D1, an Evaluation Kit for the RA8D1 MCU Group, enables users to seamlessly evaluate the features of the RA8D1 MCU group and develop embedded systems applications using the Renesas Flexible Software Package (FSP) and the e2 studio IDE. Users can use rich on-board features along with their choice of popular ecosystem add-ons to bring their big ideas to life.
The evaluation kit comes with a MIPI graphics expansion board mounted with an LCD display and a camera expansion board mounted with the OV3640 CSI camera. The kit can be assembled as follows:
This kit is put together primarily for image based applications. This document will get you started so you can create your own image based applications using Edge Impulse.
The RA8D1 supports all of Edge Impulse’s device features, including ingestion, remote management and inferencing. To set the device up in Edge Impulse, you will need to install the following software:
Problems installing the CLI?
Edge Impulse Studio can collect image data directly from your EK-RA8D1 and also help you trigger in-system inferences to debug your model. In order to allow Edge Impulse Studio to interact with your device, you first need to flash it with our base firmware image.
To flash the board, you need to connect to the debug port J10:
The LCD screen will turn on and display the home screen as shown.
Open the flash script for your operating system (flash_win.bat
, flash_mac.command
or flash_linux.sh
) to flash the firmware. Once you flash the firmware, the display will switch to show the default Edge Impulse Face Detection using FOMO project.
Now you are ready to connect the RA8D1 to the studio and create your own project.
To connect to the Edge Impulse Studio, you need to connect to the USB Full Speed port J11. Please make sure that the jumpers J12 and J15 are in the correct position (J12 in position 2-3 and J15 connected). The correct configuration is shown in the image below:
It is important to remember that to run inference, to collect data from the RA8D1, and use the Edge Impulse CLI tools, you have to connect via port J11. Port J10 is only used for flashing the firmware.
Note that it is safe to connect two cables to your board at ports J10 and J11 simultaneously. Then you can flash it and run inference without having to change ports.
After connecting to port J11, from a command prompt or terminal, run:
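The command is the Edge Impulse CLI device daemon:

    edge-impulse-daemon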
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. If you want to switch projects run the command with --clean
.
Once the device is connected, you can proceed to the data acquisition tab and start collecting your image data directly from the device.
With everything set up you can now build your first image based machine learning model with these tutorials:
The RP2040 is the debut microcontroller from Raspberry Pi - and it's fully supported by Edge Impulse. You'll be able to sample raw data, build models, and deploy trained machine learning models directly from the studio. It's available for around $4 from the Raspberry Pi Foundation and a wide range of distributors.
To get started with the Raspberry Pi RP2040 and Edge Impulse you'll need:
An RP2040-based board. The pre-built firmware and the Edge Impulse Studio exported binary are tailored for the Raspberry Pi Pico, but with a few simple steps you can collect data and run your models on other RP2040-based boards, such as the Arducam Pico4ML or the Seeed Studio XIAO RP2040. For more details, check out the notes on other RP2040-based boards below.
(Optional) If you are using the Raspberry Pi Pico, the Grove Shield for Pi Pico makes it easier to connect external sensors for data collection/inference.
The Edge Impulse firmware for this development board is open source and hosted on GitHub: .
To set this device up in Edge Impulse, you will need to install the following software:
If you'd like to interact with the board using a set of pre-defined AT commands (not necessary for the standard ML workflow), you will also need to install a serial communication program, for example minicom or picocom, or use the Serial Monitor from the Arduino IDE (if installed).
On Linux:
GNU Screen: install for example via sudo apt install screen
.
Problems installing the CLI?
With all the software in place, it's time to connect the development board to Edge Impulse.
Use a micro-USB cable to connect the development board to your computer while holding down the BOOTSEL button, forcing the Raspberry Pi Pico into USB Mass Storage Mode.
The development board does not come with the right firmware yet. To update the firmware:
Drag the ei_rp2040_firmware.uf2
file from the folder to the USB Mass Storage device.
Wait until flashing is complete, unplug and replug in your board to launch the new firmware.
From a command prompt or terminal, run:
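The command is the Edge Impulse CLI device daemon:

    edge-impulse-daemon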
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. If you want to switch projects run the command with --clean
.
With everything set up you can now build your first machine learning model. Since the Raspberry Pi Pico does not have any built-in sensors, we decided to support the following ones out of the box with the pre-built firmware:
Analog signal sensor (pin A0).
Once you have the compatible sensors, you can then follow these tutorials:
Support for Arduino RP2040 Connect was added to the official RP2040 firmware for Edge Impulse. That includes data acquisition and model inference support for:
onboard MP34DT05 microphone
onboard ST LSM6DSOX 6-axis IMU
the sensors described above can still be connected
While the RP2040 is a relatively new microcontroller, it has already been used to build several boards:
The official Raspberry Pi Pico RP2040
Arducam Pico4ML (Camera, screen and microphone)
Seeed Studio XIAO RP2040 (extremely small footprint)
Black Adafruit Feather RP2040 (built-in LiPoly charger)
And others. While the pre-built Edge Impulse firmware is mainly tested with the Pico board, it is compatible with other boards, with the exception of I2C sensors and the microphone: different boards use different pins for peripherals, so if you'd like to use the LSM6DS3/LSM6DSOX accelerometer & gyroscope modules or the microphone, you will need to change the pin values in the Edge Impulse RP2040 firmware source code, recompile it and upload it to the board.
Open the app and login with your edgeimpulse.com credentials:
Select your Thingy:53 project from the drop-down menu at the top:
ADXL345 | Boron |
---|---|
ADXL362 | Photon 2 |
---|---|
PDM Mic | Photon 2 |
---|---|
Open your Edge Impulse Studio Project and click on Devices. Verify that your device is listed here.
To set up your Seeed Wio Terminal, follow this guide: .
With everything set up you can now build your first machine learning model with this full end-to-end course from Seeed's EDU team: .
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
The easiest way to deploy your impulse to the Seeed Wio Terminal is via an Arduino library. See for more information.
To set up your Seeed Studio XIAO nRF52840 Sense, follow this guide: .
With everything set up you can now build your first machine learning model: .
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
The easiest way to deploy your impulse to the Seeed XIAO nRF52840 Sense is via an Arduino library. See for more information.
To set up your Seeed XIAO ESP32S3 Sense, follow this guide:
With everything set up you can now build your first machine learning model: .
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
The easiest way to deploy your impulse to the XIAO ESP32S3 Sense is via an Arduino library. See for more information.
.
See the Installation and troubleshooting guide.
Download the latest Edge Impulse firmware, and unzip the file, then locate the flash-script folder included, which we will be using in the following steps.
An earlier prototype version of the Renesas CK-RA6M5 Cloud Kit required a USB to Serial interface as shown . This is no longer the case.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices on the left sidebar. The device will be listed there:
.
.
.
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
.
See the Installation and troubleshooting guide.
Download the latest Edge Impulse firmware, and unzip the file. Within this folder, you will find several flashing scripts for different operating systems (macOS, Linux and Windows). Locate the file for your OS, and follow the next steps.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices on the left sidebar. The device will be listed there:
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
You can now also take advantage of NVIDIA TAO models in your machine learning applications for improved performance. TAO models typically occupy more space than is internally available on the RA8. For this reason, flashing the RA8D1 with a TAO model is slightly more involved than simply downloading the firmware. For instructions on how to accomplish this, please see the detailed tutorial on using your RA8D1 with TAO models.
.
.
.
See the Installation and troubleshooting guide.
Download the latest Edge Impulse firmware, and unzip the file.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
(GP16; pin D16 on Grove Shield for Pi Pico).
(GP18; pin D18 on Grove Shield for Pi Pico).
(I2C0).
There is a vast variety of analog signal sensors that can take advantage of the RP2040's 12-bit ADC (Analog-to-Digital Converter), from common ones, such as light sensors and sound level sensors, to more specialized ones.
.
.
.
.
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
ADXL345 | Boron
---|---
VCC | 3V3
GND | GND
SCL | SCL
SDA | SDA
CS | VCC

ADXL362 | Photon 2
---|---
VCC | 3V3
GND | GND
CS | D13/A2
SCK | D17
MISO (SDO) | D16 (MISO)
MOSI (SDA) | D15 (MOSI)
INT1 | not connected
INT2 | not connected

PDM Mic | Photon 2
---|---
VCC | 3V3
GND | GND
SEL | Not connected
CLK | A0
DAT | A1
The ST IoT Discovery Kit (also known as the B-L475E-IOT01A) is a development board with a Cortex-M4 microcontroller, MEMS motion sensors, a microphone and WiFi - and it's fully supported by Edge Impulse. You'll be able to sample raw data, build models, and deploy trained machine learning models directly from the studio.
The Edge Impulse firmware for this development board is open source and hosted on GitHub: edgeimpulse/firmware-st-b-l475e-iot01a.
Two variants of this board
There are two variants of this board, the B-L475E-IOT01A1 (US region) and the B-L475E-IOT01A2 (EU region) - the only difference is the sub-GHz radio. Both are usable in Edge Impulse.
To set this device up in Edge Impulse, you will need to install the following software:
On Windows:
ST Link - drivers for the development board. Run dpinst_amd64
on 64-bits Windows, or dpinst_x86
on 32-bits Windows.
On Linux:
GNU Screen: install for example via sudo apt install screen
.
Problems installing the CLI?
See the Installation and troubleshooting guide.
With all the software in place it's time to connect the development board to Edge Impulse.
Use a micro-USB cable to connect the development board to your computer. There are two USB ports on the development board, use the one the furthest from the buttons.
The development board does not come with the right firmware yet. To update the firmware:
The development board is mounted as a USB mass-storage device (like a USB flash drive), with the name DIS_L4IOT
. Make sure you can see this drive.
Drag the DISCO-L475VG-IOT01A.bin
file to the DIS_L4IOT
drive.
Wait until the LED stops flashing red and green.
From a command prompt or terminal, run:
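The command is the Edge Impulse CLI device daemon:

    edge-impulse-daemon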
This will start a wizard which will ask you to log in, choose an Edge Impulse project, and set up your WiFi network. If you want to switch projects run the command with --clean
.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
With everything set up you can now build your first machine learning model with these tutorials:
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
If you experience the following error when attempting to connect to a WiFi network:
You have hit a known issue with the firmware for this development board's WiFi module that results in a timeout during network scanning if there are more than 20 WiFi access points detected. If you are experiencing this issue, you can work around it by attempting to reduce the number of access points within range of the device, or by skipping WiFi configuration.
If the LED does not flash red and green when you copy the .bin
file to the device and instead is a solid red color, and you are unable to connect the device with Edge Impulse, there may be an issue with your device's native firmware.
To restore functionality, use the following tool from ST to update your board to the latest version:
You might need to set up udev rules on Linux before being able to talk to the device. Create a file named /etc/udev/rules.d/50-stlink.rules
and add the following content:
Then unplug the development board and plug it back in.
Sony's Spresense is a small but powerful development board with a 6-core Cortex-M4F microcontroller and integrated GPS, plus a wide variety of add-on modules, including an extension board with a headphone jack, SD card slot and microphone pins, a camera board, a sensor board with accelerometer, pressure, and geomagnetism sensors, and a Wi-Fi board - and it's fully supported by Edge Impulse. You'll be able to sample raw data, build models, and deploy trained machine learning models directly from the studio.
To get started with the Sony Spresense and Edge Impulse you'll need:
The Spresense Main Development board.
The Spresense Extension Board - to connect external sensors.
A micro-SD card to store samples - this is a necessary add-on as the board will not be able to operate without storing samples.
In addition, you'll want some sensors, these ones are fully supported (note that you can collect data from any sensor on the Spresense with the data forwarder):
For image models: the Spresense CXD5602PWBCAM1 camera add-on or the Spresense CXD5602PWBCAM2W HDR camera add-on.
For accelerometer models: the Spresense Sensor EVK-70 add-on.
For audio models: an electret microphone and a 2.2K Ohm resistor, wired to the extension board's audio channel A, following this schema (picture here).
Note: for audio models you must also have a FAT formatted SD card for the extension board, with the Spresense's DSP files included in a BIN
folder on the card, see instructions here and a screenshot of the SD card directory here.
For other sensor models: see below for SensiEDGE CommonSense support.
The Edge Impulse firmware for this development board is open source and hosted on GitHub: edgeimpulse/firmware-sony-spresense.
To set this device up in Edge Impulse, you will need to install the following software:
On Linux:
GNU Screen: install for example via sudo apt install screen
.
Problems installing the CLI?
See the Installation and troubleshooting guide.
With all the software in place it's time to connect the development board to Edge Impulse.
An SD card is necessary to use the Spresense. Make sure it is formatted in FAT format before inserting it into the Spresense.
Use a micro-USB cable to connect the main development board (not the extension board) to your computer.
The development board does not come with the right firmware yet. To update the firmware:
Install Python 3.7 or higher.
Download the latest Edge Impulse firmware, and unzip the file.
Open the flash script for your operating system (flash_windows.bat
, flash_mac.command
or flash_linux.sh
) to flash the firmware.
Wait until flashing is complete. The on-board LEDs should stop blinking to indicate that the new firmware is running.
From a command prompt or terminal, run:
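The command is again the Edge Impulse CLI device daemon:

    edge-impulse-daemon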
Mac: Device choice
If you have a choice of serial ports and are not sure which one to use, pick /dev/tty.SLAB_USBtoUART or /dev/cu.usbserial-*
This will start a wizard which will ask you to log in and choose an Edge Impulse project. If you want to switch projects run the command with --clean
.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
Edge Impulse has partnered with SensiEdge to add support for sensor fusion applications to the Sony Spresense by integrating the SensiEDGE CommonSense sensor extension board. The CommonSense comes with a wide array of sensor functionalities that connect seamlessly to the Spresense and the Edge Impulse studio. In addition to the Sony Spresense, the Spresense extension board and a micro-SD card, you will need the CommonSense board which is available to purchase on Mouser.
Connect the Sony Spresense extension board to the Sony Spresense ensuring that the micro-SD card is loaded. Connect the SensiEDGE CommonSense in the orientation shown below - with the connection ports facing the same direction. The HD camera is optional but can be attached if you want to create an image based application.
Once the boards are connected, start the Edge Impulse daemon from a command prompt or terminal:
This starts a wizard which asks you to log in and choose an Edge Impulse project. If you want to switch projects run the command with --clean
.
If prompted to select a device, choose commonsense
:
Verify that the device is connected by going to the Devices tab in your project and checking for the green light as mentioned in the steps above.
Once your device is connected, you are now ready to collect data directly from your CommonSense board and start creating your machine learning application.
If you want to reset the firmware to the default Sony-CommonSense firmware, you can download it here, flash your Sony Spresense and be ready to start again.
With everything set up you can now build your first machine learning model with these tutorials:
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
If you see an error mentioning pyserial, upgrade pyserial, for example with pip:
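pip3 install --upgrade pyserial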
If the edge-impulse-daemon or edge-impulse-run-impulse commands do not start, it might be because of an error interacting with the SD card or because your board has an old version of the bootloader. To see the debug logs, run:
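For example, you can attach a serial terminal such as GNU Screen to the board's serial port (the port name and the 115200 baud rate here are typical values; adjust them for your setup):

screen /dev/tty.SLAB_USBtoUART 115200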
And press the RESET button on the board. If you see Welcome to nash, you'll need to update the bootloader. To do so:
Install and launch the Arduino IDE.
Go to Preferences and under 'Additional Boards Manager URLs' add https://github.com/sonydevworld/spresense-arduino-compatible/releases/download/generic/package_spresense_index.json (if there's already text in this text box, add a comma before the new URL).
Then go to Tools > Boards > Board manager, search for 'Spresense' and click Install.
Select the right board via: Tools > Boards > Spresense boards > Spresense.
Select your serial port via Tools > Port, choosing the serial port for the Spresense board.
Select the Spresense programmer via: Tools > Programmer > Spresense firmware updater.
Update the bootloader via Tools > Burn bootloader.
Then update the firmware again (from step 3: Update the bootloader and the firmware).
The Silicon Labs Thunderboard Sense 2 is a complete development board with a Cortex-M4 microcontroller, a wide variety of sensors, a microphone, Bluetooth Low Energy and a battery holder - and it's fully supported by Edge Impulse. You'll be able to sample raw data, build models, and deploy trained machine learning models directly from the studio - and even stream your machine learning results over BLE to a phone.
This board is not recommended for new designs. For a replacement, see the EFR32xG24 Dev Kit, which is also fully supported by Edge Impulse.
The Edge Impulse firmware for this development board is open source and hosted on GitHub: edgeimpulse/firmware-silabs-thunderboard-sense-2.
To set this device up in Edge Impulse, you will need to install the following software:
On Linux:
GNU Screen: install for example via sudo apt install screen.
Problems installing the CLI?
See the Installation and troubleshooting guide.
With all the software in place it's time to connect the development board to Edge Impulse.
Use a micro-USB cable to connect the development board to your computer. The development board should mount as a USB mass-storage device (like a USB flash drive), with the name TB004. Make sure you can see this drive.
The development board does not come with the right firmware yet. To update the firmware:
Drag the silabs-thunderboard-sense2.bin file to the TB004 drive.
Wait 30 seconds.
From a command prompt or terminal, run:
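edge-impulse-daemon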
This will start a wizard which will ask you to log in and choose an Edge Impulse project. If you want to switch projects run the command with --clean.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
Our firmware is equipped with a simple BLE demo showing how to start/stop the inference over BLE and acquire the results.
To use the demo, first install the EFR Connect BLE Mobile App on your mobile phone:
Make sure your board is flashed with a pre-built binary. Power on the board and run the EFR Connect BLE Mobile App
Scan your neighborhood for BLE devices
Look for the device named Edge Impulse and tap Connect
Scroll down to Unknown service with UUID DDA4D145-FC52-4705-BB93-DD1F295AA522 and select More Info
Select Write for the characteristic with UUID 02AA6D7D-23B4-4C84-AF76-98A7699F7FE2
In the Hex field enter 01 and press Send. This will start inferencing; the device's LEDs should start blinking.
For the characteristic with UUID 61A885A4-41C3-60D0-9A53-6D652A70D29C, enable Notify and observe the reported inference results.
To stop the inference, send 00 to the characteristic with UUID 02AA6D7D-23B4-4C84-AF76-98A7699F7FE2.
With everything set up you can now build your first machine learning model with these tutorials:
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
Did you know? You can also stream the results of your impulse over BLE to a nearby phone or gateway: see Streaming results over BLE to your phone.
When dragging and dropping an Edge Impulse pre-built .bin firmware file, the binary seems to flash, but when the device reconnects, a FAIL.TXT file appears with the contents "Error while connecting to CPU" and the following errors appear from the Edge Impulse CLI impulse runner:
To fix this error, install the Simplicity Studio 5 IDE and flash the binary through the IDE's built-in "Upload application..." menu under "Debug Adapters", and select your Edge Impulse firmware to flash:
Your Edge Impulse inferencing application should then run successfully with edge-impulse-run-impulse.
The Texas Instruments CC1352P Launchpad is a development board equipped with the multiprotocol wireless CC1352P microcontroller. The Launchpad, when paired with the BOOSTXL-SENSORS booster packs, is fully supported by Edge Impulse, and is able to sample accelerometer & microphone data, build models, and deploy directly to the device without any programming required.
If you don't have either booster pack or are using different sensing hardware, you can use the Data forwarder to capture data from any other sensor type, and then follow the Running your impulse locally tutorial to run your impulse. Or, you can clone and modify the open source firmware-ti-launchxl project on GitHub.
The Edge Impulse firmware for this development board is open source and hosted on GitHub: edgeimpulse/firmware-ti-launchxl.
To set this device up in Edge Impulse, you will need to install the following software:
Install the desktop version for your operating system here
Add the installation directory to your PATH
See Troubleshooting for more details
On Linux:
GNU Screen: install for example via sudo apt install screen.
Problems installing the Edge Impulse CLI?
See the Installation and troubleshooting guide.
With all the software in place it's time to connect the development board to Edge Impulse.
To interface the Launchpad with sensor hardware, you will need to connect either the BOOSTXL-SENSORS to collect accelerometer data, or the CC3200AUDBOOST to collect audio data. Follow the guides below based on what data you want to collect.
Before you start
The Launchpad jumper connections should be in their original configuration out of the box. If you have already modified the jumper connections, see the Launchpad's User Guide for the original configuration.
2. Connect the development board to your computer
Use a micro-USB cable to connect the development board to your computer.
3. Update the firmware
The development board does not come with the right firmware yet. To update the firmware:
Download the latest Edge Impulse firmware, and unzip the file.
Open the flash script for your operating system (flash_windows.bat, flash_mac.command or flash_linux.sh) to flash the firmware.
Wait until flashing is complete, and press the RESET button once to launch the new firmware.
Problems flashing firmware onto the Launchpad?
See the Troubleshooting section for more information.
4. Setting keys
From a command prompt or terminal, run:
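edge-impulse-daemon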
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. If you want to switch projects run the command with --clean.
Which device do you want to connect to?
The Launchpad enumerates two serial ports. The first is the Application/User UART, which the edge-impulse firmware communicates through. The other is an Auxiliary Data Port, which is unused.
When running the edge-impulse-daemon you will be prompted on which serial port to connect to. On Mac & Linux, this will appear as:
Generally, select the lower numbered serial port. This usually corresponds with the Application/User UART. On Windows, the serial port may also be verified in the Device Manager. If the selected serial port fails to connect, test the other port before checking the Troubleshooting section for other common issues.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
5. Verifying that the device is connected
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
With everything set up you can now build and run your first machine learning model with these tutorials:
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse, and you can run your impulse locally with custom firmware or sensor data.
Failed to flash
If the UniFlash CLI is not added to your PATH, the install scripts will fail. To fix this, add the installation directory of UniFlash (for example /Applications/ti/uniflash_6.4.0 on macOS) to your PATH:
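For example, on macOS or Linux you could append something like the following to your shell profile (adjust the directory to match your UniFlash installation):

export PATH=$PATH:/Applications/ti/uniflash_6.4.0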
If during flashing you encounter further issues, ensure:
The device is properly connected and/or the cable is not damaged.
You have the proper permissions to access the USB device and run scripts. On macOS you can manually approve blocked scripts via System Preferences -> Security Settings -> Unlock Icon.
On Linux, you may want to try copying tools/71-ti-permissions.rules to /etc/udev/rules.d/. Then re-attach the USB cable and try again.
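For example, from the folder that contains the tools directory:

sudo cp tools/71-ti-permissions.rules /etc/udev/rules.d/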
Alternatively, the gcc/build/edge-impulse-standalone.out binary file may be flashed to the Launchpad using the UniFlash GUI or web-app. See the Texas Instruments Quick Start Guide for more info.