RASynBoard is a tiny (25mm x 30mm), ultra-low-power, edge AI/ML board based on a Syntiant® NDP120 Neural Decision Processor™ (NDP), a Renesas RA6M4 host MCU, and a power-efficient DA16600 Wi-Fi/BT combo module. The NDP120 subsystem, with on-board digital microphone, IMU motion sensor and SPI Flash memory, achieves highly efficient processing of acoustic and motion events. Battery and USB-C device connectors facilitate standalone use, while a compact under-board connector enables integration with custom OEM boards and additional sensors.
An IO board (50mm x 30mm) is included for implementation of a compact two-board evaluation kit assembly. It pins out a subset of the NDP120 and RA6M4 I/Os to popular Pmod, Click header and expansion header footprints, enabling connection with additional external microphones and sensor options. An onboard debugger MCU (SWD and UART interfaces), button switches, RGB LED and removable MicroSD storage further maximize prototyping versatility and utility.
NDP120 AI/ML models for popular use-cases (pre-engineered by Syntiant® and other vendors) are loaded from local SPI Flash storage for efficient execution on the ultra-low power NDP120 neural accelerator device.
RA6M4 MCU application software development and debug are supported via the Renesas e2 Studio IDE, interfaced via the E2OB debugger MCU on the IO board.
Key Features
Accelerated Edge-AI and ML applications
Battery-powered remote sensor systems
Industrial smart sensors
Motor predictive maintenance
Always-on speech recognition and sensor fusion processing
Getting Started Guides may be found in Avnet's GitHub repositories:
To set this device up in Edge Impulse, you will need to install the following software. The Renesas software will require registration for a Renesas account.
Download the Edge Impulse RASynBoard firmware for audio or IMU below and connect the USB cable to your computer:
Follow the installation instructions included in the .zip archives.
After flashing the MCU per the RASynBoard Development Guide instructions, please reconnect the Avnet RASynBoard directly to your computer's USB port. Linux, Mac OS, and Windows platforms are supported. From a command prompt or terminal, run:
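The command to run is the Edge Impulse serial daemon (a minimal sketch, assuming the edge-impulse-cli package from the prerequisites is installed):

```shell
# Start the serial daemon; it walks you through login and project selection:
edge-impulse-daemon
```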
This will start a wizard which will ask you to log in and choose an Edge Impulse project. If you want to switch projects, run the command with --clean.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
Use syntiant compatible pre-processing blocks
The Avnet RASynBoard is based on the Syntiant NDP120 Neural Decision Processor™ and needs to use dedicated Syntiant DSP blocks.
With everything set up you can now build your first machine learning model and evaluate it using one of these tutorials:
How to label my classes? The NDP chip expects one, and only one, negative class, and it must be the last in the list. For instance, if your original dataset contains the labels yes, no, unknown, noise, and you only want to detect the keywords 'yes' and 'no', merge the 'unknown' and 'noise' labels into a single class such as z_openset (we prefix it with 'z' so that this class sorts last in the list).
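If you work with a locally exported dataset, one way to merge classes is to re-prefix the sample filenames before re-uploading. This is a hypothetical sketch: it assumes the `<label>.<id>.<ext>` naming convention, and the `dataset/` path and file names are placeholders.

```shell
# Hypothetical local relabeling sketch: the uploader infers the label from the
# part of the filename before the first dot, so re-prefixing sample files
# merges them into one class. Paths and names here are placeholders.
mkdir -p dataset
touch dataset/unknown.1.json dataset/noise.1.json   # stand-ins for real samples

for f in dataset/unknown.*.json dataset/noise.*.json; do
  mv "$f" "dataset/z_openset.$(basename "$f")"
done

ls dataset
```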
RFP Error (E3000107): "This device does not match the connection parameters." If you encounter this error while programming the Renesas device on the RASynBoard, please follow this workaround.
WiseEye™ seamlessly integrates the Himax proprietary ultralow power AI processors, always-on CMOS image sensors, and advanced CNN-based AI algorithms, revolutionizing battery-powered, on-device vision AI applications. With power consumption of just a few milliwatts, WiseEye™ targets battery-powered endpoint AI device markets to drive AI for everyday life. Such devices typically demand extended battery life to minimize maintenance and enhance usability. WiseEye™ delivers intuitive and intelligent user interactions, making advanced AI sensing possible even in power-constrained environments. By bringing advanced, user-friendly AI capabilities, WiseEye™ sets a new standard for endpoint AI, offering unmatched performance and extended operational lifetimes.
Quick links access:
Information about Himax WiseEye2 module and ISM board
To set this board up in Edge Impulse, you will need to install the following software:
Start with an x64-based Windows image and install git
Clone the Himax WiseEye-Module-G1 SDK:
Follow the SDK setup instructions to set up and prepare the Himax WiseEye-Module-G1 SDK for use.
Follow the setup instructions in the EVK and PC Tool User Guide to set up and prepare the WE2_DEMO_TOOL for flashing.
Now that your Himax WiseEye2 ISM Devboard is ready and you've configured the WE2_DEMO_TOOL, let's proceed to install the Edge Impulse dependencies (you will typically do this on a Linux or macOS platform). These will be used to RUN the Edge Impulse model once it has been flashed from the Windows platform.
To set this board up in Edge Impulse, you will need to install the following software - typically on a Linux- or macOS-based system.
Please install the "edge-impulse-cli" package. Full documentation on installing the Edge Impulse CLI can be found here: Edge Impulse CLI
Problems installing the Edge Impulse CLI?
See the Installation and troubleshooting guide.
Next, we head to the Edge Impulse Studio to build our ML "impulse".
First, let's build and run our first machine learning model with these tutorials:
For the Himax WiseEye2 ISM Devboard, choose "Himax ISM" as the Target type in Edge Impulse. When performing the "Deployment" step, also select "Himax ISM" as the deployment platform target. Ensure that you create your impulse/model with "Int8 Profiling" enabled, and select the "Quantized int8" checkbox when you perform the model deployment.
You can utilize the Himax WiseEye2 ISM Devboard itself to help with image capturing/data collection for your project by connecting your ISM Devboard to your development platform and then run the "edge-impulse-daemon" as follows (this can be done on Linux/MacOS or Windows if you have the "edge-impulse-cli" package installed):
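A minimal invocation (assuming the edge-impulse-cli package is on your PATH):

```shell
# Connect the ISM Devboard to your Edge Impulse project:
edge-impulse-daemon
```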
When launched, you will be prompted to log into your Edge Impulse account, select a project, select the associated USB port that the ISM Devboard is connected to, and finally give the device a name. You can then look in your Edge Impulse "Devices" tab to see the device by going to your Edge Impulse project, and click Devices. The device will be listed here:
You can then select your device, within Edge Impulse Studio, to use the camera/sensors to capture data for your project's data.
When the deployment is complete, you will receive a zip file that will contain two files:
firmware.img - the OTA image you will use to publish to your ISM Devboard via the "ota.exe" tool you reviewed above.
readme.txt - a text file with a link back to this page, to review the steps if needed.
We'll take the "firmware.img" file and proceed to the next step.
You will next run the WE2_DEMO_TOOL on Windows:
Select "Burn Flash", then press "Select File" to browse to the directory where you placed the Edge Impulse contents (namely firmware.img and readme.txt from above). Select that directory and the "firmware.img" file:
We then press the "Start" button and allow the flashing process to complete:
You can now disconnect the board and proceed to the Linux/MacOS platform to run the model in the next step.
To run the model on your ISM Devboard now that the flashing has finished, you plug in the board via USB and then run the following in a bash shell:
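A typical invocation, assuming the standard Edge Impulse CLI is installed (the exact command is not shown in this guide):

```shell
# Stream live inference results from the connected board:
edge-impulse-run-impulse
```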
After logging in and selecting the appropriate USB port that represents your board, you will see your model's inference output displayed as data arrives (images captured, etc.).
Alternatively, you can connect directly to the USB serial port and then directly interact with the AT command interpreter that is running the Edge Impulse model:
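For example, with GNU screen (the device path and baud rate below are assumptions; check your system):

```shell
# Open the board's USB serial port to talk to the AT command interpreter
# (device path and baud rate are assumptions -- adjust for your setup):
screen /dev/ttyUSB0 115200
```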
The Grove Vision AI Module V2 (Himax WiseEye2) is a highly efficient MCU-based smart vision module driven by the Himax WiseEye2 HX6538 processor, featuring a dual-core Arm Cortex-M55 and an integrated Arm Ethos-U55 neural network unit. It integrates Arm Helium technology, which is finely optimized for vector data processing and enables a significant uplift in DSP and ML capabilities without compromising power consumption, making it ideal for battery-powered applications.
Capabilities: Utilizes WiseEye2 HX6538 processor with a dual-core Arm Cortex-M55 and integrated Arm Ethos-U55 neural network unit.
Versatile AI Model Support: Easily deploy off-the-shelf or your custom AI models from SenseCraft AI, including Mobilenet V1, V2, Efficientnet-lite, Yolo v5 & v8. TensorFlow and PyTorch frameworks are supported.
Rich Peripheral Devices: Includes PDM microphone, SD card slot, Type-C, Grove interface, and other peripherals.
High Compatibility: Compatible with the XIAO series, Arduino, Raspberry Pi, and ESP dev boards, making further development easy.
Fully Open Source: All code, design files, and schematics are available for modification and use.
Quick links access:
Firmware source code: GitHub repository
Edge Impulse pre-compiled firmware: seeed-grove-vision-ai-module-v2.zip
To set this board up in Edge Impulse, you will need to install the following software:
Note: Make sure that you have CLI tools version 1.27.1 or later. You can check it with:
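One way to check, assuming the CLI was installed globally via npm:

```shell
# Print the installed edge-impulse-cli version:
npm list -g edge-impulse-cli
```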
On Linux, please install screen:
Problems installing the Edge Impulse CLI?
See the Installation and troubleshooting guide.
With all the software in place it's time to connect the board to Edge Impulse.
The board does not come with the right Edge Impulse firmware yet. To update the firmware:
Download the latest Edge Impulse firmware and extract it
Connect the board to the PC/Mac/Linux via USB Type-C cable
Within the extracted firmware zip file, there are install scripts to flash your device:
For MacOS:
For Windows:
For Linux:
Additionally, you need to flash the model file to your board. You can find the model in the model_vela.tflite file.
Clone the firmware source code from our repository
Open terminal in the root of the repository
Install required Python packages
Flash the model
In each case, you will select the serial port for your device and the flashing script will perform the firmware update.
Note: If the flashing script waits for you to press the "reset" (RST) button but never moves on from that point, it's likely that you have an outdated himax-flash-tool and need to update your host's install per the instructions above.
From a command prompt or terminal, run:
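As elsewhere in this guide, the command is the Edge Impulse serial daemon (assuming the CLI tools are installed):

```shell
# Start the serial daemon to connect the board to your project:
edge-impulse-daemon
```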
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. If you want to switch projects, run the command with --clean.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
With everything set up you can now build and run your first machine learning model with these tutorials:
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
After building the machine learning model and downloading the Edge Impulse firmware from Edge Impulse Studio, deploy the model to your Seeed Grove Vision AI Module V2 via steps 1 and 2 of "Connecting to Edge Impulse" above.
The Synaptics Katana KA10000 board is a low-power AI evaluation kit from Synaptics that has the KA10000 AI Neural Network processor onboard. The evaluation kit is provided with a separate Himax HM01B0 QVGA monochrome camera module and 2 onboard zero-power Vesper microphones. The board has an embedded ST LIS2DW12 accelerometer and an optional TI OPT3001 ambient light sensor. Connectivity is provided by an IEEE 802.11n ultra-low-power WiFi module with integrated Bluetooth 5.x, in addition to 4 Peripheral Module (PMOD) connectors that provide I2C, UART, GPIO, and I2S/SPI interfaces.
The package contains several accessories:
The Himax image sensor.
The PMOD-I2C USB firmware configuration board.
The PMOD-UART USB adapter.
2 AAA batteries
Enclosure.
To set this device up in Edge Impulse, you will need to install the following software:
On Linux:
GNU Screen: install for example via sudo apt install screen.
In order to update the firmware, it is necessary to use the PMOD-I2C USB firmware configuration board. The PMOD-I2C board is connected to the Katana board on the north right PMOD-I2C interface (as shown in the image at the top of this page); then use a USB-C cable to connect the firmware configuration board to the host PC.
In addition to the PMOD-I2C configuration board, you need to connect the PMOD-UART extension to the Katana board, which is located on the left side of the board. Then use a Micro-USB cable to connect the board to your computer.
The board is shipped originally with a sound detection firmware by default. You can upload new firmware to the flash memory by following these instructions:
Verify that you have correctly connected the firmware configuration board.
Run the flash script for your operating system (flash_windows.bat, flash_mac.command or flash_linux.sh) to flash the firmware.
Wait until flashing is complete.
From a command prompt or terminal, run:
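The command referenced here is the Edge Impulse serial daemon (assuming the CLI from the prerequisites is installed):

```shell
# Connect the Katana board to your Edge Impulse project:
edge-impulse-daemon
```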
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. If you want to switch projects, run the command with --clean.
With everything set up you can now build your first machine learning model with these tutorials, and board-specific public projects:
The Silicon Labs xG24 Dev Kit (xG24-DK2601B) is a compact, feature-packed development platform built for the EFR32MG24 Cortex-M33 microcontroller. It provides the fastest path to develop and prototype wireless IoT products. This development platform supports up to +10 dBm output power and includes support for the 20-bit ADC as well as the xG24's AI/ML hardware accelerator. The platform also features a wide variety of sensors, a microphone, Bluetooth Low Energy and a battery holder - and it's fully supported by Edge Impulse! You'll be able to sample raw data as well as build and deploy trained machine learning models directly from the Edge Impulse Studio - and even stream your machine learning results over BLE to a phone.
To set this device up with Edge Impulse, you will need to install the following software:
Problems installing the CLI?
Edge Impulse Studio can collect data directly from your xG24 Dev Kit and also help you trigger in-system inferences to debug your model, but in order to allow Edge Impulse Studio to interact with your xG24 Dev Kit you first need to flash it with our base firmware image.
Then go to the "Flash" section on the left sidebar, and select the base firmware image file you downloaded in the first step above (i.e., the file named firmware-xg24.hex). You can now press the Flash button to load the base firmware image onto the xG24 Dev Kit.
Keep Simplicity Commander Handy
Simplicity Commander will be needed to upload any other project built on Edge Impulse, but the base firmware image only has to be loaded once.
With all the software in place, it's time to connect the xG24 Dev Kit to Edge Impulse.
Use a micro-USB cable to connect the development board to your computer.
From a command prompt or terminal, run:
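As in the other sections of this guide, the command is the Edge Impulse serial daemon:

```shell
# Connect the xG24 Dev Kit to your Edge Impulse project:
edge-impulse-daemon
```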
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. If you want to switch projects, run the command with --clean.
With everything set up you can now build your first machine learning model with these tutorials:
Our firmware is equipped with a simple BLE demo showing how to start/stop the inference over BLE and acquire the results.
To use the demo, first install the EFR Connect BLE Mobile App on your mobile phone:
Scan your neighborhood for BLE devices
Look for the device named Edge Impulse and tap Connect
Scroll down to the Unknown service with UUID DDA4D145-FC52-4705-BB93-DD1F295AA522 and select More Info
Select Write for characteristics with UUID 02AA6D7D-23B4-4C84-AF76-98A7699F7FE2
In the Hex field enter 01 and press Send. This will start inferencing; the device should start blinking its LEDs.
For another characteristic, with UUID 61A885A4-41C3-60D0-9A53-6D652A70D29C, enable Notify and observe the reported inference results.
To stop the inference, send 00 to the characteristic 02AA6D7D-23B4-4C84-AF76-98A7699F7FE2.
To set this device up in Edge Impulse, you will need to install the following software:
Download the Nicla Voice firmware for audio or IMU below and connect the USB cable to your computer:
The archive contains different scripts to flash the firmware on your OS, e.g., for macOS:
install_lib_mac.command: installs the Arduino Core for the Nicla board and the pyserial package required to update the NDP120 chip. You only need to run this script once.
flash_mac.command: flashes both the MCU and the NDP120 chip. You should use this script on a brand new board.
The additional scripts below can be used for specific actions:
flash_mac_mcu.command: flashes only the Nordic MCU, e.g., if you recompiled the firmware and don't need to update the NDP120 model.
flash_mac_model.command: flashes only the NDP120 model.
format_mac_ext_flash.command: formats the external flash that contains the NDP120 model.
After flashing the MCU and NDP chips, connect the Nicla Voice directly to your computer's USB port. Linux, Mac OS, and Windows platforms are supported. From a command prompt or terminal, run:
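The command to run is the Edge Impulse serial daemon (assuming the edge-impulse-cli tools are installed):

```shell
# Connect the Nicla Voice to your Edge Impulse project:
edge-impulse-daemon
```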
This will start a wizard which will ask you to log in and choose an Edge Impulse project. If you want to switch projects, run the command with --clean.
Use syntiant compatible pre-processing blocks
The Arduino Nicla Voice is based on the Syntiant NDP120 Neural Decision Processor™ and needs to use dedicated Syntiant DSP blocks.
With everything set up you can now build your first machine learning model and evaluate it using the Arduino Nicla Voice Board with this tutorial:
How to label my classes? The NDP chip expects one, and only one, negative class, and it must be the last in the list. For instance, if your original dataset contains the labels yes, no, unknown, noise, and you only want to detect the keywords 'yes' and 'no', merge the 'unknown' and 'noise' labels into a single class such as z_openset (we prefix it with 'z' so that this class sorts last in the list).
If you get quarantine warnings on macOS when flashing the device, unquarantine the files and then rerun the flashing command.
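A common way to do this (an assumption; run from the unzipped firmware directory) is to strip the quarantine attribute from the flashing scripts with the macOS xattr utility:

```shell
# Remove the macOS quarantine attribute from the flashing scripts:
xattr -d com.apple.quarantine *.command
```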
IMU data acquisition - SD Card
An SD Card is required to use IMU data acquisition as the internal RAM of the MCU is too small. You don't need the SD Card for inferencing only or for audio projects.
To set this device up in Edge Impulse, you will need to install the following software:
Select one of the two firmware images below for audio or IMU projects:
Insert an SD Card if you need IMU data acquisition and connect the USB cable to your computer. Double-click on the script for your OS. The script will flash the Arduino firmware and a default model on the NDP101 chip.
Flashing issues
0x000000: read 0x04 != expected 0x01
Some flashing issues can occur on the Serial Flash. In this case, open a Serial Terminal on the TinyML board and send the command: :F. This will erase the Serial Flash and should fix the flashing issue.
Connect the Syntiant TinyML Board directly to your computer's USB port. Linux, Mac OS, and Windows 10 platforms are supported.
Audio - USB microphone (macOS/Linux only)
Check that the Syntiant TinyML enumerates as "TinyML" or "Arduino MKRZero". For example, in Mac OS you'll find it under System Preferences/Sound:
Audio acquisition - Windows OS
Using the Syntiant TinyML board as an external microphone for data collection doesn't currently work on Windows OS.
IMU
From a command prompt or terminal, run:
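As with the other boards in this guide, the command is the Edge Impulse serial daemon:

```shell
# Connect the Syntiant TinyML Board to your Edge Impulse project:
edge-impulse-daemon
```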
This will start a wizard which will ask you to log in and choose an Edge Impulse project. If you want to switch projects, run the command with --clean.
With everything set up you can now build your first machine learning model and evaluate it using the Syntiant TinyML Board with this tutorial:
How to label my classes? The NDP101 chip expects one, and only one, negative class, and it must be the last in the list. For instance, if your original dataset contains the labels yes, no, unknown, noise, and you only want to detect the keywords 'yes' and 'no', merge the 'unknown' and 'noise' labels into a single class such as z_openset (we prefix it with 'z' so that this class sorts last in the list).
Quick links access:
To set this board up in Edge Impulse, you will need to install the following software:
On Linux:
GNU Screen: install for example via sudo apt install screen.
Problems installing the Edge Impulse CLI?
With all the software in place it's time to connect the board to Edge Impulse.
BL702 is the USB-UART chip which enables the communication between the PC and the Himax chip. You need to update this firmware in order for the Edge Impulse firmware to work properly.
Connect the board to the PC via a USB Type-C cable while holding down the Boot button on the board
Open previously installed Bouffalo Lab Dev Cube software, select BL702/704/706, and then click Finish
Go to MCU tab. Under Image file, click Browse and select the firmware you just downloaded.
Click Refresh, choose the Port related to the connected board, set Chip Erase to True, click Open UART, click Create & Download, and wait for the process to be completed.
You will see the output as All Success if it went well.
Note: If the flashing throws an error, try to click Create & Download multiple times until you see the All Success message.
The board does not come with the right Edge Impulse firmware yet. To update the firmware:
Connect the board again to the PC via USB Type-C cable and double-click the Boot button on the board to enter mass storage mode
After this you will see a new storage drive shown in your file explorer as GROVEAI. Drag and drop the firmware.uf2 file onto the GROVEAI drive.
Once the copying is finished, the GROVEAI drive will disappear; this is how you can check whether the copy succeeded.
From a command prompt or terminal, run:
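The command referenced here is the Edge Impulse serial daemon (assuming the CLI is installed):

```shell
# Connect the Grove Vision AI board to your Edge Impulse project:
edge-impulse-daemon
```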
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. If you want to switch projects, run the command with --clean.
With everything set up you can now build and run your first machine learning model with these tutorials:
The Edge Impulse firmware for this board is open source and hosted on GitHub.
Download the latest Edge Impulse firmware, and unzip the file.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
Eggs AI:
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
The Edge Impulse firmware for this development board is open source and hosted on GitHub.
Simplicity Commander: a utility program we will use to flash firmware images onto the target.
The Edge Impulse firmware, which will enable you to connect your xG24 Dev Kit directly to Edge Impulse Studio, so that you can collect raw data and trigger in-system inferences.
See the Installation and troubleshooting guide.
Download the latest Edge Impulse firmware. Once downloaded, unzip it to obtain the firmware-xg24.hex file, which we will be using in the following steps.
Use a micro-USB cable to connect the xG24 Dev Kit to your development computer (where you downloaded and installed Simplicity Commander).
You can use Simplicity Commander to flash your xG24 Dev Kit with our base firmware image. To do this, first select your board from the dropdown list on the top left corner:
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices on the left sidebar. The device will be listed there:
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
Make sure your board is flashed with the Edge Impulse base firmware. Power on the board and run the EFR Connect BLE Mobile App
The Arduino Nicla Voice is a development board with a high-performance microphone and IMU, a Cortex-M4 nRF52832 MCU and the Syntiant NDP120 Neural Decision Processor™. The NDP120 supports multiple Neural Network architectures and is ideal for always-on low-power speech recognition applications. You'll be able to sample raw data, build models, and deploy trained embedded machine learning models directly from the Edge Impulse studio to create the next generation of low-power, high-performance audio interfaces.
The Edge Impulse firmware for this development board is open source and hosted on GitHub.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to , and click Devices. The device will be listed here.
How to use the Arduino-CLI with a macOS M1 chip? You will need to install Rosetta2 to run the Arduino-CLI.
The Syntiant TinyML Board is a development board with a microphone and accelerometer, a USB host microcontroller and an always-on Neural Decision Processor™, featuring ultra-low-power consumption, a fully connected neural network architecture, and full support from Edge Impulse. You'll be able to sample raw data, build models, and deploy trained embedded machine learning models directly from the Edge Impulse studio to create the next generation of low-power, high-performance audio interfaces.
The Edge Impulse firmware for this development board is open source and hosted on GitHub.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
Using the Arduino-CLI with a macOS M1 chip? You will need to install Rosetta2 to run the Arduino-CLI.
The Grove - Vision AI Module is a thumb-sized board based on the Himax HX6537-A processor, equipped with a 2-Megapixel OV2640 camera, microphone, 3-axis accelerometer and 3-axis gyroscope. It offers storage with 32 MB SPI flash, comes pre-installed with ML algorithms for face recognition and people detection, and supports customized models as well. It is compatible with the XIAO ecosystem and Arduino, all of which makes it perfect for getting started with AI-powered machine learning projects!
It is fully supported by Edge Impulse, which means you will be able to sample raw data from each of the sensors, build models, and deploy trained machine learning models to the module directly from the studio without any programming required. The Grove - Vision AI Module is available for purchase directly from Seeed Studio.
Firmware source code:
Pre-compiled firmware:
Download the latest Edge Impulse firmware (tinyuf2-grove_vision_ai_vX.X.X.bin) and extract it to obtain the firmware.uf2 file.
Problems installing the Edge Impulse CLI? See the Installation and troubleshooting guide.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
After building the machine learning model and downloading the Edge Impulse firmware from Edge Impulse Studio, deploy the model uf2 to the Grove - Vision AI by following steps 1 and 2 under "Connecting to Edge Impulse" above.
If you want to compile the Edge Impulse firmware from the source code, you can visit the firmware source repository and follow the instructions included in the README.
The model used for the official firmware can be found in this repository.
The Ensemble series of fusion processors from Alif Semiconductor utilize ARM's low power Cortex-M55 CPUs with dedicated Ethos-U55 microNPUs to run embedded ML workloads quickly and efficiently. The devices feature both 'High Power' cores designed for large model architectures, as well as 'High Efficiency' cores designed for low power continuous monitoring. The Development kit and AppKit are both fully supported by Edge Impulse. The Ensemble kits feature multiple core types, dual MEMS microphones, accelerometers, and a MIPI camera interface.
To get started with the Alif Ensemble and Edge Impulse you'll need the following hardware: Alif Ensemble AppKit or Alif Ensemble Development Kit.
To set this device up in Edge Impulse, you will need to install the following software:
The latest Alif Security Toolkit:
Navigate to the Alif Semiconductor Kit documentation page (you will need to register to create an account with Alif, or log in to your existing Alif account) and download the latest App Security Toolkit (tested with version 0.56.0) for Windows or Linux.
Extract the archive, and read through the included Security Toolkit Quick Start Guide to finalize the installation.
IMPORTANT: Set an environment variable called SETOOLS_ROOT pointing to the Security Toolkit root path. This is used by the Edge Impulse scripts when flashing the Alif development kits. Example instructions are available for Linux, Windows, and MacOS.
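For example, on Linux or macOS (the toolkit path below is an assumption; substitute your own extraction location):

```shell
# Point SETOOLS_ROOT at the extracted Security Toolkit directory
# (the path is a placeholder -- replace it with your own):
export SETOOLS_ROOT="$HOME/alif/app-release-exec-linux"
echo "SETOOLS_ROOT=$SETOOLS_ROOT"
```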
(Optional) Docker Desktop:
If you are using MacOS, we recommend installing Docker Desktop in order to use the Alif Security Toolkit for programming.
Once you have installed everything, it's time to connect the development board to Edge Impulse.
To interface the Alif Ensemble AppKit or Development Kit, you'll need to connect your device to the USB port labeled PRG USB.
Ethos-U55-128 library (High End Embedded, Shared SRAM) A C++ library with inferencing for devices with an Ethos-U55-128 NPU, High End Embedded with shared SRAM. For example: Alif E7 RTSS-HE.
Ethos-U55-256 library (High End Embedded, Shared SRAM) A C++ library with inferencing for devices with an Ethos-U55-256 NPU, High End Embedded with shared SRAM. For example: Alif E7 RTSS-HP.
Alif AI/ML Kit Gen2 HE core Binaries containing both the Edge Impulse data acquisition client and your full impulse.
Alif AI/ML Kit Gen2 HP core Binaries containing both the Edge Impulse data acquisition client and your full impulse.
Alif AI/ML Kit Gen2 HP core - tensor arena statically allocated to SRAM Binaries containing both the Edge Impulse data acquisition client and your full impulse.
Alif Dev Kit Gen2 HE core Binaries containing both the Edge Impulse data acquisition client and your full impulse.
Alif Dev Kit Gen2 HP core Binaries containing both the Edge Impulse data acquisition client and your full impulse.
Alif Dev Kit Gen2 HP core - tensor arena statically allocated to SRAM Binaries containing both the Edge Impulse data acquisition client and your full impulse.
Ethos-U55-128 Open CMSIS Pack A C++ library in Open CMSIS pack format with inferencing for devices with an Ethos-U55-128 NPU, High End Embedded with shared SRAM. For example: Alif E7 RTSS-HE.
Ethos-U55-256 Open CMSIS Pack A C++ library in Open CMSIS pack format with inferencing for devices with an Ethos-U55-256 NPU, High End Embedded with shared SRAM. For example: Alif E7 RTSS-HP.
You can program the device and use its serial port if you adjust jumper J15 to connect pins 1-3 and 2-4. Two serial ports will be enumerated: the first is used for programming, the second for serial communication.
Inspect isp_config_data.cfg in the Security Toolkit directory to ensure the COM port is set correctly for the device attached to your computer.
After configuring the hardware, the next step is to flash the default Edge Impulse Firmware. This will allow us to collect data directly from your Ensemble device. To update the firmware:
Download the latest Edge Impulse firmware binary and unzip the file.
Open a terminal in the unzipped folder and run the following commands. Use the HE, HP, or HP_SRAM parameter that matches the deployment chosen from the Edge Impulse project. That is, if you deployed for HP_SRAM, use the HP_SRAM parameters.
MACOS or LINUX
WINDOWS
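The authoritative commands are in the scripts and README shipped inside the firmware archive. As a rough sketch only, flashing with the Alif Security Toolkit (SETOOLS) generally means copying the image into the toolkit tree, regenerating the application table of contents, and writing it to MRAM; every file and config name below is an assumption, not the real release layout:

```shell
# Sketch only -- all paths, file names, and config names here are assumptions.
# Consult the README bundled with the Edge Impulse firmware download.

# 1. Copy the firmware image for your core (HE shown) into the toolkit's images folder
cp firmware-alif-he.bin <security-toolkit>/build/images/

# 2. From the Security Toolkit directory, regenerate the application TOC
#    and write the package to device MRAM over the ISP serial port
cd <security-toolkit>
./app-gen-toc -f build/config/<your-config>.json
./app-write-mram -p
```

The `<security-toolkit>` and `<your-config>` placeholders must be filled in from your own installation; the bundled scripts encode the correct ones for each HE/HP/HP_SRAM variant.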
To use the serial port of the device, adjust jumper J15 to connect pins 1-3 and 2-4.
Now, the Ensemble device can connect to the Edge Impulse CLI installed earlier. To test the CLI for the first time, either:
Create a new project from the Edge Impulse project dashboard
OR
Clone an existing Edge Impulse public project, like this Face Detection Demo. Click the link and then press Clone at the top right of the public project.
Then, from a command prompt or terminal on your computer, run:
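The command referred to here is the Edge Impulse CLI daemon, installed earlier with the CLI tools:

```shell
edge-impulse-daemon
```

If the device was previously attached to a different project, run `edge-impulse-daemon --clean` to choose a project again.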
Device choice
You may see two FTDI or CYPRESS serial ports enumerated for devices. If so, select the second entry in the list, which is generally the serial data connection to the Ensemble device. Ensure that the jumpers are correctly oriented for serial communication.
This will start a wizard which will ask you to log in and choose an Edge Impulse project. You should see your new or cloned project listed on the command line. Use the arrow keys and hit Enter to select your project.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project and click Devices. The device will be listed there.
With everything set up you can now build your first machine learning model with these tutorials. This will walk you through the process of collecting data and training a new ML model:
Alternatively, you can test on-device inference with a demo model included in the base firmware binary. To do this, you may run the following command from your terminal:
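The command in question is the Edge Impulse CLI's run tool, which starts the demo model on the device and streams inference results back over the serial connection:

```shell
edge-impulse-run-impulse
```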
Then, once you've tested out training and deployment with the Edge Impulse Firmware, learn how to integrate impulses with your own custom Ensemble based application:
The STM32N6570-DK Discovery kit is a development board with the STM32N657X0H3Q Arm® Cortex®‑M55‑based microcontroller featuring ST Neural-ART Accelerator, H264 encoder, NeoChrom 2.5D GPU and 4.2 MB of contiguous SRAM. The kit also includes a camera and microphone with extensions to support other sensors. The kit is fully supported by Edge Impulse. You'll be able to sample raw data, build models, and deploy trained machine learning models directly from the studio.
To set this device up in Edge Impulse, you will need to install the following software:
STM32 Cube Programmer. The firmware provided by Edge Impulse uses the CLI abilities of this program. Please add the program's location to your PATH, following your operating system's instructions.
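As an example, on macOS the CLI binary typically lives inside the STM32CubeProgrammer application bundle; the exact path below is an assumption and should be adjusted to your actual install location:

```shell
# Append the STM32CubeProgrammer CLI directory to PATH.
# The install path shown is an assumption (macOS default-style location);
# adjust it to wherever STM32CubeProgrammer is installed on your system.
export PATH="$PATH:/Applications/STMicroelectronics/STM32Cube/STM32CubeProgrammer/STM32CubeProgrammer.app/Contents/MacOs/bin"
```

Add the equivalent line to your shell profile (for example ~/.zshrc) to make the change permanent; on Windows, edit the PATH environment variable through System Properties instead.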
Use a USB-C cable to connect the development board to your computer via the USB-C connector CN6, marked with STLINK-V3EC. Depending on your connection you may need additional power, which can be added via the USB-C connector CN8, marked USB1.
If you are powering the board with just CN6, place jumper JP2 on the two pins at the top (1-2: 5V_STLK).
If you are powering the board with CN6 and CN8, place jumper JP2 on the two pins in the middle (3-4: 5V_USB_STLK).
If you have a new kit you may need to update the firmware of the onboard ST-LINK. Open the STM32 Cube Programmer application and follow the instructions found here on ST's website. Typically, you only need to click a few buttons to easily update the firmware on the ST-LINK device.
Three binaries must be programmed in the board external flash using the following procedure:
Download the default Edge Impulse firmware, model, and bootloader here. See the Readme and the scripts for the appropriate commands to flash the three files.
Switch BOOT1 on the board to the right position and reset the board using button B1, labeled NRST.
Program ai_fsbl_cut_2_0.hex (first stage bootloader)
Program network_data.hex (network parameters; to be changed only when the network is changed)
Program firmware-st-stm32n6.bin (firmware application)
Switch BOOT1 on the board to the left position and reset the board using button B1, labeled NRST.
When deploying a new binary, both network_data.hex and firmware-st-stm32n6.bin need to be flashed. The bootloader only needs to be programmed once.
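The programming steps above are typically driven by STM32CubeProgrammer's CLI with the external-flash loader for the Discovery kit. The scripts bundled with the firmware download contain the authoritative commands; the sketch below is only illustrative, and the loader name and write address are assumptions:

```shell
# Sketch only -- loader name and the .bin write address are assumptions;
# use the scripts bundled with the Edge Impulse firmware download.
STM32_Programmer_CLI -c port=SWD mode=HOTPLUG -el MX66UW1G45G_STM32N6570-DK.stldr -w ai_fsbl_cut_2_0.hex
STM32_Programmer_CLI -c port=SWD mode=HOTPLUG -el MX66UW1G45G_STM32N6570-DK.stldr -w network_data.hex
STM32_Programmer_CLI -c port=SWD mode=HOTPLUG -el MX66UW1G45G_STM32N6570-DK.stldr -w firmware-st-stm32n6.bin <flash-address>
```

The .hex files carry their own target addresses; the raw .bin needs an explicit external-flash address, which the bundled scripts supply.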
Please visit the STM32N6 Series site for further programming information.
To start acquiring data from the device, open a command prompt or terminal and run:
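As on other Edge Impulse targets, the acquisition command here is the CLI daemon:

```shell
edge-impulse-daemon
```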
This will start a wizard which will ask you to log in, choose an Edge Impulse project, and establish communication between your development kit and the chosen Edge Impulse project. If you want to switch projects, run the command with --clean.
At this time, only Object Detection projects using FOMO or YOLOv5 are supported with the N6 on Edge Impulse.
To start inferencing with a model on this device, you need to complete an Edge Impulse project, download the binary from the project, and flash it using the steps above. Once flashed, you may use the command:
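The command here is the Edge Impulse CLI's run tool, which starts the flashed model and prints inference results over the serial connection:

```shell
edge-impulse-run-impulse
```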
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
With everything set up you can now build your first machine learning model with these tutorials:
There are two deployment options available for the STM32N6. If you are using the dev kit with the default connections, you may get a fully built binary for your kit using the ST STM32N6 option. If you want to work with the source code and use it on your own device, select the ST Neural-ART library deployment option. The ST Neural-ART option will generate source code that uses the ST Neural-ART accelerator found in the N6 series.
Errors during flashing: Fully remove power from the board and then reconnect it. Do not rely on the reset button to clear this error.
USB-C cables: Inspect your USB cables and swap in ones that have both power and data lines. Attempt again with certified USB cables.