The Grove Vision AI Module V2 (Himax WiseEye2) is a highly efficient MCU-based smart vision module driven by the Himax WiseEye2 HX6538 processor, featuring a dual-core Arm Cortex-M55 and an integrated Arm Ethos-U55 neural network unit. It integrates Arm Helium technology, which is finely optimized for vector data processing and enables a significant uplift in DSP and ML capabilities without compromising power consumption, making it ideal for battery-powered applications.
Capabilities: Utilizes WiseEye2 HX6538 processor with a dual-core Arm Cortex-M55 and integrated Arm Ethos-U55 neural network unit.
Versatile AI Model Support: Easily deploy off-the-shelf or custom AI models from SenseCraft AI, including MobileNet V1, V2, EfficientNet-Lite, and YOLO v5 & v8. TensorFlow and PyTorch frameworks are supported.
Rich Peripheral Devices: Includes PDM microphone, SD card slot, Type-C, Grove interface, and other peripherals.
High Compatibility: Compatible with the XIAO series, Arduino, Raspberry Pi, and ESP dev boards, making further development easy.
Fully Open Source: All code, design files, and schematics are available for modification and use.
Quick links:
Firmware source code: [GitHub repository] - Coming soon!
Edge Impulse pre-compiled firmware: [Edge Impulse Himax Repo] - Coming soon!
To set this board up in Edge Impulse, you will need to install the following software:
Edge Impulse CLI. To install the Edge Impulse CLI, run:
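Assuming Node.js and npm are already installed, the usual command is:
```
npm install -g edge-impulse-cli
```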
The himax-flash-tool is a component of the Edge Impulse CLI package that comes from Himax and is used specifically to flash Himax boards. It supports multiple boards and has several usage options:
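For example (the option names shown here are assumptions on our part; confirm them with the tool's built-in help):
```
# List supported devices and all available options
himax-flash-tool --help

# Flash a firmware image (assumed flags; verify against the --help output)
himax-flash-tool --device "WiseEye2" --firmware-path firmware.img
```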
On Linux, please install screen:
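For example, on Debian/Ubuntu-based distributions:
```
sudo apt install screen
```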
Problems installing the Edge Impulse CLI?
See the Installation and troubleshooting guide.
With all the software in place it's time to connect the board to Edge Impulse.
The board does not come with the right Edge Impulse firmware yet. To update the firmware:
Download the latest Edge Impulse firmware and extract it (Firmware coming soon!)
Connect the board to the PC/Mac/Linux via USB Type-C cable and press the black "reset" button (RST)
Within the extracted firmware zip file, there are install scripts to flash your device:
For MacOS:
For Windows:
For Linux:
In each case, you will select the serial port for your device and the flashing script will perform the firmware update.
NOTE: If the flashing script waits for you to press the "reset" (RST) button but never moves on from that point, it's likely that you have an outdated himax-flash-tool and need to update your host's installation per the instructions above.
From a command prompt or terminal, run:
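The command below assumes the Edge Impulse CLI installed earlier is on your PATH:
```
edge-impulse-daemon
```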
This will start a wizard which will ask you to log in and choose an Edge Impulse project. If you want to switch projects, run the command with --clean.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
With everything set up you can now build and run your first machine learning model with these tutorials:
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
After building the machine learning model and downloading the Edge Impulse firmware from Edge Impulse Studio, deploy the model to your Seeed Grove Vision AI Module V2 via steps 1 and 2 of "Connecting to Edge Impulse" above.
The Grove - Vision AI Module is a thumb-sized board based on the Himax HX6537-A processor, equipped with a 2-megapixel OV2640 camera, microphone, 3-axis accelerometer and 3-axis gyroscope. It offers 32 MB of SPI flash storage, comes pre-installed with ML algorithms for face recognition and people detection, and supports customized models as well. It is compatible with the XIAO ecosystem and Arduino, all of which makes it perfect for getting started with AI-powered machine learning projects!
It is fully supported by Edge Impulse, which means you will be able to sample raw data from each of the sensors, build models, and deploy trained machine learning models to the module directly from the studio without any programming required. The Grove - Vision AI Module is available for purchase directly from Seeed Studio.
Quick links:
To set this board up in Edge Impulse, you will need to install the following software:
On Linux:
GNU Screen: install it, for example, via sudo apt install screen.
Problems installing the Edge Impulse CLI?
With all the software in place it's time to connect the board to Edge Impulse.
The BL702 is the USB-UART chip that enables communication between the PC and the Himax chip. You need to update this firmware in order for the Edge Impulse firmware to work properly.
Connect the board to the PC via a USB Type-C cable while holding down the Boot button on the board
Open the previously installed Bouffalo Lab Dev Cube software, select BL702/704/706, and then click Finish.
Go to the MCU tab. Under Image file, click Browse and select the firmware you just downloaded.
Click Refresh, choose the Port related to the connected board, set Chip Erase to True, click Open UART, click Create & Download, and wait for the process to complete.
You will see the output as All Success if it went well.
Note: If the flashing throws an error, try clicking Create & Download multiple times until you see the All Success message.
The board does not come with the right Edge Impulse firmware yet. To update the firmware:
Connect the board again to the PC via USB Type-C cable and double-click the Boot button on the board to enter mass storage mode
After this you will see a new storage drive shown on your file explorer as GROVEAI. Drag and drop the firmware.uf2 file to GROVEAI drive
Once the copying is finished, the GROVEAI drive will disappear. This is how you can check whether the copying was successful.
From a command prompt or terminal, run:
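The usual command is the Edge Impulse serial daemon installed earlier:
```
edge-impulse-daemon
```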
This will start a wizard which will ask you to log in and choose an Edge Impulse project. If you want to switch projects, run the command with --clean.
With everything set up you can now build and run your first machine learning model with these tutorials:
The Arduino Nicla Voice is a development board with a high-performance microphone and IMU, a Cortex-M4 nRF52832 MCU and the Syntiant NDP120 Neural Decision Processor. The NDP120 supports multiple neural network architectures and is ideal for always-on low-power speech recognition applications. You'll be able to sample raw data, build models, and deploy trained embedded machine learning models directly from the Edge Impulse studio to create the next generation of low-power, high-performance audio interfaces.
The Edge Impulse firmware for this development board is open source and hosted on GitHub.
To set this device up in Edge Impulse, you will need to install the following software:
Download the Nicla Voice firmware for audio or IMU below and connect the USB cable to your computer:
The archive contains different scripts to flash the firmware on your OS; for example, on macOS (an example run is shown after this list):
install_lib_mac.command: installs the Arduino Core for the Nicla board and the pyserial package required to update the NDP120 chip. You only need to run this script once.
flash_mac.command: flashes both the MCU and the NDP120 chip. You should use this script on a brand new board.
The additional scripts below can be used for specific actions:
flash_mac_mcu.command: flashes only the Nordic MCU, e.g. if you recompiled the firmware and don't need to update the NDP120 model.
flash_mac_model.command: flashes only the NDP120 model.
format_mac_ext_flash.command: formats the external flash that contains the NDP120 model.
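For reference, a typical first-time flash on macOS might look like the following from a terminal (the folder name is hypothetical; you can also simply double-click the scripts in Finder):
```
cd nicla-voice-firmware        # hypothetical: the folder you extracted the archive to
sh install_lib_mac.command     # one-time: installs the Arduino Core and pyserial
sh flash_mac.command           # flashes both the MCU and the NDP120
```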
After flashing the MCU and NDP chips, connect the Nicla Voice directly to your computer's USB port. Linux, Mac OS, and Windows platforms are supported. From a command prompt or terminal, run:
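Assuming the Edge Impulse CLI is installed, the usual command is:
```
edge-impulse-daemon
```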
This will start a wizard which will ask you to log in and choose an Edge Impulse project. If you want to switch projects, run the command with --clean.
Use Syntiant-compatible pre-processing blocks
The Arduino Nicla Voice is based on the Syntiant NDP120 Neural Decision Processor™ and needs to use dedicated Syntiant DSP blocks.
With everything set up you can now build your first machine learning model and evaluate it using the Arduino Nicla Voice Board with this tutorial:
How to label my classes? The NDP chip expects one and only one negative class, and it should be the last in the list. For instance, if your original dataset looks like: yes, no, unknown, noise, and you only want to detect the keywords 'yes' and 'no', merge the 'unknown' and 'noise' labels into a single class such as z_openset (we prefix it with 'z' in order to get this class last in the list).
If you get quarantine warnings on MacOS when flashing the device, try the following command to unquarantine the files and then rerun the flashing command:
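One common approach is to clear the quarantine attribute with xattr, run from the folder containing the extracted firmware scripts:
```
# Recursively remove the macOS quarantine attribute from the extracted files
xattr -r -d com.apple.quarantine .
```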
The Avnet RASynBoard is a tiny (25 mm x 30 mm), ultra-low power, edge AI/ML board based on a Syntiant NDP120 Neural Decision Processor, a Renesas RA6M4 host MCU, and a power-efficient DA16600 Wi-Fi/BT combo module. The NDP120 subsystem, with on-board digital microphone, IMU motion sensor and SPI Flash memory, achieves highly efficient processing of acoustic and motion events. Battery and USB-C device connectors facilitate standalone use, while a compact under-board connector enables integration with custom OEM boards and additional sensors.
An IO board (50mm x 30mm) is included for implementation of a compact two-board evaluation kit assembly. This pins-out a subset of the NDP120 and RA6M4 I/Os to popular Pmod, Click header and expansion header footprints, enabling connection with additional external microphones and sensor options. An onboard debugger MCU (SWD and UART interfaces), button switches, RGB LED and removable MicroSD storage, further maximize prototyping versatility and utility.
NDP120 AI/ML models for popular use-cases (pre-engineered by various vendors) are loaded from local SPI Flash storage for efficient execution on the ultra-low power NDP120 neural accelerator device.
RA6M4 MCU application software development and debug is supported via the Renesas e2 Studio IDE, interfaced via the E2OB debugger MCU on the IO board.
Key Features
Accelerated Edge-AI and ML applications
Battery-powered remote sensor systems
Industrial smart sensors
Motor predictive maintenance
Always-on speech recognition and sensor fusion processing
Getting Started Guides may be found in Avnet's GitHub repositories:
To set this device up in Edge Impulse, you will need to install the following software. The Renesas software will require registration for a Renesas account.
Download the Edge Impulse RASynBoard firmware for audio or IMU below and connect the USB cable to your computer:
Follow the installation instructions included in the .zip archives.
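Then reconnect the board and, from a command prompt or terminal, run the Edge Impulse serial daemon (assuming the Edge Impulse CLI is installed):
```
edge-impulse-daemon
```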
This will start a wizard which will ask you to log in and choose an Edge Impulse project. If you want to switch projects, run the command with --clean.
Use Syntiant-compatible pre-processing blocks
The Avnet RASynBoard is based on the Syntiant NDP120 Neural Decision Processor™ and needs to use dedicated Syntiant DSP blocks.
With everything set up you can now build your first machine learning model and evaluate it using one of these tutorials:
How to label my classes? The NDP chip expects one and only one negative class, and it should be the last in the list. For instance, if your original dataset looks like: yes, no, unknown, noise, and you only want to detect the keywords 'yes' and 'no', merge the 'unknown' and 'noise' labels into a single class such as z_openset (we prefix it with 'z' in order to get this class last in the list).
The Silicon Labs xG24 Dev Kit (xG24-DK2601B) is a compact, feature-packed development platform built for the EFR32MG24 Cortex-M33 microcontroller. It provides the fastest path to develop and prototype wireless IoT products. This development platform supports up to +10 dBm output power and includes support for the 20-bit ADC as well as the xG24's AI/ML hardware accelerator. The platform also features a wide variety of sensors, a microphone, Bluetooth Low Energy and a battery holder - and it's fully supported by Edge Impulse! You'll be able to sample raw data as well as build and deploy trained machine learning models directly from the Edge Impulse Studio - and even stream your machine learning results over BLE to a phone.
The Edge Impulse firmware for this development board is open source and hosted on GitHub: .
To set this device up with Edge Impulse, you will need to install the following software:
Problems installing the CLI?
Edge Impulse Studio can collect data directly from your xG24 Dev Kit and also help you trigger in-system inferences to debug your model, but in order to allow Edge Impulse Studio to interact with your xG24 Dev Kit you first need to flash it with our base firmware image.
Then go to the "Flash" section on the left sidebar, and select the base firmware image file you downloaded in the first step above (i.e., the file named firmware-xg24.hex
). You can now press the Flash
button to load the base firmware image onto the xG24 Dev Kit.
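If you prefer the command line, Simplicity Commander also ships a CLI; a minimal sketch, assuming the commander executable is on your PATH and only one kit is connected:
```
commander flash firmware-xg24.hex
```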
Keep Simplicity Commander Handy
Simplicity Commander will be needed to upload any other project built on Edge Impulse, but the base firmware image only has to be loaded once.
With all the software in place, it's time to connect the xG24 Dev Kit to Edge Impulse.
Use a micro-USB cable to connect the development board to your computer.
From a command prompt or terminal, run:
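With the Edge Impulse CLI installed, the usual command is:
```
edge-impulse-daemon
```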
This will start a wizard which will ask you to log in and choose an Edge Impulse project. If you want to switch projects, run the command with --clean.
Our firmware is equipped with a simple BLE demo showing how to start/stop the inference over BLE and acquire the results.
To use the demo, first install the EFR Connect BLE Mobile App on your mobile phone:
Scan your neighborhood for BLE devices.
Look for the device named Edge Impulse and tap Connect.
Scroll down to the Unknown service with UUID DDA4D145-FC52-4705-BB93-DD1F295AA522 and select More Info.
Select Write for the characteristic with UUID 02AA6D7D-23B4-4C84-AF76-98A7699F7FE2, enter 01 in the Hex field, and press Send. This will start inferencing; the device should start blinking its LEDs.
For the characteristic with UUID 61A885A4-41C3-60D0-9A53-6D652A70D29C, enable Notify and observe the reported inference results.
To stop the inference, send 00 to the characteristic 02AA6D7D-23B4-4C84-AF76-98A7699F7FE2.
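The same start/stop flow can also be scripted from a desktop machine. Below is a minimal sketch using the Python bleak library (our own choice, not part of the firmware or the official instructions), with the characteristic UUIDs from the steps above:
```
import asyncio
from bleak import BleakScanner, BleakClient

CONTROL_CHAR = "02AA6D7D-23B4-4C84-AF76-98A7699F7FE2"   # write 01 to start, 00 to stop
RESULTS_CHAR = "61A885A4-41C3-60D0-9A53-6D652A70D29C"   # notifies inference results

def on_result(_, data: bytearray):
    # Print each inference result notification as it arrives
    print(data.decode(errors="replace"))

async def main():
    # Find the board advertising as "Edge Impulse"
    device = await BleakScanner.find_device_by_filter(
        lambda d, ad: d.name == "Edge Impulse"
    )
    assert device is not None, "Could not find a device named 'Edge Impulse'"
    async with BleakClient(device) as client:
        await client.start_notify(RESULTS_CHAR, on_result)
        await client.write_gatt_char(CONTROL_CHAR, bytearray([0x01]))  # start inferencing
        await asyncio.sleep(30)                                        # collect results for 30 s
        await client.write_gatt_char(CONTROL_CHAR, bytearray([0x00]))  # stop inferencing

asyncio.run(main())
```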
With everything set up you can now build your first machine learning model with these tutorials:
Firmware source code:
Pre-compiled firmware:
Download the latest Edge Impulse firmware (tinyuf2-grove_vision_ai_vX.X.X.bin) and extract it to obtain the firmware.uf2 file.
See the Installation and troubleshooting guide.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
After building the machine learning model and downloading the Edge Impulse firmware from Edge Impulse Studio, deploy the model .uf2 file to Grove - Vision AI by following steps 1 and 2 under "Connecting to Edge Impulse" above.
If you want to compile the Edge Impulse firmware from the source code, you can visit the GitHub repository and follow the instructions included in the README.
The model used for the official firmware can be found in this .
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
How to use Arduino-CLI with macOS M1 chip? You will need to install Rosetta2 to run the Arduino-CLI. See details on the Apple website.
After flashing the MCU per the instructions, please reconnect the Avnet RASynBoard directly to your computer's USB port. Linux, Mac OS, and Windows platforms are supported. From a command prompt or terminal, run the Edge Impulse daemon as shown above.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
RFP Error (E3000107): This device does not match the connection parameters. If you encounter this error while programming the Renesas device on the RASynBoard, please follow this guide.
Simplicity Commander: a utility program we will use to flash firmware images onto the target.
The Edge Impulse CLI, which will enable you to connect your xG24 Dev Kit directly to Edge Impulse Studio, so that you can collect raw data and trigger in-system inferences.
See the Installation and troubleshooting guide.
Download the base firmware image and unzip it to obtain the firmware-xg24.hex file, which we will be using in the following steps.
Use a micro-USB cable to connect the xG24 Dev Kit to your development computer (where you downloaded and installed Simplicity Commander).
You can use Simplicity Commander to flash your xG24 Dev Kit with our base firmware image. To do this, first select your board from the dropdown list on the top left corner:
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices on the left sidebar. The device will be listed there:
Make sure your board is flashed with the Edge Impulse base firmware. Power on the board and run the EFR Connect BLE Mobile App.
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
The Ensemble series of fusion processors from Alif Semiconductor utilize ARM's low power Cortex-M55 CPUs with dedicated Ethos-U55 microNPUs to run embedded ML workloads quickly and efficiently. The devices feature both 'High Power' cores designed for large model architectures, as well as 'High Efficiency' cores designed for low power continuous monitoring. The Development kit and AppKit are both fully supported by Edge Impulse. The Ensemble kits feature multiple core types, dual MEMS microphones, accelerometers, and a MIPI camera interface.
To get started with the Alif Ensemble and Edge Impulse you'll need the following hardware:
To set this device up in Edge Impulse, you will need to install the following software:
The latest Alif Security Toolkit:
Navigate to the Alif Semiconductor Kit documentation page (you will need to register to create an account with Alif, or log in to your existing Alif account) and download the latest App Security Toolkit (tested with version 0.56.0) for Windows or Linux.
Extract the archive, and read through the included Security Toolkit Quick Start Guide to finalize the installation.
(Optional) Docker Desktop:
If you are using macOS, we recommend installing Docker Desktop in order to use the Alif Security Toolkit for programming.
Once you have installed the software above, it's time to connect the development board to Edge Impulse.
To interface with the Alif Ensemble AppKit or Development Kit, you'll need to connect your device to the USB port labeled PRG USB.
You can program the device and use its serial port if you adjust jumper J15 to connect pins 1-3 and 2-4. There will be two serial ports enumerated; the first is used for programming, the second for serial communication.
After configuring the hardware, the next step is to flash the default Edge Impulse Firmware. This will allow us to collect data directly from your Ensemble device. To update the firmware:
Download the latest Edge Impulse firmware binary and unzip the file.
Navigate to the directory where you installed the Alif Security Toolkit
Copy the .bin files from the Edge Impulse firmware directory into the build/images directory of the Alif Security Toolkit.
Copy all .json files from the Edge Impulse firmware directory into the build/config directory of the Alif Security Toolkit.
Inspect the json files and paths closely to ensure the file names and paths are correctly specified for the binary you intend to flash. Inspect isp_config_data.cfg to ensure the COM port is set correctly to the device attached to your computer. There will be two serial ports enumerated; the first port is used for programming, the second for serial communication.
From a command prompt or terminal, run the following commands from the root of the Alif Security Toolkit folder:
MACOS or LINUX
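A sketch of the usual invocation, assuming the standard Security Toolkit utilities app-gen-toc and app-write-mram (the option names are assumptions; verify them with each tool's --help output):
```
./app-gen-toc -f build/config/<edge-impulse-config>.json   # assumed flag: build the table of contents from the JSON config
./app-write-mram -p                                        # assumed flag: program the images into MRAM
```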
WINDOWS
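And the equivalent sketch on Windows, under the same assumptions:
```
app-gen-toc.exe -f build\config\<edge-impulse-config>.json
app-write-mram.exe -p
```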
To use the serial port of the device, adjust jumper J15 to connect pins 1-3 and 2-4.
Now the Ensemble device can connect to the Edge Impulse CLI installed earlier. To test the CLI for the first time, either:
Create a new project from the Edge Impulse project dashboard
OR
Clone an existing Edge Impulse public project, like this Face Detection Demo. Click the link and then press Clone at the top right of the public project.
Then, from a command prompt or terminal on your computer, run:
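The usual command is the serial daemon from the Edge Impulse CLI installed earlier:
```
edge-impulse-daemon
```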
Device choice
You may see two FTDI or CYPRESS serial ports enumerated for devices. If so, select the second entry in the list, which generally is the serial data connection to the Ensemble device. Ensure that the jumpers are correctly oriented for serial communication.
This will start a wizard which will ask you to log in and choose an Edge Impulse project. You should see your new or cloned project listed on the command line. Use the arrow keys and hit Enter to select your project.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
With everything set up you can now build your first machine learning model with these tutorials. This will walk you through the process of collecting data and training a new ML model:
Alternatively, you can test on-device inference with a demo model included in the base firmware binary. To do this, you may run the following command from your terminal:
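The Edge Impulse CLI provides a runner for this; the usual invocation is:
```
edge-impulse-run-impulse           # runs the demo model included in the base firmware
edge-impulse-run-impulse --debug   # optionally, try --debug for a live view in your browser
```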
Then, once you've tested out training and deployment with the Edge Impulse Firmware, learn how to integrate impulses with your own custom Ensemble based application:
The Synaptics Katana KA10000 board is a low-power AI evaluation kit from Synaptics with the KA10000 AI Neural Network processor onboard. The evaluation kit comes with a separate Himax HM01B0 QVGA monochrome camera module and two onboard zero-power Vesper microphones. The board has an embedded ST LIS2DW12 accelerometer and an optional TI OPT3001 ambient light sensor. Connectivity is provided by an IEEE 802.11n ultra-low-power WiFi module integrated with Bluetooth 5.x, in addition to 4 Peripheral Module (PMOD) connectors that provide I2C, UART, GPIO, and I2S/SPI interfaces.
The package contains several accessories:
The Himax image sensor.
The PMOD-I2C USB firmware configuration board.
The PMOD-UART USB adapter.
2 AAA batteries
Enclosure.
The Edge Impulse firmware for this board is open source and hosted on GitHub: edgeimpulse/firmware-synaptics-ka10000.
To set this device up in Edge Impulse, you will need to install the following software:
On Linux:
GNU Screen: install it, for example, via sudo apt install screen.
In order to update the firmware, it is necessary to use the PMOD-I2C USB firmware configuration board. The PMOD-I2C board connects to the Katana board on the north-right PMOD-I2C interface (as shown in the image at the top of this page); then use a USB-C cable to connect the firmware configuration board to the host PC.
In addition to the PMOD-I2C configuration board, you need to connect the PMOD-UART extension to the Katana board, which is located on the left side of the board. Then use a Micro-USB cable to connect the board to your computer.
The board is shipped originally with a sound detection firmware by default. You can upload new firmware to the flash memory by following these instructions:
Download the latest Edge Impulse firmware, and unzip the file.
Verify that you have correctly connected the firmware configuration board.
Run the flash script for your operating system (flash_windows.bat, flash_mac.command or flash_linux.sh) to flash the firmware.
Wait until flashing is complete.
From a command prompt or terminal, run:
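Assuming the Edge Impulse CLI is installed, the usual command is:
```
edge-impulse-daemon
```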
This will start a wizard which will ask you to log in and choose an Edge Impulse project. If you want to switch projects, run the command with --clean.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
With everything set up you can now build your first machine learning model with these tutorials, and board-specific public projects:
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
The Syntiant TinyML Board is a tiny development board with a microphone and accelerometer, a USB host microcontroller and an always-on Neural Decision Processor™, featuring ultra-low-power consumption and a fully connected neural network architecture, and it is fully supported by Edge Impulse. You'll be able to sample raw data, build models, and deploy trained embedded machine learning models directly from the Edge Impulse studio to create the next generation of low-power, high-performance audio interfaces.
The Edge Impulse firmware for this development board is open source and hosted on GitHub.
IMU data acquisition - SD Card
An SD Card is required to use IMU data acquisition as the internal RAM of the MCU is too small. You don't need the SD Card for inferencing only or for audio projects.
To set this device up in Edge Impulse, you will need to install the following software:
Select one of the 2 firmwares below for audio or IMU projects:
Insert SD Card if you need IMU data acquisition and connect the USB cable to your computer. Double-click on the script for your OS. The script will flash the Arduino firmware and a default model on the NDP101 chip.
Flashing issues
0x000000: read 0x04 != expected 0x01
Some flashing issues can occur on the Serial Flash. In this case, open a serial terminal on the TinyML board and send the command :F. This will erase the Serial Flash and should fix the flashing issue.
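For example, using GNU Screen (the device path and baud rate shown are typical values and may differ on your system):
```
# Open a serial terminal to the board, then type :F and press Enter
screen /dev/tty.usbmodem14101 115200
```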
Connect the Syntiant TinyML Board directly to your computer's USB port. Linux, Mac OS, and Windows 10 platforms are supported.
Audio - USB microphone (macOS/Linux only)
Check that the Syntiant TinyML enumerates as "TinyML" or "Arduino MKRZero". For example, in Mac OS you'll find it under System Preferences/Sound:
Audio acquisition - Windows OS
Using the Syntiant TinyML board as an external microphone for data collection doesn't currently work on Windows OS.
IMU
From a command prompt or terminal, run:
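The usual command, with the Edge Impulse CLI installed, is:
```
edge-impulse-daemon
```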
This will start a wizard which will ask you to log in and choose an Edge Impulse project. If you want to switch projects, run the command with --clean.
Alternatively, recent versions of Google Chrome and Microsoft Edge can collect data directly from your development board, without the need for the Edge Impulse CLI. See this blog post for more information.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
With everything set up you can now build your first machine learning model and evaluate it using the Syntiant TinyML Board with this tutorial:
How to use Arduino-CLI with macOS M1 chip? You will need to install Rosetta2 to run the Arduino-CLI. See details on the Apple website.
How to label my classes? The NDP101 chip expects one and only one negative class, and it should be the last in the list. For instance, if your original dataset looks like: yes, no, unknown, noise, and you only want to detect the keywords 'yes' and 'no', merge the 'unknown' and 'noise' labels into a single class such as z_openset (we prefix it with 'z' in order to get this class last in the list).