Community board
This is a community board by RAKwireless and is not maintained by Edge Impulse. For support, head to the RAKwireless homepage or the RAKwireless forums.
The RAKwireless WisBlock is a modular development system that lets you combine different cores and sensors to easily construct your next Internet of Things (IoT) device. The following WisBlock cores work with Edge Impulse:
RAK11200 (ESP32)
RAK4631 (nRF52840)
RAK11310 (RP2040)
RAKwireless has created an in-depth tutorial on how to get started using the WisBlock with Edge Impulse, including collecting raw data from a 3-axis accelerometer or a microphone, training a machine learning model, and deploying the model to the WisBlock core.
A WisBlock starter kit can be found in the RAKwireless store.
Install the following software:
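The exact software list is covered in the RAKwireless guide linked below; as a minimal sketch, installing the Edge Impulse CLI tools (assuming Node.js and npm are already on your machine) looks like:

```bash
# Install the Edge Impulse CLI (daemon, uploader, data forwarder) globally via npm
npm install -g edge-impulse-cli
```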
Follow the guide for your particular core to collect data, train a machine learning model, and deploy it to your WisBlock:
By the end of the guide, you should have machine learning inference running locally on your WisBlock!
Community board
This is a community board by Blues Wireless and is not maintained by Edge Impulse. For support, head to the Blues Wireless homepage.
The Blues Wireless Swan is a development board featuring a 120MHz ARM Cortex-M4 from STMicroelectronics with 2MB of flash and 640KB of RAM. Blues Wireless has created an in-depth tutorial on how to get started using the Swan with Edge Impulse, including how to collect new data from a triple axis accelerometer and how to train and deploy your Edge Impulse models to the Swan. For more details and ordering information, visit the Blues Wireless Swan product page.
To set up your Blues Wireless Swan, follow this complete guide: Using Swan with Edge Impulse.
The Blues Wireless Swan tutorial will guide you through how to create a simple classification model with an accelerometer designed to analyze movement over a brief period of time (2 seconds) and infer how the motion correlates to one of the following four states:
Idle (no motion)
Circle
Slash
An up-and-down motion in the shape of the letter "W"
For more insight into using a triple axis accelerometer to build an embedded machine learning model, visit the Edge Impulse continuous motion recognition tutorial.
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
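As a sketch of how the forwarder is used: your firmware prints comma-separated sensor readings over a serial connection, and the Edge Impulse CLI picks them up and streams them into your project:

```bash
# Reads CSV-formatted sensor values from the serial port and forwards them
# to your Edge Impulse project; add --clean to switch projects
edge-impulse-data-forwarder
```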
With the impulse designed, trained, and verified, you can deploy this model back to your Blues Wireless Swan. This makes the model run without an internet connection, minimizes latency, and runs with minimal power consumption. Edge Impulse can package the complete impulse - including the signal processing code, neural network weights, and classification code - into a single library that you can run on your development board. See the end of Blues Wireless' [Using Swan with Edge Impulse](https://dev.blues.io/swan/using-swan-with-edge-impulse) tutorial for more information on deploying your model onto the device.
Community board
This is a community board by Seeed Studio and is not maintained by Edge Impulse. For support, head to the Seeed Forum.
The Seeed Wio Terminal is a development board from Seeed Studio with a Cortex-M4 microcontroller, motion sensors, an LCD display, and Grove connectors to easily connect external sensors. Seeed Studio has added support for this development board to Edge Impulse, so you can sample raw data and build machine learning models from the studio. The board is available for 29 USD directly from Seeed.
To set up your Seeed Wio Terminal, follow this guide: Wio Terminal Edge Impulse Getting Started - Seeed Wiki.
With everything set up you can now build your first machine learning model with this full end-to-end course from Seeed's EDU team: TinyML with Wio Terminal Course.
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
With the impulse designed, trained, and verified, you can deploy this model back to your Wio Terminal. This makes the model run without an internet connection, minimizes latency, and runs with minimal power consumption. Edge Impulse can package the complete impulse - including the signal processing code, neural network weights, and classification code - into a single library that you can run on your development board.
The easiest way to deploy your impulse to the Seeed Wio Terminal is via an Arduino library. See Running your impulse locally on your Arduino for more information.
Community board
This is a community board by Arducam and is not maintained by Edge Impulse. For support, head to the Arducam support page.
The Arducam Pico4ML TinyML Dev Kit is a development board from Arducam with an RP2040 microcontroller, QVGA camera, Bluetooth module (depending on your version), LCD screen, onboard microphone, accelerometer, gyroscope, and compass. Arducam has created in-depth tutorials on how to get started using the Pico4ML Dev Kit with Edge Impulse, including how to collect new data and how to train and deploy your Edge Impulse models to the Pico4ML. The Arducam Pico4ML TinyML Dev Kit comes in two versions: the version with BLE is available for 55 USD and the version without BLE is available for 50 USD.
To set up your Arducam Pico4ML TinyML Dev Kit, follow this guide: Arducam: How to use Edge Impulse to train machine learning models for Raspberry Pico.
With everything set up you can now build your first machine learning model with the Edge Impulse continuous motion recognition tutorial.
Or you can follow Arducam's tutorial on How to build a Magic Wand with Edge Impulse for Arducam Pico4ML-BLE.
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
With the impulse designed, trained, and verified, you can deploy this model back to your Arducam Pico4ML TinyML Dev Kit. This makes the model run without an internet connection, minimizes latency, and runs with minimal power consumption. Edge Impulse can package the complete impulse - including the signal processing code, neural network weights, and classification code - into a single library that you can run on your development board. See the end of Arducam's How to use Edge Impulse to train machine learning models for Raspberry Pico tutorial for more information on deploying your model onto the device.
Community board
This is a community board by Seeed Studio and is not maintained by Edge Impulse. For support, head to the Seeed Forum.
The Seeed Studio XIAO nRF52840 Sense is built around the Nordic nRF52840 chip with an FPU, running at up to 64 MHz. It offers multiple development ports, Bluetooth 5.0 wireless connectivity, and low power consumption. With an onboard IMU and a PDM microphone, it is well suited to embedded machine learning projects. Seeed Studio has added support for this development board to Edge Impulse, so you can sample raw data and build machine learning models from the studio. The board is available for 15.99 USD directly from Seeed.
To set up your Seeed Studio XIAO nRF52840 Sense, follow this guide: Seeed Studio XIAO nRF52840 Sense Edge Impulse Getting Started - Seeed Wiki.
With everything set up you can now build your first machine learning model: Building a machine learning model - Seeed Wiki.
Looking to connect different sensors? The Data forwarder lets you easily send data from any sensor into Edge Impulse.
With the impulse designed, trained, and verified, you can deploy this model back to your XIAO nRF52840 Sense. This makes the model run without an internet connection, minimizes latency, and runs with minimal power consumption. Edge Impulse can package the complete impulse - including the signal processing code, neural network weights, and classification code - into a single library that you can run on your development board.
The easiest way to deploy your impulse to the Seeed XIAO nRF52840 Sense is via an Arduino library. See Running your impulse locally on your Arduino for more information.
The reComputer for Jetson series is a line of compact edge computers built with NVIDIA's advanced embedded AI systems: Jetson-10 (Nano) and Jetson-20 (Xavier NX). With rich extension modules, industrial peripherals, and thermal management, combined with decades of Seeed's hardware expertise, the reComputer for Jetson is ready to help you accelerate and scale next-generation AI products across diverse scenarios.
You can easily add an external USB microphone or camera - both fully supported by Edge Impulse. You'll be able to sample raw data, build models, and deploy trained machine learning models directly from the Studio. Currently, four versions have been launched; see the reComputer Series Introduction web page.
This guide has only been tested with the reComputer J1020.
In addition to the Jetson Nano, we recommend that you also add a camera and/or a microphone. Most popular USB webcams work fine on the development board out of the box.
You will also need the following equipment to complete your first boot.
A monitor with an HDMI interface (for the A206 carrier board, a DP interface monitor can also be used)
A mouse and keyboard
An Ethernet cable or an external Wi-Fi adapter (there is no built-in Wi-Fi on the Jetson)
The reComputer ships with an operating system pre-flashed. Before using it, you need to complete a few configuration steps: follow the reComputer Series Getting Started web page. When completed, open a new terminal by pressing Ctrl + Alt + T.
Issue the following command to check:
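For example (one reasonable check; the original guide may use a different command), confirm the L4T/JetPack release:

```bash
# Print the L4T (JetPack) release info; this file is present on Jetson images
cat /etc/nv_tegra_release
```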
The result should look similar to this:
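```bash
# Illustrative output only; the release and revision will match your image:
# R32 (release), REVISION: 7.1
```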
To set this device up in Edge Impulse, run the following commands (from any folder). When prompted, enter the password you created for the user on your Jetson in step 1. The entire script takes a few minutes to run (using a fast microSD card).
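```bash
# Setup script from the Edge Impulse Linux docs at the time of writing;
# installs the edge-impulse-linux CLI and its dependencies
wget -q -O - https://cdn.edgeimpulse.com/firmware/linux/jetson.sh | bash
```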
With all software set up, connect your camera or microphone to your Jetson (see 'Next steps' further on this page if you want to connect a different sensor), and run:
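```bash
# Starts the device wizard that connects this Jetson to your project
edge-impulse-linux
```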
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. If you want to switch projects, run the command with --clean.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
With everything set up you can now build your first machine learning model with these tutorials:
Looking to connect different sensors? Our Linux SDK lets you easily send data from any sensor and any programming language (with examples in Node.js, Python, Go and C++) into Edge Impulse.
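For instance, the Python bindings can be installed from PyPI (a sketch; see the Linux SDK docs for the other languages):

```bash
# Python flavor of the Edge Impulse Linux SDK
pip3 install edge_impulse_linux
```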
To run your impulse locally, just connect to your Jetson again, and run:
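```bash
# Downloads the compiled model and starts classifying on-device
edge-impulse-linux-runner
```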
This will automatically compile your model with full hardware acceleration, download the model to your Jetson, and then start classifying. Our Linux SDK has examples on how to integrate the model with your favourite programming language.
If you have an image model, you can get a peek at what your device sees by being on the same network as your device and finding the 'Want to see a feed of the camera and live classification in your browser' message in the console. Open the URL in a browser and both the camera feed and the classification are shown.
Due to some incompatibilities we don't run models on the GPU by default. You can enable this by following the TensorRT instructions in the C++ SDK.
This is probably caused by a missing dependency on libjpeg. If you run:
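```bash
# Print libvips' build configuration (assuming the setup script installed vips)
vips --vips-config
```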
The end of the output should show support for file import/export with libjpeg, like so:
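```bash
# Illustrative excerpt; exact wording varies by libvips version
file import with libjpeg: yes (pkg-config)
file export with libjpeg: yes (pkg-config)
```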
If you don't see jpeg support as "yes", rerun the setup script and take note of any errors.
If you encounter this error, ensure that your entire home directory is owned by you (especially the .config folder):
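```bash
# Re-take ownership of your home directory (including ~/.config)
sudo chown -R $(whoami) $HOME
```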
By default, the Jetson Nano enables a number of aggressive power saving features that disable or slow down hardware it detects as not in use. Experience indicates that sometimes the GPU cannot power up fast enough, nor stay on long enough, to deliver the best performance. You can run a script to enable maximum performance on your Jetson Nano.
ONLY DO THIS IF YOU ARE POWERING YOUR JETSON NANO FROM A DEDICATED POWER SUPPLY. DO NOT RUN THIS SCRIPT WHILE POWERING YOUR JETSON NANO THROUGH USB.
To enable maximum performance, run:
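```bash
# A typical way to do this (an assumption; check NVIDIA's docs for your JetPack version):
# select the maximum power mode, then pin the clocks to their maximum frequencies
sudo nvpmodel -m 0
sudo /usr/bin/jetson_clocks
```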
Hackster.io tutorial: Train an embedded machine learning model with Edge Impulse to detect hard hats and deploy it to the reComputer J1010 for Jetson Nano.
The AM62X development platform from Texas Instruments features a quad-core Arm Cortex-A53 running at 1.4 GHz. This general-purpose microprocessor supports 1080p displays through HDMI, 5 MP camera input through MIPI-CSI2 (including Raspberry Pi camera support), and multichannel audio. The Linux distribution for this device comes with TensorFlow Lite, ONNX Runtime, OpenCV, and GStreamer, all with Python bindings and C++ libraries.
For more details and ordering information, please visit Texas Instruments' website.
| Product | reComputer J1010 | reComputer J1020 | reComputer J2011 | reComputer J2012 |
| --- | --- | --- | --- | --- |
| SKU | 110061362 | 110061361 | 110061363 | 110061401 |
| Equipped Module | Jetson Nano 4GB | Jetson Nano 4GB | Jetson Xavier NX 8GB | Jetson Xavier NX 16GB |
| Carrier Board | J1010 Carrier Board | Jetson A206 | Jetson A206 | Jetson A206 |
| Power Interface | Type-C connector | DC power adapter | DC power adapter | DC power adapter |