You can use your Linux x86_64 device or computer as a fully-supported development environment for Edge Impulse for Linux. This lets you sample raw data, build models, and deploy trained machine learning models directly from the Studio. If you have a webcam and microphone plugged into your system, they are automatically detected and can be used to build models.
Instruction set architectures
If you are not sure about your instruction set architecture, use:
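For example, `uname -m` prints the machine architecture (`x86_64` for Intel/AMD machines, `aarch64` for 64-bit Arm):

```bash
uname -m
```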
To set this device up in Edge Impulse, run the following commands:
Ubuntu/Debian:
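A typical sequence on a Debian-based distribution looks like the following (a sketch of the standard Edge Impulse Linux setup; package names may differ on other distributions):

```bash
# Install Node.js 20.x from NodeSource
curl -sL https://deb.nodesource.com/setup_20.x | sudo bash -
# Install build tools plus audio/video dependencies (sox, GStreamer)
sudo apt install -y gcc g++ make build-essential nodejs sox gstreamer1.0-tools gstreamer1.0-plugins-good gstreamer1.0-plugins-base gstreamer1.0-plugins-base-apps
# Install the Edge Impulse Linux CLI globally
sudo npm install edge-impulse-linux -g --unsafe-perm
```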
Important: Edge Impulse requires Node.js version 20.x or later. Using older versions may lead to installation issues or runtime errors. Please ensure you have the correct version installed before proceeding with the setup.
With all software set up, connect your camera or microphone to your operating system (see 'Next steps' further on this page if you want to connect a different sensor), and run:
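With the Edge Impulse Linux CLI installed as above, the connection wizard is started with:

```bash
edge-impulse-linux
```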
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. If you want to switch projects, run the command with --clean.
That's all! Your machine is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
With everything set up you can now build your first machine learning model with these tutorials:
Counting objects using FOMO

Looking to connect different sensors? Our [Linux SDK](/tools/edge-impulse-for-linux/) lets you easily send data from any sensor and any programming language (with examples in Node.js, Python, Go and C++) into Edge Impulse.
To run your impulse locally, run the following on your Linux platform:
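This is handled by the runner from the Edge Impulse Linux CLI installed during setup:

```bash
edge-impulse-linux-runner
```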
This will automatically compile your model with full hardware acceleration, download the model to your local machine, and then start classifying. Our Linux SDK has examples on how to integrate the model with your favourite programming language.
If you have an image model then you can get a peek of what your device sees by being on the same network as your device, and finding the 'Want to see a feed of the camera and live classification in your browser' message in the console. Open the URL in a browser and both the camera feed and the classification are shown:
The Raspberry Pi 4 is a versatile Linux development board with a quad-core processor running at 1.5GHz, a GPIO header to connect sensors, and the ability to easily add an external microphone or camera - and it's fully supported by Edge Impulse. You'll be able to sample raw data, build models, and deploy trained machine learning models directly from the Studio.
In addition to the Raspberry Pi 4 we recommend that you also add a camera and / or a microphone. Most popular USB webcams and the Camera Module work fine on the development board out of the box.
To set this device up in Edge Impulse, run the following commands:
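On Raspberry Pi OS the setup mirrors the standard Debian sequence (a sketch; run these in a terminal on the Pi):

```bash
# Install Node.js 20.x from NodeSource
curl -sL https://deb.nodesource.com/setup_20.x | sudo bash -
# Install build tools plus audio/video dependencies (sox, GStreamer)
sudo apt install -y gcc g++ make build-essential nodejs sox gstreamer1.0-tools gstreamer1.0-plugins-good gstreamer1.0-plugins-base gstreamer1.0-plugins-base-apps
# Install the Edge Impulse Linux CLI globally
sudo npm install edge-impulse-linux -g --unsafe-perm
```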
Important: Edge Impulse requires Node.js version 20.x or later. Using older versions may lead to installation issues or runtime errors. Please ensure you have the correct version installed before proceeding with the setup.
If you have a Raspberry Pi Camera Module, you also need to activate it first. Run the following command:
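`raspi-config` is the standard Raspberry Pi configuration tool:

```bash
sudo raspi-config
```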
Use the cursor keys to select and open Interfacing Options, and then select Camera and follow the prompt to enable the camera. Then reboot the Raspberry Pi.
If you want to install Edge Impulse on your Raspberry Pi using Docker, you can run the following commands:
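A minimal sketch of such a container, assuming an Ubuntu base image and passing the host's devices and network through to the container (adjust the image tag and mounts to your setup):

```bash
docker run -it --privileged --network host -v /dev/:/dev/ ubuntu:22.04
```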
Once inside the container, install the dependencies and the Edge Impulse Linux CLI:
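A sketch assuming the Ubuntu image above (commands run as root inside the container, so no sudo is needed):

```bash
# Build tools, audio/video dependencies and Node.js 20.x
apt-get update
apt-get install -y curl gcc g++ make build-essential sox gstreamer1.0-tools gstreamer1.0-plugins-good gstreamer1.0-plugins-base gstreamer1.0-plugins-base-apps
curl -sL https://deb.nodesource.com/setup_20.x | bash -
apt-get install -y nodejs
# Edge Impulse Linux CLI
npm install edge-impulse-linux -g --unsafe-perm
```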
You should now be able to run the Edge Impulse CLI tools from the container running on your Raspberry Pi. Note that this will only work with an external USB camera.
With all software set up, connect your camera or microphone to your Raspberry Pi (see 'Next steps' further on this page if you want to connect a different sensor), and run:
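As on other Linux targets, the connection wizard from the Edge Impulse Linux CLI is:

```bash
edge-impulse-linux
```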
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. If you want to switch projects, run the command with --clean.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
With everything set up you can now build your first machine learning model with these tutorials:
Looking to connect different sensors? Our Linux SDK lets you easily send data from any sensor and any programming language (with examples in Node.js, Python, Go and C++) into Edge Impulse.
To run your impulse locally, just connect to your Raspberry Pi again, and run:
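As before, the runner from the Edge Impulse Linux CLI does this:

```bash
edge-impulse-linux-runner
```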
This will automatically compile your model with full hardware acceleration, download the model to your Raspberry Pi, and then start classifying. Our Linux SDK has examples on how to integrate the model with your favourite programming language.
If you have an image model then you can get a peek of what your device sees by being on the same network as your device, and finding the 'Want to see a feed of the camera and live classification in your browser' message in the console. Open the URL in a browser and both the camera feed and the classification are shown:
If you see the following error when trying to deploy a .eim model to your Raspberry Pi:
It likely means you are attempting to deploy a .eim Edge Impulse model file to a 32-bit operating system running on a 64-bit CPU. To check your hardware architecture and OS in Linux, please run the following commands:
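One way to check (the exact commands may differ from the ones in the original guide): `uname -m` reports the architecture the kernel is running on, and `getconf LONG_BIT` reports whether the userland is 32-bit or 64-bit.

```bash
uname -m
getconf LONG_BIT
```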
If you see something like this:
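Illustrative output for the checks above, assuming a 32-bit OS running on a 64-bit CPU:

```
$ uname -m
aarch64
$ getconf LONG_BIT
32
```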
It means that you are running a 32-bit OS on a 64-bit CPU. To run .eim models on aarch64 CPUs, you must use a 64-bit operating system. Please download and install the 64-bit version of Raspberry Pi OS if you see aarch64 when you run uname -m.
The Raspberry Pi 5 offers 2–3× the speed of the previous generation, with silicon designed in-house by Raspberry Pi for the best possible performance. It is a versatile Linux development board with a quad-core processor running at 2.4GHz, a GPIO header to connect sensors, and the ability to easily add an external microphone or camera - and it's fully supported by Edge Impulse. You'll be able to sample raw data, build models, and deploy trained machine learning models directly from the Studio.
In addition to the Raspberry Pi 5 we recommend that you also add a camera and / or a microphone. Most popular USB webcams and the Camera Module work fine on the development board out of the box.
In this documentation, we will detail the steps to set up your Raspberry Pi 5 with the new Bookworm release OS for Edge Impulse. This guide includes headless setup instructions and how to connect to Edge Impulse, along with troubleshooting tips.
You can set up your Raspberry Pi without a screen. To do so:
Download the Raspberry Pi OS - Bookworm Release
Ensure you have the latest Raspberry Pi OS; the new Bookworm release requires Edge Impulse Linux CLI version >= 1.3.0.
You must use a 64-bit OS, as 32-bit OS is no longer supported. The Raspberry Pi 5 uses aarch64, a 64-bit CPU, so if you are installing Raspberry Pi OS for the Raspberry Pi 5, make sure you use the 64-bit version; the Raspberry Pi 5 cannot run armv7 images.
Flash the Raspberry Pi OS Image
Flash the OS image to an SD card using a tool like Balena Etcher.
Prepare the SD Card
When flashing the OS image, access the advanced options menu in the Raspberry Pi Imager to preconfigure your WiFi and enable SSH.
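Once the Pi boots and joins your WiFi, you can reach it over SSH from your computer. A minimal example, assuming the default raspberrypi.local hostname and the username you configured in the Imager (replace both if you chose different values):

```bash
ssh <username>@raspberrypi.local
```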