With 2–3× the speed of the previous generation, and featuring silicon designed in-house for the best possible performance, the Raspberry Pi 5 redefines the Raspberry Pi experience. The Pi 5 is a versatile Linux development board with a quad-core processor running at 2.4GHz, a GPIO header to connect sensors, and the ability to easily add an external microphone or camera - and it's fully supported by Edge Impulse. You'll be able to sample raw data, build models, and deploy trained machine learning models directly from the Studio.
In addition to the Raspberry Pi 5, we recommend that you also add a camera and/or a microphone. Most popular USB webcams and the Camera Module work fine on the development board out of the box.
In this documentation, we will detail the steps to set up your Raspberry Pi 5 with the new Bookworm release OS for Edge Impulse. This guide includes headless setup instructions and how to connect to Edge Impulse, along with troubleshooting tips.
**You must use a 64-bit OS (aarch64); a 32-bit OS is not supported**
Raspberry Pi 5 uses aarch64, which is a 64-bit CPU. If you are installing Raspberry Pi OS for the RPi 5, make sure you use the 64-bit version.
You can set up your Raspberry Pi without a screen. To do so:
Download the Raspberry Pi OS - Bookworm Release
Ensure you have the latest Raspberry Pi OS which supports the new Edge Impulse Linux CLI version >= 1.3.0.
Flash the Raspberry Pi OS Image
Flash the OS image to an SD card using a tool like Balena Etcher.
Prepare the SD Card
After flashing, locate the boot mass-storage device on your computer.
Create a new file called `wpa_supplicant.conf` in the boot drive with the following content:
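A typical `wpa_supplicant.conf` for Raspberry Pi OS looks like the following; the country code, SSID and password are placeholders:

```
ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
update_config=1
country=<Two-letter ISO country code>

network={
    ssid="<Name of your WiFi network>"
    psk="<Password of your WiFi network>"
}
```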
Replace the placeholders with your WiFi credentials.
Create an empty file called `ssh` in the boot drive to enable SSH.
Boot the Raspberry Pi
Insert the SD card into your Raspberry Pi 5 and power it on.
Find the IP Address
Locate the IP address of your Raspberry Pi using your router's DHCP logs or a network scanning tool. On macOS or Linux, use:
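For example, if mDNS resolution is available (built into macOS, via Avahi on most Linux distributions) you can ping the default hostname; `raspberrypi` is an assumption and will differ if you chose another hostname when flashing:

```
ping raspberrypi.local
```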
This will display the IP address, e.g., `192.168.1.19`.
Connect via SSH
Open a terminal and connect to the Raspberry Pi:
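For example, assuming the default `pi` user and the IP address found above:

```
ssh pi@192.168.1.19
```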
Log in with the default password `raspberry`.
If you have a screen and a keyboard/mouse attached to your Raspberry Pi:
Flash the Raspberry Pi OS Image
Flash the OS image to an SD card as described above.
Boot the Raspberry Pi
Insert the SD card into your Raspberry Pi 5 and power it on.
Connect to WiFi
Use the graphical interface to connect to your WiFi network.
Open a Terminal
Click the 'Terminal' icon in the top bar of the Raspberry Pi desktop.
To set this device up in Edge Impulse, run the following commands:
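The exact commands can change between releases, so treat the following as a sketch: it assumes Node.js from NodeSource, the sox/GStreamer audio and video dependencies, and the edge-impulse-linux npm package.

```
curl -sL https://deb.nodesource.com/setup_20.x | sudo bash -
sudo apt install -y gcc g++ make build-essential nodejs sox gstreamer1.0-tools gstreamer1.0-plugins-good gstreamer1.0-plugins-base gstreamer1.0-plugins-base-apps
sudo npm install edge-impulse-linux -g --unsafe-perm
```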
Then to update npm packages:
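For example, one way to bring the globally installed CLI up to date (assuming the npm-based install above):

```
sudo npm install -g edge-impulse-linux@latest --unsafe-perm
```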
If you have a Raspberry Pi Camera Module, you also need to activate it first. Run the following command:
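The interfacing options mentioned below are part of the standard Raspberry Pi configuration tool:

```
sudo raspi-config
```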
Use the cursor keys to select and open Interfacing Options, then select Camera, and follow the prompt to enable the camera. Reboot the Raspberry Pi.
If you want to install Edge Impulse on your Raspberry Pi using Docker, run the following commands:
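The exact image and flags depend on your setup; a minimal sketch that starts an Ubuntu container with the host's device nodes (for a USB camera or microphone) and network passed through looks like this:

```
docker run -it --rm --privileged --network host -v /dev:/dev ubuntu:22.04 /bin/bash
```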
Once in the Docker container, run:
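Inside the container you need the same tooling as on the host; for example (Node.js from NodeSource and the edge-impulse-linux npm package are assumptions):

```
apt update && apt install -y curl gcc g++ make build-essential sox gstreamer1.0-tools gstreamer1.0-plugins-good gstreamer1.0-plugins-base gstreamer1.0-plugins-base-apps
curl -sL https://deb.nodesource.com/setup_20.x | bash -
apt install -y nodejs
npm install edge-impulse-linux -g --unsafe-perm
```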
With all software set up, connect your camera or microphone to your Raspberry Pi (see 'Next steps' further on this page if you want to connect a different sensor).
To connect your Raspberry Pi 5 to Edge Impulse, run the following command:
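With the CLI installed, the device wizard is started with:

```
edge-impulse-linux
```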
You can now sample raw data, build models, and deploy trained machine learning models directly from the Studio. Please let us know on the forum (forum.edgeimpulse.com) if you have any questions or need further assistance.
Wrong OS bits
If you see the following error when trying to deploy a .eim model to your Raspberry Pi:
It likely means you are attempting to deploy a .eim Edge Impulse model file to a 32-bit operating system running on a 64-bit CPU. To check your hardware architecture and OS in Linux, please run the following commands:
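For example (`getconf LONG_BIT` is one way to read the word size of the running userspace):

```
# CPU architecture: aarch64 indicates a 64-bit capable CPU
uname -m
# Word size of the running OS/userspace: 64 or 32
getconf LONG_BIT
```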
You can use your Intel or M1-based Mac computer as a fully-supported development environment for Edge Impulse for Linux. This lets you sample raw data, build models, and deploy trained machine learning models directly from the Studio. If you have a MacBook, the webcam and microphone of your system are automatically detected and can be used to build models.
To connect your Mac to Edge Impulse:
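First install the prerequisites. The CLI needs Node.js; installing it through Homebrew is one option:

```
brew install node
```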
Last, install the Edge Impulse CLI:
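Assuming npm from the Node.js install above, the CLI for Edge Impulse for Linux comes from the edge-impulse-linux package:

```
npm install edge-impulse-linux -g
```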
Problems installing the CLI?
See the Installation and troubleshooting guide.
With the software installed, open a terminal window and run:
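As on the other platforms, this is the device wizard:

```
edge-impulse-linux
```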
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. If you want to switch projects, run the command with `--clean`.
That's all! Your Mac is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
With everything set up you can now build your first machine learning model with these tutorials:
Looking to connect different sensors? Our Linux SDK lets you easily send data from any sensor and any programming language (with examples in Node.js, Python, Go and C++) into Edge Impulse.
To run your impulse locally, just open a terminal and run:
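Using the Linux runner from the same CLI package:

```
edge-impulse-linux-runner
```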
This will automatically compile your model with full hardware acceleration, download the model to your Mac, and then start classifying. Our Linux SDK has examples on how to integrate the model with your favourite programming language.
If you have an image model then you can get a peek of what your device sees by being on the same network as your device, and finding the 'Want to see a feed of the camera and live classification in your browser' message in the console. Open the URL in a browser and both the camera feed and the classification are shown:
You can use your Linux x86_64 device or computer as a fully-supported development environment for Edge Impulse for Linux. This lets you sample raw data, build models, and deploy trained machine learning models directly from the Studio. If you have a webcam and microphone plugged into your system, they are automatically detected and can be used to build models.
Instruction set architectures
If you are not sure about your instruction set architecture, use:
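The machine hardware name reported by `uname` identifies the architecture (e.g. x86_64, aarch64 or armv7l):

```
uname -m
```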
To set this device up in Edge Impulse, run the following commands:
Ubuntu/Debian:
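As a sketch (the Node.js version and dependency list may differ between CLI releases):

```
curl -sL https://deb.nodesource.com/setup_20.x | sudo bash -
sudo apt install -y gcc g++ make build-essential nodejs sox gstreamer1.0-tools gstreamer1.0-plugins-good gstreamer1.0-plugins-base gstreamer1.0-plugins-base-apps
sudo npm install edge-impulse-linux -g --unsafe-perm
```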
With all software set up, connect your camera or microphone to your operating system (see 'Next steps' further on this page if you want to connect a different sensor), and run:
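This starts the device wizard:

```
edge-impulse-linux
```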
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. If you want to switch projects, run the command with `--clean`.
That's all! Your machine is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
With everything set up you can now build your first machine learning model with these tutorials:
Counting objects using FOMO

Looking to connect different sensors? Our [Linux SDK](/tools/edge-impulse-for-linux/) lets you easily send data from any sensor and any programming language (with examples in Node.js, Python, Go and C++) into Edge Impulse.
To run your impulse locally, run the following on your Linux platform:
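Using the Linux runner:

```
edge-impulse-linux-runner
```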
This will automatically compile your model with full hardware acceleration, download the model to your local machine, and then start classifying. Our Linux SDK has examples on how to integrate the model with your favourite programming language.
If you have an image model then you can get a peek of what your device sees by being on the same network as your device, and finding the 'Want to see a feed of the camera and live classification in your browser' message in the console. Open the URL in a browser and both the camera feed and the classification are shown:
The Raspberry Pi 4 is a versatile Linux development board with a quad-core processor running at 1.5GHz, a GPIO header to connect sensors, and the ability to easily add an external microphone or camera - and it's fully supported by Edge Impulse. You'll be able to sample raw data, build models, and deploy trained machine learning models directly from the Studio.
In addition to the Raspberry Pi 4, we recommend that you also add a camera and/or a microphone. Most popular USB webcams and the Camera Module work fine on the development board out of the box.
You can set up your Raspberry Pi without a screen. To do so:
Raspberry Pi OS - Bullseye release
The latest release of the Raspberry Pi OS requires Edge Impulse Linux CLI version >= 1.3.0.
Flash the Raspberry Pi OS image to an SD card.
**You must use a 64-bit OS with aarch64 and a 32-bit OS with armv7l**
Raspberry Pi 4 uses aarch64, which is a 64-bit CPU. If you are installing Raspberry Pi OS for the RPi 4, make sure you use the 64-bit version.
After flashing the OS, find the `boot` mass-storage device on your computer, and create a new file called `wpa_supplicant.conf` in the `boot` drive. Add the following code:
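A typical `wpa_supplicant.conf` for Raspberry Pi OS looks like this:

```
ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
update_config=1
country=<Two-letter ISO country code>

network={
    ssid="<Name of your WiFi network>"
    psk="<Password of your WiFi network>"
}
```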
(Replace the fields marked with `<>` with your WiFi credentials.)
Next, create a new file called `ssh` in the `boot` drive. You can leave this file empty.
Plug the SD card into your Raspberry Pi 4, and let the device boot up.
Find the IP address of your Raspberry Pi. You can either do this through the DHCP logs in your router, or by scanning your network. E.g. on macOS and Linux via:
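For example, you can search the ARP table for a Raspberry Pi MAC address prefix; `dc:a6:32` below matches many Raspberry Pi 4 boards, while older models use `b8:27:eb`, so adjust the prefix as needed:

```
arp -na | grep -i "dc:a6:32"
```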
Here `192.168.1.19` is your IP address.
Connect to the Raspberry Pi over SSH. Open a terminal window and run:
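For example, with the IP address found above:

```
ssh pi@192.168.1.19
```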
Log in with password `raspberry`.
If you have a screen and a keyboard / mouse attached to your Raspberry Pi:
Flash the Raspberry Pi OS image to an SD card.
Plug the SD card into your Raspberry Pi 4, and let the device boot up.
Connect to your WiFi network.
Click the 'Terminal' icon in the top bar of the Raspberry Pi desktop.
To set this device up in Edge Impulse, run the following commands:
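As on the Raspberry Pi 5, the sketch below assumes Node.js from NodeSource and the edge-impulse-linux npm package; check the Edge Impulse Linux CLI documentation for the exact, up-to-date commands:

```
curl -sL https://deb.nodesource.com/setup_20.x | sudo bash -
sudo apt install -y gcc g++ make build-essential nodejs sox gstreamer1.0-tools gstreamer1.0-plugins-good gstreamer1.0-plugins-base gstreamer1.0-plugins-base-apps
sudo npm install edge-impulse-linux -g --unsafe-perm
```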
If you have a Raspberry Pi Camera Module, you also need to activate it first. Run the following command:
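The interfacing options below are part of the standard Raspberry Pi configuration tool:

```
sudo raspi-config
```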
Use the cursor keys to select and open Interfacing Options, and then select Camera and follow the prompt to enable the camera. Then reboot the Raspberry Pi.
If you want to install Edge Impulse on your Raspberry Pi using Docker you can run the following commands:
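As a sketch, assuming an Ubuntu base image with the host's device nodes and network passed through:

```
docker run -it --rm --privileged --network host -v /dev:/dev ubuntu:22.04 /bin/bash
```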
Once in the Docker container, run:
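For example, install the build tools, audio/video dependencies and Node.js (the NodeSource version here is an assumption):

```
apt update && apt install -y curl gcc g++ make build-essential sox gstreamer1.0-tools gstreamer1.0-plugins-good gstreamer1.0-plugins-base gstreamer1.0-plugins-base-apps
curl -sL https://deb.nodesource.com/setup_20.x | bash -
apt install -y nodejs
```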
and
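then install the Edge Impulse for Linux CLI from npm:

```
npm install edge-impulse-linux -g --unsafe-perm
```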
You should now be able to run Edge Impulse CLI tools from the container running on your Raspberry Pi.
Note that this will only work with an external USB camera.
With all software set up, connect your camera or microphone to your Raspberry Pi (see 'Next steps' further on this page if you want to connect a different sensor), and run:
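This starts the device wizard:

```
edge-impulse-linux
```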
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. If you want to switch projects, run the command with `--clean`.
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
With everything set up you can now build your first machine learning model with these tutorials:
Looking to connect different sensors? Our Linux SDK lets you easily send data from any sensor and any programming language (with examples in Node.js, Python, Go and C++) into Edge Impulse.
To run your impulse locally, just connect to your Raspberry Pi again, and run:
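Using the Linux runner:

```
edge-impulse-linux-runner
```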
This will automatically compile your model with full hardware acceleration, download the model to your Raspberry Pi, and then start classifying. Our Linux SDK has examples on how to integrate the model with your favourite programming language.
If you have an image model then you can get a peek of what your device sees by being on the same network as your device, and finding the 'Want to see a feed of the camera and live classification in your browser' message in the console. Open the URL in a browser and both the camera feed and the classification are shown:
If you see the following error when trying to deploy a .eim model to your Raspberry Pi:
It likely means you are attempting to deploy a .eim Edge Impulse model file to a 32-bit operating system running on a 64-bit CPU. To check your hardware architecture and OS in Linux, please run the following commands:
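For example (`getconf LONG_BIT` is one way to check the word size of the running userspace):

```
# CPU architecture: aarch64 indicates a 64-bit capable CPU
uname -m
# Word size of the running OS/userspace: 64 or 32
getconf LONG_BIT
```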
If you see something like this:
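For example, assuming the commands sketched above:

```
aarch64
32
```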
It means that you are running a 32-bit OS on a 64-bit CPU. To run .eim models on aarch64 CPUs, you must use a 64-bit operating system. Please download and install the 64-bit version of Raspberry Pi OS if you see `aarch64` when you run `uname -m`.
The AM62X Starter Kit from Texas Instruments is a development platform for the AM62X with quad-core Arm Cortex-A53s at 1.4GHz. This general-purpose microprocessor supports 1080p displays through HDMI, 5MP camera input through MIPI-CSI2 (including support for the Raspberry Pi camera), and multichannel audio. The Linux distribution for this device comes with TensorFlow Lite, ONNX Runtime, OpenCV, and GStreamer, all with Python bindings and C++ libraries.
Please visit Texas Instruments' website for instructions on how to use this board with Edge Impulse.