Raspberry Pi 4
The Raspberry Pi 4 is a versatile Linux development board with a quad-core processor running at 1.5 GHz, a GPIO header to connect sensors, and the ability to easily add an external microphone or camera - and it's fully supported by Edge Impulse. You'll be able to sample raw data, build models, and deploy trained machine learning models directly from the Studio. The Raspberry Pi 4 is available for 35 USD and up from a wide range of distributors, including DigiKey.
In addition to the Raspberry Pi 4, we recommend that you also add a camera and/or a microphone. Most popular USB webcams and the Camera Module work on the development board out of the box.
1. Connecting to your Raspberry Pi
Headless
You can set up your Raspberry Pi without a screen. To do so:
Raspberry Pi OS - Bullseye release
The latest release of Raspberry Pi OS requires Edge Impulse Linux CLI version >= 1.3.0.
Flash the Raspberry Pi OS image to an SD card.
After flashing the OS, find the boot mass-storage device on your computer, and create a new file called wpa_supplicant.conf in the boot drive. Add the following code (replace the fields marked with <> with your WiFi credentials).
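A typical wpa_supplicant.conf for a headless setup looks like this:

```
# Example configuration; adjust the country code and WiFi credentials for your network
ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
update_config=1
country=<Insert 2 letter ISO 3166-1 country code here>

network={
    ssid="<Name of your wireless LAN>"
    psk="<Password for your wireless LAN>"
}
```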
Next, create a new file called ssh in the boot drive. You can leave this file empty.
Plug the SD card into your Raspberry Pi 4, and let the device boot up.
Find the IP address of your Raspberry Pi. You can either do this through the DHCP logs in your router, or by scanning your network from macOS or Linux.
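For example, you can resolve the default raspberrypi.local hostname over mDNS, or look for the Raspberry Pi MAC address prefixes in your ARP table (both approaches assume default settings; the exact output will vary):

```
# Resolve the default hostname via mDNS
ping -c 1 raspberrypi.local
# PING raspberrypi.local (192.168.1.19): 56 data bytes

# Or scan the ARP table for Raspberry Pi MAC address prefixes
arp -a | grep -iE "b8:27:eb|dc:a6:32|e4:5f:01"
```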
Here 192.168.1.19 is your IP address.
Connect to the Raspberry Pi over SSH. Open a terminal window and run:
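```
# Assumes the default 'pi' user; replace the IP address with the one you found above
ssh pi@192.168.1.19
```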
Log in with password raspberry.
With a screen
If you have a screen and a keyboard and mouse attached to your Raspberry Pi:
Flash the Raspberry Pi OS image to an SD card.
Plug the SD card into your Raspberry Pi 4, and let the device boot up.
Connect to your WiFi network.
Click the 'Terminal' icon in the top bar of the Raspberry Pi.
2. Installing dependencies
To set this device up in Edge Impulse, you need to install the Edge Impulse Linux CLI and its dependencies.
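A typical sequence installs Node.js, the SoX and GStreamer packages used for audio and video capture, and the edge-impulse-linux package from npm; the Node.js version below is an assumption, so check the current Edge Impulse requirements:

```
# Node.js (the pinned version may differ from the official instructions)
curl -sL https://deb.nodesource.com/setup_16.x | sudo bash -

# Build tools plus SoX and GStreamer for audio and video capture
sudo apt install -y gcc g++ make build-essential nodejs sox gstreamer1.0-tools gstreamer1.0-plugins-good gstreamer1.0-plugins-base gstreamer1.0-plugins-base-apps

# The Edge Impulse Linux CLI
sudo npm install edge-impulse-linux -g --unsafe-perm
```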
If you have a Raspberry Pi Camera Module, you also need to activate it first. Run the following command:
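```
sudo raspi-config
```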
Use the cursor keys to select and open Interfacing Options, then select Camera and follow the prompt to enable the camera. Then reboot the Raspberry Pi.
Install with Docker
If you want to install Edge Impulse on your Raspberry Pi using Docker, you can run the CLI inside a container.
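The container needs access to the camera and microphone devices as well as the host network. A minimal sketch, assuming an Ubuntu base image:

```
docker run -it --rm --privileged --network host -v /dev:/dev ubuntu:22.04 /bin/bash
```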
Once inside the Docker container, install the audio and video tooling that the CLI relies on:
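```
# Same audio/video tooling as the host install above (package set shown is a typical example)
apt update && apt install -y curl sox gstreamer1.0-tools gstreamer1.0-plugins-good gstreamer1.0-plugins-base gstreamer1.0-plugins-base-apps
```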
and then install Node.js and the Edge Impulse Linux CLI:
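```
# Node.js version is an assumption; check the current Edge Impulse requirements
curl -sL https://deb.nodesource.com/setup_16.x | bash -
apt install -y nodejs
npm install edge-impulse-linux -g --unsafe-perm
```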
You should now be able to run the Edge Impulse CLI tools from the container running on your Raspberry Pi.
Note that this will only work with an external USB camera.
3. Connecting to Edge Impulse
With all software set up, connect your camera or microphone to your Raspberry Pi (see 'Next steps' further on this page if you want to connect a different sensor), and run:
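```
edge-impulse-linux
```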
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. If you want to switch projects, run the command with --clean.
4. Verifying that your device is connected
That's all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
Next steps: building a machine learning model
With everything set up, you can now build your first machine learning model with these tutorials:
Looking to connect different sensors? Our Linux SDK lets you easily send data from any sensor and any programming language (with examples in Node.js, Python, Go and C++) into Edge Impulse.
Deploying back to device
To run your impulse locally, just connect to your Raspberry Pi again, and run:
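```
edge-impulse-linux-runner
```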
This will automatically compile your model with full hardware acceleration, download the model to your Raspberry Pi, and then start classifying. Our Linux SDK has examples on how to integrate the model with your favourite programming language.
Image model?
If you have an image model, you can get a peek at what your device sees: make sure you're on the same network as your device, then look for the 'Want to see a feed of the camera and live classification in your browser' message in the console. Open the URL in a browser to see both the camera feed and the classification results.