Vision-based Parking System — project workflow:
- Parking zone setup
- Hardware
- Taking photos
- Uploading data
- Labeling
- Train/test data split
- Learning blocks
- Saving parameters
- Generating features
- NN settings & results
- Model testing
SSH
From your PC/laptop, SSH into the Raspberry Pi and simply type edge-impulse-linux-runner (you can add --clean to let you select your project if you've tried a different project in the past). Log in to your account, then choose your project. This process downloads model.eim, which is built specifically for the aarch64 architecture (Pi 5, ARM64). During the process, the console displays the path where model.eim has been downloaded. For example, in the image below, it shows the file located at /home/pi/.ei-linux-runner/models/624749/v5
Edge Impulse Runner
Copy the model file to your home directory so it is easier to reference:
cp -v model.eim /home/pi
Now the model is ready to be invoked from a high-level language such as Python. To verify that the model works, re-run the EI Runner with a camera attached to the Raspberry Pi. You can watch the camera feed and live inference in a browser at the Pi's local IP address on port 4912. Run this command once again: edge-impulse-linux-runner
Live inferencing
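As a starting point for Python, the sketch below classifies a single still image with the downloaded model. It assumes the Edge Impulse Linux Python SDK (`pip3 install edge_impulse_linux`) and OpenCV are installed; the `confident_boxes` helper and the 0.5 confidence threshold are illustrative choices, not part of the project files:

```python
import sys

def confident_boxes(bounding_boxes, min_confidence=0.5):
    """Keep only detections at or above the confidence threshold."""
    return [b for b in bounding_boxes if b.get("value", 0.0) >= min_confidence]

def main(model_path, image_path):
    # Imported here so the helper above stays usable without the SDK installed.
    import cv2
    from edge_impulse_linux.image import ImageImpulseRunner

    with ImageImpulseRunner(model_path) as runner:
        runner.init()  # loads model metadata (input size, labels, ...)
        img = cv2.imread(image_path)
        # Resize/crop the frame into the feature vector the model expects.
        features, cropped = runner.get_features_from_image(img)
        res = runner.classify(features)
        boxes = confident_boxes(res["result"].get("bounding_boxes", []))
        print(f"{len(boxes)} object(s) detected")
        for b in boxes:
            print(f'  {b["label"]} ({b["value"]:.2f}) at x={b["x"]}, y={b["y"]}')

if __name__ == "__main__":
    main(sys.argv[1], sys.argv[2])
```

Run it as `python3 classify_image.py /home/pi/model.eim photo.jpg` (file names here are placeholders).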
Code Screenshot
Run the program (parkingmeter1.py) with the following command:
python3 parkingmeter1.py <path to modelfile>/model.eim
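The full parkingmeter1.py is not reproduced here, but a camera-based inference loop for this kind of parking counter typically has the shape below. The zone capacity (`TOTAL_SPACES`), the `"car"` label, the confidence threshold, and the `free_spaces` helper are assumptions for illustration:

```python
import sys

TOTAL_SPACES = 10  # assumed capacity of the monitored parking zone

def free_spaces(num_cars, total=TOTAL_SPACES):
    """Free spaces left in the zone, never below zero."""
    return max(total - num_cars, 0)

def main(model_path):
    # Hardware-dependent imports kept inside main().
    import cv2
    from edge_impulse_linux.image import ImageImpulseRunner

    with ImageImpulseRunner(model_path) as runner:
        runner.init()
        cap = cv2.VideoCapture(0)  # first attached camera
        try:
            while True:
                ok, frame = cap.read()
                if not ok:
                    break
                features, cropped = runner.get_features_from_image(frame)
                res = runner.classify(features)
                cars = [b for b in res["result"].get("bounding_boxes", [])
                        if b["label"] == "car" and b["value"] >= 0.5]
                print(f"cars: {len(cars)}, free: {free_spaces(len(cars))}")
                cv2.imshow("parking", cropped)  # needs a monitor on the Pi
                if cv2.waitKey(1) & 0xFF == ord("q"):
                    break
        finally:
            cap.release()
            cv2.destroyAllWindows()

if __name__ == "__main__":
    main(sys.argv[1])
```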
To run the program (parkingmeter2.py) using a video file as input (e.g., video.mp4), add the path to the video file when executing the program:
python3 parkingmeter2.py <path to modelfile>/model.eim <path to videofile>/video.mp4
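The optional video-file argument shown above can be handled as sketched below. The `parse_args` helper and the fallback to camera index 0 are illustrative assumptions, not taken from parkingmeter2.py itself:

```python
import sys

def parse_args(argv):
    """Return (model_path, video_path); video_path is None for live camera."""
    if len(argv) < 2:
        raise SystemExit("usage: python3 parkingmeter2.py <model.eim> [video.mp4]")
    model = argv[1]
    video = argv[2] if len(argv) > 2 else None
    return model, video

def main(argv):
    import cv2
    from edge_impulse_linux.image import ImageImpulseRunner

    model, video = parse_args(argv)
    # Fall back to the first camera when no video file is given.
    cap = cv2.VideoCapture(video if video is not None else 0)
    with ImageImpulseRunner(model) as runner:
        runner.init()
        while True:
            ok, frame = cap.read()
            if not ok:
                break  # end of file or camera error
            features, _ = runner.get_features_from_image(frame)
            res = runner.classify(features)
            print(res["result"].get("bounding_boxes", []))
    cap.release()

if __name__ == "__main__":
    main(sys.argv)
```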
Note: For the video/camera capture display, you cannot use a headless setup from a PC/laptop; instead, connect a monitor directly to the Raspberry Pi to view the output.

Check out our demo video: