Connecting the BrickML to Edge Impulse Studio is done with the edge-impulse-daemon command:
edge-impulse-daemon
The tool will ask for our Edge Impulse Studio email and password. After this the BrickML should be automatically detected, and we will be asked to choose a Studio project we want to use.
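For reference, a minimal command-line sketch of this step, assuming Node.js is already installed (the Edge Impulse CLI is distributed as an npm package):
# Install the Edge Impulse CLI tools (requires Node.js)
npm install -g edge-impulse-cli
# Start the daemon with the BrickML connected over USB;
# it prompts for Edge Impulse credentials and the target project
edge-impulse-daemon
# If the device was previously linked to another project,
# the --clean flag clears the stored configuration first
edge-impulse-daemon --clean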
Once connected, the BrickML will show up in the Devices section of our Edge Impulse Studio project, and it should be ready to be used for data collection and model training.
Note: the steps I will follow in this guide are generic, so it should be easy to apply them to similar projects.
For the printing class, I used a slightly modified G-code file from a previous 3D print and re-played it on the printer. The idle and off labels serve as a baseline, so we can detect when the 3D printer is doing nothing.
The collected samples were split into smaller chunks, and then arranged into Training and Test sets in close to an 80/20 proportion:
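As a side note, if samples are collected or pre-processed outside the Studio, the Edge Impulse CLI uploader can assign them to the two sets automatically. A minimal sketch, assuming the chunks were exported as JSON sample files in the current directory (file names and labels below are placeholders):
# Upload samples with a given label and let the Studio
# split them between Training and Test (roughly 80/20)
edge-impulse-uploader --label printing --category split printing-*.json
# Alternatively, force samples into a specific set
edge-impulse-uploader --label idle --category testing idle-*.json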
The printing and idle / off classes are well separated.
The printing, idle and off states are well separated. We have a small number of idle and off samples overlapping, but this is expected, as the two categories are quite similar.
The only misclassifications occur between the idle and off states.
As the model works as expected, we should try Live classification on newly sampled data from the BrickML device. For this, we first need to connect to the BrickML device, either using edge-impulse-daemon or Web USB. After this, we can start collecting sensor data by hitting the “Start sampling” button with the appropriate parameters:
The deployment is downloaded as a .zip archive containing two files: a signed binary firmware image and an uploader script.
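Before flashing, we can extract and inspect the archive; a quick sketch below (the archive name is a placeholder for the file downloaded from the Studio):
# List and extract the downloaded deployment archive
# (replace the file name with the one you downloaded)
unzip -l brickml-deployment.zip
unzip brickml-deployment.zip -d brickml-deployment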
The new firmware can be uploaded to the BrickML with the ei_uploader.py script, by running the following command:
After the firmware is uploaded, we can run the model on the device with the edge-impulse-run-impulse command.
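A minimal sketch of running the impulse from a terminal, assuming the Edge Impulse CLI is installed and the BrickML is connected over USB:
# Run the deployed impulse and print the classification results
edge-impulse-run-impulse
# Optionally, the --debug flag prints extra debug output
# (flag available in the Edge Impulse CLI)
edge-impulse-run-impulse --debug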
Here is a quick video showing the BrickML in action while running the model: