"Bridging the gap between resource-constrained microcontrollers and larger processors in robotic applications that are based on the Robot Operating System."

They go on to note:

Microcontrollers are used in almost every robotic product. Typical reasons are:

- Hardware access
- Hard, low-latency real-time
- Power saving

So where does AI fit in here? It may seem an unusual approach - taking something that has traditionally been reserved for high-powered processors (running neural networks) and using a tool specifically designed for low-level, memory-constrained devices (MicroROS) - but these are precisely the presuppositions TinyML seeks to challenge. By combining MicroROS and Edge Impulse, the path to creating your own plug-and-play AI-driven peripherals for ROS2 systems becomes much more straightforward. This enables experimentation with a "distributed" approach to AI in robotics, wherein neural networks are run much closer to the sensors, and the central ROS2 computer can enjoy the benefits of model inferences without being bogged down by running many neural networks simultaneously.
- EIClassification: contains a label and a value, like {'label': 'cat', 'value': 0.75}. One classification contains one class name and the probability given to that class by the neural network.
- EIResult: contains multiple classifications - as many as your neural network needs. A full result looks like this: [{'label': 'cat', 'value': 0.75}, {'label': 'dog', 'value': 0.25}].
These definitions live in the ei_interfaces directory, which contains everything you need to build the custom message types.
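Given the examples above, the two definitions presumably look something like this; the label and value field names come straight from the sample outputs, while the classifications field name in EIResult.msg is my assumption:

```
# EIClassification.msg
string label
float32 value

# EIResult.msg ("classifications" is an assumed field name)
EIClassification[] classifications
```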
To add it to your ROS2 system, navigate to ros2_ws/src and paste the ei_interfaces directory inside. Then cd back to your main ros2_ws directory and run colcon build from the terminal.
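The steps above can be sketched as a terminal session. The ros2_ws path comes from the text; the --packages-select flag and the final source line are standard colcon practice rather than something the post spells out:

```shell
cd ~/ros2_ws/src
# copy the ei_interfaces directory in here, then:
cd ~/ros2_ws
colcon build --packages-select ei_interfaces  # or just `colcon build`
source install/setup.bash                     # make the new interfaces visible
```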
You can confirm the message types were added by running the following from the terminal:
ros2 interface list | grep EI
You should see both message types listed:
ei_interfaces/msg/EIClassification
ei_interfaces/msg/EIResult
Next, paste the ei_interfaces directory inside the special extra_packages directory in the MicroROS Arduino library. For me, the path is inside the micro_ros_arduino-2.0.5-humble directory. Then, from the library's root directory, use the docker commands from this part of the MicroROS Arduino readme:
Note the -p flag at the end - it significantly reduces the build time if you specify your target. You can also run the command without this flag to build for all available targets, but it'll take a while.
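For reference, the readme's build command looks roughly like the following; the builder image tag and the portenta-m7 target name are assumptions on my part, so check the readme for your exact platform:

```shell
# Run from the root of the micro_ros_arduino library.
# The trailing -p flag restricts the build to a single target;
# drop it to build every supported board (much slower).
docker run -it --rm -v $(pwd):/project --env MICROROS_LIBRARY_FOLDER=extras \
  microros/micro_ros_static_library_builder:humble -p portenta-m7
```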
Upload the .ino file to your Arduino Portenta, and make sure the .h header file is in the same directory. I won't be writing a line-by-line explanation of the code here, but below is some info on key points that make this all work.
Make sure to change the name of the included Edge Impulse library to the name of your own project:
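For illustration: Edge Impulse's deployed Arduino libraries expose a header following the <project_name>_inferencing.h pattern, so the include ends up looking like this (my_project is a placeholder, not the author's actual project name):

```cpp
// Swap in the header exported by your own Edge Impulse project.
#include <my_project_inferencing.h>  // placeholder name
```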
In the ei_result_publisher file, note that we include the two message types we added before:
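Assuming rosidl's usual generated-C naming (CamelCase message names become snake_case headers), those two includes would look like:

```cpp
#include <ei_interfaces/msg/ei_classification.h>
#include <ei_interfaces/msg/ei_result.h>
```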
EIResult is a sequence (array) of EIClassification messages, and in MicroROS you need to allocate memory for your message when setting everything up. Even if your neural network has more labels than the two I have for this project (human, background), the code will still work fine, as it automatically allocates enough memory for however many labels (and hence classifications) your EIResult message needs to support. You can see the section where the memory is allocated here:
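As a sketch of what that allocation involves: the type and function names below follow rosidl's generated-C conventions, and classifications is an assumed field name - this is not the project's verbatim code:

```cpp
// EI_CLASSIFIER_LABEL_COUNT is provided by the Edge Impulse SDK and
// equals the number of labels the model can output.
ei_interfaces__msg__EIResult msg;
ei_interfaces__msg__EIResult__init(&msg);

// Reserve one EIClassification slot per label:
ei_interfaces__msg__EIClassification__Sequence__init(
    &msg.classifications, EI_CLASSIFIER_LABEL_COUNT);
```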
The msg object is initialized as type:
In the .ino file, you'll see that a lot of the code is taken directly from the Edge Impulse ei_camera example code here. Let's focus on the moment when the ei_impulse_result_t object is transferred to the MicroROS publisher:
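A sketch of that handoff: result.classification[i].label and .value are genuine Edge Impulse SDK fields, while the message layout and the publisher variable are assumptions carried over from above:

```cpp
// Copy each Edge Impulse classification into the ROS2 message,
// then hand the message to the MicroROS publisher.
for (size_t i = 0; i < EI_CLASSIFIER_LABEL_COUNT; i++) {
  rosidl_runtime_c__String__assign(
      &msg.classifications.data[i].label,
      result.classification[i].label);
  msg.classifications.data[i].value = result.classification[i].value;
}
rcl_publish(&publisher, &msg, NULL);
```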