Use an Arduino Nicla Sense ME and Edge Impulse model to determine your rowing cadence and provide feedback via the IoT Remote app.
Created By: Justin Lutz
Public Project Link: https://studio.edgeimpulse.com/public/90788/latest
https://github.com/jlutzwpi/Arduin-Row/tree/main
The sport of rowing, even on a rowing machine, is a technical one. Most people can hop on a rowing machine and start rowing, but may not be generating the most power possible. Many think that rowing involves pulling the handle with your hands, arms, and back, but in order to generate the most power you actually have to push with your legs. That is where you can generate the fastest times on the rowing machine.
Image source: British Rowing Technique - British Rowing
However, knowing whether you are doing it correctly often requires a one-on-one session with a coach who can evaluate your form and provide suggestions.
Using the power of accelerometer data on an Arduino board and Edge Impulse, I was able to make a virtual "Rowing Coach" that, based on the rower's tempo (and acceleration from the start of the stroke), can offer feedback. It can also offer feedback based on how the rowing handle moves through the stroke. This is to ensure the rower is keeping the handle level, and isn't raising or lowering the handle too much during the stroke, which can waste energy and reduce power. This feedback is offered through a chat-based feature of the Arduino IoT Remote app. Since I am using the Nicla Sense ME board, I am also reading and plotting estimated CO2 (eCO2) values to show how the CO2 level in the air changes while you work out.
This project went through multiple variations (more to come on that later), but I ultimately settled on using the Nicla Sense ME as a shield on the Arduino MKR WiFi 1010. The Nicla Sense only comes with BLE, so using it as a shield allowed me to access the WiFi of the MKR board as well as the Arduino IoT Remote dashboard, which offers a quick, easy, and slick app to link to the board. In order to turn the Nicla Sense into a shield, some soldering is involved. Arduino has a good tutorial on it here.
I used Edge Impulse to generate my TinyML model to predict the type of rowing I was doing as well as anomaly detection to determine if the rowing handle placement was correct.
Using a quick Arduino sketch loaded onto the Nicla Sense ME, I used the Edge Impulse CLI to read the accelerometer data directly into my Edge Impulse project, following their Data Forwarder example. The sketch below is how I read the accelerometer data into Edge Impulse from the Nicla.
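The original sketch isn't reproduced here, but the gist is simply to stream comma-separated accelerometer readings over serial at a fixed rate so the Data Forwarder can pick them up. The minimal sketch below illustrates the idea; it assumes the Arduino_BHY2 library and a 100 Hz sample rate, and the exact code I used may differ.

```cpp
// Minimal Data Forwarder sketch (assumption: Arduino_BHY2 library, 100 Hz sampling).
// Prints one comma-separated accelerometer sample per line so the
// edge-impulse-data-forwarder CLI can ingest it.
#include "Arduino_BHY2.h"

SensorXYZ accel(SENSOR_ID_ACC);

const unsigned long INTERVAL_MS = 10;  // 100 Hz
unsigned long lastSample = 0;

void setup() {
  Serial.begin(115200);
  while (!Serial);
  BHY2.begin();
  accel.begin();
}

void loop() {
  BHY2.update();
  if (millis() - lastSample >= INTERVAL_MS) {
    lastSample = millis();
    // One line per sample: x,y,z (raw sensor counts)
    Serial.print(accel.x()); Serial.print(",");
    Serial.print(accel.y()); Serial.print(",");
    Serial.println(accel.z());
  }
}
```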
So, a quick word of caution here: I started this project intending to just use the Nicla Sense ME and develop a BLE app using MIT App Inventor. However, I found out that with the 64 kB RAM limitation, I was running out of memory running my 20 kB model on the Nicla Sense ME (I believe this is due to additional packages being loaded into RAM on the Nicla, reducing the available memory). Using the Nicla Sense ME as a shield on the MKR WiFi 1010, I was able to run my model without memory issues. BUUUT, when used as a shield, the accelerometer frequency maxes out at 10 Hz (it took me a while to figure this out), so I had to downsample all of the data that I collected using just the Nicla Sense ME from 100 Hz to 10 Hz (and also ensure that the orientation of the Nicla remained the same). This was frankly a nightmare that took me a while to figure out.
I collected about 18 minutes of data for 3 states: easy, low strokes per minute (spm), and high spm, divided between training and test data:
Once the data was collected, I set up my impulse:
I downsampled the collected data to 10 Hz to match the output of the Nicla Sense ME as a shield. I kept the window around 2000 ms and didn't change the window step size. On the Spectral Features tab, I changed the Scale Axes to 0.001 as I seemed to get better results with that (it was also recommended on a prior project by the Edge Impulse team).
Next, it was on to training the model:
With the data that I had, there was pretty good clustering between the classes. Once I had the model trained, I went to the Anomaly Detection tab and selected the X-axis to determine if the rowing handle is remaining level throughout the stroke.
After that I deployed the model to an Arduino library. You can find my public project here if you want to look at the data.
I then added the zipped Arduino library to my application code in the Arduino IDE by going to Sketch > Include Library > Add .ZIP Library...
I started my application sketch from the Arduino Web Editor since I would be linking my project to the Arduino IoT Remote app. However, since the IAQ and eCO2 values won't be read correctly unless you make some changes to the Nicla library (I've documented that here), I had to export from the web editor to my local Arduino IDE so I could use the edited Arduino_BHY2Host library.
That being said, I really like the ease of use of the IoT Cloud interface. I defined a couple of variables, air_quality and inference, and the default sketch is auto-populated. In the Dashboards tab of the IoT Cloud, I created a chat-like interface that I called "Coach's Orders". This gives you feedback based on what your rowing stroke indicates. I also created a graph that shows the CO2 levels being read from the Nicla. The reason for that data is to show how working out affects the CO2 levels in the room. If you are working hard in a confined space and the CO2 levels rise to a dangerous level, you might want to get some ventilation or take a break.
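For reference, the Web Editor generates a thingProperties.h for those cloud variables automatically; a rough outline of what it contains is shown below. The Thing ID, credentials, variable types, and update policy here are assumptions for illustration only, since the real file is created for you by the IoT Cloud.

```cpp
// Rough outline of the auto-generated thingProperties.h (illustrative only;
// the Thing ID and WiFi credentials below are placeholders that normally come
// from your own Thing and arduino_secrets.h).
#include <ArduinoIoTCloud.h>
#include <Arduino_ConnectionHandler.h>

const char THING_ID[] = "your-thing-id";      // placeholder
const char SSID[]     = "your-wifi-ssid";     // placeholder
const char PASS[]     = "your-wifi-password"; // placeholder

float  air_quality;   // eCO2 value plotted on the dashboard
String inference;     // coaching message shown in the "Coach's Orders" widget

void initProperties() {
  ArduinoCloud.setThingId(THING_ID);
  ArduinoCloud.addProperty(air_quality, READ, ON_CHANGE, NULL);
  ArduinoCloud.addProperty(inference, READ, ON_CHANGE, NULL);
}

WiFiConnectionHandler ArduinoIoTPreferredConnection(SSID, PASS);
```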
Once I had the Dashboard set up and the variables defined, it was really just a matter of adding in the Edge Impulse inference logic to my Arduino sketch. Here is the main loop; the full code base can be seen in the code section.
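The full loop lives in the code section; the sketch below is only a simplified illustration of the pattern, assuming the generated library header name, the 10 Hz shield sampling rate, and a plain buffer-filling approach rather than my exact code.

```cpp
// Simplified illustration of the inference loop (assumptions: the generated
// library header name and the way the window buffer is filled).
#include <Arduin-Row_inferencing.h>   // header exported by Edge Impulse (name assumed)
#include "Arduino_BHY2Host.h"

SensorXYZ accel(SENSOR_ID_ACC);
static float features[EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE];
static size_t feature_ix = 0;

void setup() {
  Serial.begin(115200);
  BHY2Host.begin(false, NICLA_AS_SHIELD);   // Nicla Sense ME mounted as a shield
  accel.begin();
}

void loop() {
  BHY2Host.update();

  // Append one 3-axis reading; at 10 Hz a 2000 ms window needs 20 readings
  features[feature_ix++] = accel.x();
  features[feature_ix++] = accel.y();
  features[feature_ix++] = accel.z();

  if (feature_ix >= EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE) {
    signal_t signal;
    numpy::signal_from_buffer(features, EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE, &signal);

    ei_impulse_result_t result = { 0 };
    if (run_classifier(&signal, &result, false) == EI_IMPULSE_OK) {
      // The highest-scoring class drives the "Coach's Orders" message,
      // and result.anomaly flags an uneven handle path.
    }
    feature_ix = 0;
  }
  delay(100);   // ~10 Hz sampling when the Nicla is used as a shield
}
```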
Once the coding was complete, I loaded the sketch with my code on the MKR Wifi 1010. I then put the board into a breadboard and taped it to the rowing handle:
The board can be powered either through USB or via the JST connector with a 3.7V LiPo battery. I then hopped on the rowing machine and varied the pace. This is what I saw on the app:
And a video with an overlay of the app with a little commentary from yours truly is up at the top of this post!
The model inference mapped pretty closely to what I was doing. If I was going light and not putting in much effort, I would get a "Push with your legs!" command. If I started to push harder but kept the tempo low, I would get a "Good power at low strokes per minute!" and if I went all out, it would say "Keep the pace up. High strokes per minute!" If I altered the height of the handle on the pull or on the return, "Keep the handle level!" would be added to the command. You can see that as I picked up the pace the CO2 values rose as well. I would be interested to see what values it reads if I'm rowing for more than just a minute.
This was a good project, and like the others I've done, had hiccups along the road that I had to overcome. I completely pivoted from a Nicla Sense ME / BLE app solution to using the Nicla as a shield on the MKR WiFi 1010 and using the Arduino IoT Cloud app as the final implementation. I spent a lot of hours combing through message boards on why I couldn't run the Edge Impulse model on the Nicla (out of memory) and why my model wasn't working with the Nicla as a shield (when used as a shield, the accelerometer frequency drops from 100 Hz to 10 Hz). Hopefully this project helps you avoid some of the traps I fell into. Happy hacking!
A wearable Nicla Sense ME that can measure both the environment, and your outdoor activities using machine learning.
Created By: Zalmotek
Public Project Links:
GitHub Repository:
https://github.com/Zalmotek/edge-impulse-arduino-k-way-outdoor-activity-tracker
Hiking is a great way to get outdoors and enjoy some fresh air. However, keeping track of your progress can be challenging, and that's where an outdoor activity tracker comes in handy. A hiking wearable device provides some valuable functions that can make your hike more enjoyable and safe. It can track things like how many steps you've taken, your walking speed, and even the weather conditions.
In this tutorial, we'll show you how to build a smart hiking wearable using the Arduino Nicla Sense ME board paired with the weather-resistant K-Way jacket.
We’ll present the following use cases for the Arduino Nicla Sense ME board:
Weather prediction - The wearable will be able to predict weather changes using the onboard pressure sensor and AI. By monitoring the atmospheric pressure, the tracker can notify you when a storm is approaching or when conditions are ripe for favorable weather. This information can be helpful in deciding whether to push on with your hike or turn back.
Activity tracking - The wearable will be able to track your steps and identify walking, climbing, or breaks taken during the hike.
Data gathering for ML - The Arduino Nicla Sense ME will send motion and environmental data to another device over a Bluetooth connection and the data will be stored in the Arduino IoT Cloud for future processing.
We'll use the Edge Impulse platform to train Machine Learning models using the data from the sensors, the Arduino IDE to program the Nicla Sense ME board, and the Arduino IoT Cloud to store data and visualize the metrics. By the end of this tutorial, you'll have a working prototype that you can take with you on your next hike!
Arduino Nicla Sense ME
LiPo battery (3.7V, 200mA)
Micro USB cable
Enclosure
K-Way jacket
Edge Impulse account
Arduino IDE
Arduino IoT Cloud account
The Arduino Nicla Sense ME is a tiny and robust development board that is specifically designed for wearable applications. It has several of Bosch Sensortec's cutting-edge sensors on board, including an accelerometer, gyroscope, magnetometer, and environmental monitoring sensors. In addition, the board has an RGB LED that can be used for visual feedback and it can be powered by a LiPo battery. Furthermore, its compact form factor, high computing power, and low power consumption make it an ideal choice for edge Machine Learning applications.
Barometric pressure is used to forecast short-term weather changes so, for training the weather prediction model, we will use the digital onboard BMP390 low-power and low-noise 24-bit absolute barometric pressure sensor. This high-performance sensor is able to detect barometric pressure between 300 and 1250 hPa and can even be used for accurate altitude tracking applications.
For training the climbing detection model, we will use the onboard BHI260AP self-learning AI smart sensor with integrated 6-axis IMU (3-Axis Accelerometer + 3-Axis Gyroscope) together with the BMM150 3-axis digital geomagnetic sensor.
Housing your wearables in an enclosure is necessary because it protects the electronics from liquids or dust, as well as allows you to attach them securely onto clothing. In this project, we will be using a plastic enclosure for our Arduino Nicla Sense ME which features a hole for the USB port so that we can easily program the board.
In order to use the Edge Impulse platform, you will need to create an account. Once you have done so, log in and click on the "New Project" button. Enter a name for it, then select "Create Project". You should now be redirected to the project main page. Here, you will be able to configure the settings for your project, as well as add and train machine learning models.
The first step when designing a Machine Learning model is data collection, and Edge Impulse provides a straightforward method of doing this through their Data Forwarder, which can collect data from the device over a serial connection and send it to the Edge Impulse platform through their ingestion service. To use the Data Forwarder, install the Edge Impulse CLI following the steps from here.
To get started, you'll need to connect the Nicla Sense ME to your computer using a micro USB cable. Once it's connected, open up the Arduino IDE and go to the Board Manager (under Tools > Board) to install the board support package (Arduino Mbed OS Nicla Boards).
Next, go to Tools > Board > Arduino Mbed OS Nicla Boards and select the Nicla Sense ME board.
Download the Edge Impulse ingestion sketch from here and upload it to your board.
We will collect data for three classes:
Drop - This class will be used to detect bad weather conditions. A quick drop in air pressure indicates the arrival of a low-pressure system, in which there is an insufficient force to push clouds or storms away. Cloudy, wet, or windy weather is connected with low-pressure systems, as explained here.
Rise - This class will be used to detect good weather conditions. A sharp rise in atmospheric pressure drives the rainy weather away, clearing the sky and bringing in cold, dry air, as explained here.
Normal - This class will be used to detect stable weather conditions.
In the Arduino sketch you'll find the ei_printf function, which sends data through a serial connection to your computer, which then forwards it to Edge Impulse. Depending on which class you want to collect data for, you'll have to uncomment the corresponding line of code from the code snippet below. Since collecting enough real weather data for training the model would take a lot of time and is weather-dependent, for the purpose of this tutorial we will simulate the Rise and Drop classes using the barometerValueHigh() and barometerValueLow() functions, which generate arbitrary data based on an initial reading of the real measured pressure. To collect data for the Normal class, uncomment the barometer.value() function.
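That snippet isn't shown inline here, so the fragment below sketches its structure. The bodies of the simulation functions and the serial output format are assumptions for illustration, not the exact ingestion sketch.

```cpp
// Illustrative fragment of the ingestion sketch being described (assumption:
// the simulated-value functions drift from an initial real pressure reading).
#include "Arduino_BHY2.h"

Sensor barometer(SENSOR_ID_BARO);
float baseline = 0;

float barometerValueLow()  { baseline -= 0.05; return baseline; }  // simulated "Drop"
float barometerValueHigh() { baseline += 0.05; return baseline; }  // simulated "Rise"

void setup() {
  Serial.begin(115200);
  BHY2.begin();
  barometer.begin();
  baseline = barometer.value();   // start from the real measured pressure
}

void loop() {
  BHY2.update();
  float value;
  // Uncomment exactly one line, matching the class you are sampling:
  // value = barometerValueHigh();   // Rise
  // value = barometerValueLow();    // Drop
  value = barometer.value();         // Normal
  Serial.println(value);             // picked up by edge-impulse-data-forwarder
  delay(100);
}
```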
From a terminal, run:
edge-impulse-data-forwarder
This will launch a wizard that will prompt you to log in and select an Edge Impulse project. You will also have to name the device and the axes of your sensor (in this case our only axis is barometer). You should now see Nicla Sense in the Devices menu on Edge Impulse.
With the data forwarder configured, we can now start collecting training data. Go to Edge Impulse > Data acquisition > Record new data, write the name of the class in the Label prompt, and click on Start Sampling. Each sample is 10s long and you should collect at least 2 minutes of data for each class. For the Rise and Drop classes, each time you collect a new sample you’ll have to press the reset button on the Nicla Sense board to reset the readings.
Your collected samples should look something like this:
Now that you have enough training data, you can design the impulse. Go to Impulse design > Create impulse on Edge Impulse and add a Spectral Analysis processing block and a Classification (Keras) learning block.
An impulse consists of a signal processing block used to extract features from the raw input data, and a learning block which uses these features to classify new data. The Spectral Analysis signal processing block applies a filter to remove noise, performs spectral analysis on the input signal, and extracts frequency and spectral power data. The Classification (Keras) learning block is trained on these spectral features and learns to identify patterns in the data that indicate which class a new data point should belong to.
Click on Save Impulse, then go to Spectral features in the left menu. You'll see the raw signal, the filtered signal, and the spectral power of the signal.
Click on Save parameters and you will be taken to the feature generation menu. Click on Generate features and when the process is done you will be able to visualize the Feature explorer. If your classes are well-separated in clusters, it means the model will easily learn how to distinguish between them.
Now go to NN Classifier and start training the model. At the end of the training you’ll see the accuracy and the loss of the model. A good performing model will have a high accuracy and a low loss. In the beginning, you can use the default training settings and adjust them later if you are not satisfied with the performance results.
Go to Model testing and click on Classify All to see how your model performs on new data.
Finally, go to Deployment and export the trained model as an Arduino library.
Unzip the downloaded library and move it into your libraries folder in your Arduino workspace. At Files > Examples > Examples for custom libraries > your_library_name > nicla_sense > nicla_sense_fusion you’ll find a sketch for running inference on your board. We’ll use the onboard RGB LED for visual feedback as follows:
Red - pressure drop;
Green - pressure rise;
Blue - normal pressure.
You can turn on the LED by adding the following lines of code to the sketch:
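Those lines aren't listed here, but with the Nicla_System helpers the mapping can be as simple as the sketch below; the class-to-colour mapping shown is an assumption based on the list above.

```cpp
// Assumed mapping of the predicted class to the onboard RGB LED
// (requires nicla::begin() and nicla::leds.begin() in setup()).
#include "Nicla_System.h"

void showPrediction(int topClass) {
  if (topClass == 0) {
    nicla::leds.setColor(red);     // pressure drop
  } else if (topClass == 1) {
    nicla::leds.setColor(green);   // pressure rise
  } else {
    nicla::leds.setColor(blue);    // normal pressure
  }
}
```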
You can find the full code and the trained model here.
Create a separate project on Edge Impulse and give it a name.
Download the Edge Impulse ingestion sketch from here and upload it to your board.
Again, we will use the Data Forwarder to collect data, so run the following command from a terminal:
edge-impulse-data-forwarder --clean
This will launch a wizard that will prompt you to log in and select the Edge Impulse project. The --clean flag is used when you want to switch to a new project in case you've previously connected a project to the Data Forwarder. You will also have to name the device and the axes of your sensor (in this case the axes are in the following order: accel.x, accel.y, accel.z, gyro.x, gyro.y, gyro.z, ori.heading, ori.pitch, ori.roll, rotation.x, rotation.y, rotation.z, rotation.w). You should now see Nicla Sense in the Devices menu on Edge Impulse.
We will collect data for three classes, as described in the previous section:
Walking
Climbing
Staying
The Spectral Analysis signal processing block can identify periodicities in data, which is helpful in this case since the motion and orientation data will have a predictable pattern when the user is sitting, walking, or climbing.
Navigate to NN Classifier and begin training the model. Adjust the default training parameters if needed, in order to obtain a better training performance.
Finally, go to Model testing and click on Classify All to check how your model performs on new data.
Now you can deploy your model as an Arduino library by going to Deployment > Create library > Arduino library. You can also enable the EON Compiler to optimize the model.
We will be using the Arduino IoT Cloud to store the data from the Nicla Sense ME board and visualize the metrics. The platform provides an easy-to-use interface for managing devices, sending data to the cloud, and creating dashboards. In order to use the Arduino IoT Cloud, you will need to create an Entry, Maker, or Maker Plus account, which allows you to create an API key for sending data online.
To generate your API credentials, follow the steps below:
Access your Arduino account.
Go to the Arduino Cloud main page.
In the bottom left corner, click API keys, and then CREATE API KEY. Give it a name and save it somewhere safe. After this, you will no longer be able to see the client secret.
Now go to Arduino IoT Cloud and in the Things menu create a new Thing called Wearable.
Click on the newly created thing and add variables for the metrics you want to monitor. We’ve used the following ones:
With the Arduino IoT Cloud configured, we can now start sending data from our device. To do this, download this project (which is an adaptation based on this) and add the Arduino_BHY2 folder to your Arduino libraries. Go to Examples > Arduino_BHY2 > App and upload this sketch to your device.
Now go to nicla-sense-me-fw-main/bhy-controller/src/ and run:
go run bhy.go webserver
A webpage will pop up and you’ll have to select Sensors. Turn on Bluetooth on your computer, then click Connect and select your Nicla board. After the devices are paired, enable the sensors you want to monitor and the webpage will start making requests to post data to Arduino IoT Cloud.
You can also configure a Dashboard to visualize your sensor data:
The Arduino Nicla Sense ME is a great board for building an outdoor activity tracker that has the ability to monitor your progress on hikes, predict weather changes before they happen and log data for training Machine Learning models. With the Edge Impulse platform, you can effortlessly train Machine Learning models to run on edge devices, and with the Arduino IoT Cloud, you can easily store data for future machine learning processing.
Paired with the weather-resistant K-Way jacket, you'll be able to take this device along any time you head outdoors, making sure you'll be ready for any adventure ahead of you!
Train a TinyML model to detect the motion of falling down, then connect via Bluetooth to make an emergency call.
Created By: Thomas Vikstrom
Public Project Link: https://studio.edgeimpulse.com/public/183564/latest
GitHub Repo:
https://github.com/baljo/fall_detection
This project will showcase how the K-Way jacket & Arduino Nicla Sense ME device, together with a smartwatch, can be used to detect falls and call for assistance if needed.
Video showing a simulated emergency call due to a sudden fall
In Finland, with a population of 5.5 million, the yearly mortality rate due to accidental falls is around 1,200 people. Approximately 50% of the fatal falls take place indoors, and 50% outdoors. The reasons for the falls vary, but what is clear is that the older a person gets, the higher the risk that she/he will fall, and secondly that the fall might be fatal. Falling is the most common accidental cause of death for people over 65 years in Finland (source: ukkinstituutti.fi). In addition to the deaths, the roughly 390,000 yearly falls (source: Red Cross) lead to human suffering and healthcare costs for society.
As the population overall gets older and older, it is thus of increasing importance to be able to reduce the risk of falling and getting hurt. But in those cases where the accident happens anyway, and the person is severely hurt or in the worst case unconscious, it is crucial to get assistance as quickly as possible. For people living with family members or in a home for the elderly, a shout for help might be enough, but when living alone it might take hours, or even days, until someone notices something is amiss. While a fall indoors can certainly be fatal, a fall outdoors during the darkest winter, or in the sparsely populated countryside, significantly increases the risk of a fatal outcome.
Finns in general, and elderly people in particular, are made of tough and hard material (quite a few are also stubborn), which means many try to live an active outdoor lifestyle, regardless of the weather conditions. This is all well and good as long as precautions are taken (e.g., using shoes with studs or spikes in the winter, or hiking boots for hiking in the terrain). Nowadays most people also have a mobile phone, and an increasing number of people have some type of smartwatch.
Many existing fall detection systems use signals from accelerometers, sometimes together with gyroscope sensors, to detect falls. Accelerometers monitor acceleration in the x, y, and z directions very sensitively, and are as such very suitable for the purpose. The challenge with developing a fall detection system with the help of accelerometers is that the data frequency typically needs to be quite high (> 100 Hz) and that the signals need to be filtered and processed further to be of use.
Apart from accelerometers, it is also possible to use e.g. barometers to sense if a person has suddenly dropped a meter or more. Barometers sense the air pressure, and as the air pressure is higher closer to the ground, one only needs grade school mathematics to create a bare-bones fall detection system this way. The easiest approach is to first convert air pressure to altitude in meters, and then compute previous altitude in meters - current altitude in meters; if the difference is greater than e.g. 1.2 meters within 1-2 seconds, a fall might have happened. With barometers the data frequency often does not need to be as high as with accelerometers, and only one parameter (air pressure = altitude) is recorded. One major drawback is the rate of false positives (a fall detected where no fall occurred). These might happen because of quick changes in air pressure, e.g. someone opening or closing a door in a confined space like a car, or someone shouting, sneezing, or coughing close to the sensor.
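As a concrete illustration of that arithmetic, a bare-bones check could look like the following; it uses the standard barometric formula for the pressure-to-altitude conversion, and the thresholds are just the example values mentioned above.

```cpp
// Bare-bones barometric fall check (illustrative; thresholds from the text above).
#include <math.h>

// Standard barometric formula: altitude in metres from pressure in hPa,
// assuming a sea-level pressure of 1013.25 hPa.
float pressureToAltitude(float hPa) {
  return 44330.0f * (1.0f - powf(hPa / 1013.25f, 0.1903f));
}

// Returns true if the altitude dropped by more than ~1.2 m between two readings
// taken 1-2 seconds apart.
bool fallSuspected(float previousHpa, float currentHpa) {
  float drop = pressureToAltitude(previousHpa) - pressureToAltitude(currentHpa);
  return drop > 1.2f;
}
```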
Some modern and more expensive smartwatches, e.g. Apple Watch, already have in-built fall detection systems, that can automatically call for help in case a fall has been detected, and the person has been immobile for a minute or so. In case the watch has cellular connectivity, it does not even need to be paired to a smart phone.
In this TinyML project I showcase how the K-Way jacket and Arduino Nicla Sense ME device are, together with the Bangle.js smartwatch, used to detect falls and simulate a call for assistance in case needed. K-Way is an iconic brand, known by many for their waterproof clothes. Nicla Sense ME is a tiny low-power device suitable for indoor or outdoor activities. Sensors included are accelerometer, magnetometer, air quality sensor, temperature sensor, humidity sensor, air pressure sensor, Bluetooth connectivity etc. All this on a stamp-sized PCB!
To demonstrate how a detected fall could result in an emergency call, I connected the Nicla via Bluetooth to my Bangle.js 2 smartwatch. Bangle is an affordable open-source smartwatch aimed at users on a low budget or who want to develop software themselves using Espruino, a JavaScript-based language. In a real scenario, Nicla would be connected directly either to a smartphone or a smartwatch with cellular connectivity, but as that was out of scope for this project, I instead simulate an emergency call being made from the Bangle watch.
Initially I intended to collect data for normal behaviour and activities like sitting, walking, running, driving, cycling, etc., as well as from trying to replicate real falls on slippery ice outside. Due to the risk of injury when replicating real falls - or of "falling" asleep when sitting :-) - I instead decided to try the anomaly detection in Edge Impulse for the first time. Once again I was amazed how easy it is to use Edge Impulse to collect data and train a ML model with it!
To be able to use anomaly detection, you just need to collect data for what is considered normal behaviour. Later, when the resulting ML model is deployed to an edge device, it will calculate anomaly scores from the sensor data used. When this score is low it indicates normal behaviour, and when it's high it means an anomaly has been detected.
I followed this tutorial on the Nicla to get up and running. The Edge Impulse-provided nicla_sense_ingestion.ino sketch was used to collect accelerometer data.
I started to collect 8-second samples when walking, running, etc. For the sake of simplicity, I had the Nicla device tethered through USB to a laptop, as the alternative would have been to use a more complex data gathering program using BLE. I thus held the Nicla in one hand and my laptop in the other and started walking and jogging indoors. To get a feeling for how the anomaly detection model works, I only collected 1m 17s of data, with the intention of collecting at least 10 times more data later on. Astonishingly, I soon found out that this tiny data amount was enough for this proof of concept! Obviously, in a real scenario you would need to ensure you have covered all the expected different types of activities a person might get involved in.
Through a heuristic approach I found that the optimal window size and window increase is 500 ms when the frequency is 100 Hz. I also found spectral analysis to work well with anomaly detection.
As this ML model was new to me, it was easiest to train it using the default settings. While I'm quite sure the model might be further tuned and optimized, especially after collecting more data and from different activities, the trained model was again of surprisingly good quality considering the few minutes I'd spent on it.
The deployment part consisted of creating an Arduino library that can be used with the example program provided by Edge Impulse. Initially I struggled to find the correct program from the library, but found out that I just needed to restart the Arduino IDE to be able to find the file, duh!
Next in line was to find a suitable threshold for when I consider an anomaly (= fall) detected. Again, with a heuristic approach I found an anomaly score of 50 to be a good threshold. To be able to walk around without the Nicla being tethered to a computer, I adapted the program so the LED blinks red when I simulate a fall by shaking the device.
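The adapted program isn't listed in full here; its core is just a threshold check on the anomaly score returned by the Edge Impulse library, roughly as sketched below (the helper function and LED handling are illustrative assumptions, and ei_impulse_result_t comes from the deployed library header).

```cpp
// Threshold check on the anomaly score from run_classifier()
// (the inferencing header from the deployed Edge Impulse library is assumed).
#include "Nicla_System.h"

const float ANOMALY_THRESHOLD = 50.0f;   // value found heuristically, as described above

void handleResult(const ei_impulse_result_t &result) {
  if (result.anomaly > ANOMALY_THRESHOLD) {
    // Possible fall detected: blink the onboard LED red a few times
    for (int i = 0; i < 3; i++) {
      nicla::leds.setColor(red);
      delay(200);
      nicla::leds.setColor(off);
      delay(200);
    }
  }
}
```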
Until now, most steps in the process had been pretty straightforward with only some basic research and trial & error needed. Luckily, I had been prewarned by another Nicla Expert that running inference and Bluetooth simultaneously might cause memory issues on this 64kB SRAM device. This I experienced myself, but with the help of this Forum post, this challenge was overcome.
To be able to simulate an emergency call being made, I created a simple JavaScript program on the smartwatch. This program connects through BLE to the Nicla and receives the anomaly score. Once the score is over 50, the watch will react by turning on the LCD and displaying FALL DETECTED!. After a few seconds a counter counts down from 10 to 0, and if the wearer has not touched the display by the time the counter reaches zero, the watch simulates an emergency call to a predefined number chosen by the user.
The following pictures show the fall detection process:
A fall is registered (= an anomaly detected) - in this case due to shaking the Nicla device, the LED blinks in red colour
Nicla sends the anomaly score to the Bangle watch through BLE
The Bangle watch also shows a fall is detected, starts counting down to zero
If the screen has not been touched - indicating the user is immobile - an emergency call is made
While this was only a proof of concept, it demonstrates how tiny low-powered TinyML devices can be used to detect falls, and together with cellular network devices call for assistance in case the user is immobile. To move from the prototype stage to a real-world solution, more activity data needs to be gathered. In addition, Nicla should be connected to a phone to enable emergency calls. For this a smartphone app should be developed, e.g. with MIT App Inventor.
A small device that monitors packages in transit for unsafe handling such as shaking, throwing, drops, or other damaging movements.
Created By: Shebin Jose Jacob
Public Project Link:
Safe transit of packages is a headache for many online retailers since damage can occur during transit. Damaged shipments lead to a greater number of product returns, a poor customer experience, and a negative reputation for the retailers. Since we have no knowledge of what happens during delivery, it might be difficult to prove that a package's contents were destroyed during travel, as opposed to before or after transfer to a carrier. Due to this lack of transparency, both unfair accusations and deliberate fraud are possible.
As a solution, we are designing a device that can monitor the safe handling of packages during transit. This device uses an Arduino Nano 33 BLE Sense and a GSM module, along with Edge Impulse, to identify the points where rough handling of the package occurs. Using the accelerometer data, the device can identify possible insecure handling that occurred during transit. When insecure handling is detected by the device, an alert is generated in our system which logs the current time of the incident. The log can be analyzed by the retailer as part of an internal audit to analyze the performance of the courier services. The user can also track the handling of the package using the web interface provided.
This device is extremely useful in transporting Handle With Care packages that should be handled very carefully.
The Arduino Nano 33 BLE Sense has a 9-axis IMU (3-axis accelerometer + 3-axis gyroscope + 3-axis magnetometer), which makes it ideal for recognizing gestures. The movements are classified by an AI model into five classes - Hard Fall, Vigorous Shaking, Throwing, Normal Carrying, and Idle - after ingesting the data from the IMU. Hard Fall, Vigorous Shaking, and Throwing are categorized as "insecure handling" and the rest are categorized as "secure handling". Once an insecure handling event is detected by the AI model, the GSM module is activated and it logs the insecure handling event in Firebase. The events can be tracked by the user or an internal audit team to analyse the performance of carriers.
Arduino Nano 33 BLE Sense
SIM 800L GSM Module
TP4056 Module
Boost Converter
Li-ion Battery
Edge Impulse
Arduino IDE
The first step is to create a new Edge Impulse project. If you already have an account, you can create a new project by following the steps shown below. If you don't have an Edge Impulse account, sign up for a new account and follow the steps.
In our case we are classifying the gestures after analysing the accelerometer data from the IMU. So make sure you choose Accelerometer Data from the list.
After you have completed all the above steps, make sure you see your device in the Devices tab.
Machine learning begins with the data you provide. It is crucial to gather clean data so that your machine learning model can identify the proper patterns. How accurate your model is will depend on the quality of the data you provide. Inaccurate or out-of-date data will result in inaccurate or irrelevant predictions.
As it will directly impact the result of your model, be sure to obtain data from a reputable source. Good data is pertinent, has little duplicated or missing information, and accurately represents all of the classes and subcategories involved.
As we are building a motion recognition model, we are collecting the accelerometer data from the IMU.
To collect the data, navigate to the Data Acquisition tab.
There you can see a Record new data tab, where we can collect data from the supported devices connected to Edge Impulse. Set all the data sampling parameters as shown in the figure; now it's time to collect some data. Vary the sampling parameters according to your needs.
We embedded the Arduino Nano 33 BLE Sense on a cushion to collect the required data for training.
Finally, we have 28 minutes of accelerometer data. The data is collected under 5 different classes: Hard Fall, Vigorous Shaking, Throwing, Normal Carrying, and Idle.
After we have our data, we have to prepare it. For this we can do the following.
Visualize the data to understand how it is structured and understand the relationship between various variables that may help us in designing the impulse.
After we visualise the data we can determine whether the data is useful for model training. Remove unwanted data to create a clean dataset.
Once we have a clean dataset, split it into training and testing datasets. Here we split them up into two groups, in the ratio 80:20. If your data is not split, perform a test/train split either from the Data Acquisition tab or from the Dashboard.
By creating an Impulse in Edge Impulse, you're creating your own machine learning pipeline. Navigate to Impulse design > Create Impulse.
An impulse contains 3 blocks: an input block, a processing block, and a learning block. I have chosen Time series data as my input block, Spectral Analysis as the processing block, and Classification (Keras) as the learning block. You can add an anomaly block if you're interested in detecting anomalous motions; I'm just omitting it for now.
Next, select Save Impulse, then navigate to the Spectral Features in the Impulse design panel by clicking on it. Once the parameters have been generated, wait a moment and then click Save parameters.
Now proceed to the Generate features tab and then click Generate features. When the process is finished, the feature explorer tab will allow you to view your dataset. This allows you to quickly verify whether your data is properly clustered or not.
Model training is the phase in which the neural network learns the combination of weights and biases that minimizes a loss function.
In the NN Classifier tab, under the Impulse Design menu, we can configure various parameters that influence the training process of the neural network. I have changed the default values as shown in the image to attain better accuracy.
After finishing, you will see some training performance metrics, like Accuracy and Loss. Our trained model has an accuracy of 95.4%, which would suffice for our needs.
Once the model is trained, we can now test it to see how it performs with new data. Select Classify All under Model Testing. The model's performance on our testing data is displayed in the Model testing results tab. Our accuracy is 91.3%, which is still quite good. You can also look at the Confusion matrix to determine which labels are most susceptible to error. On the basis of this, you can expand the training dataset for these classes with additional items.
In addition to classifying the test data, we can head on to Live Classification to collect real-time data and classify it in real-time. This will ensure that the model is working flawlessly with real world data.
Once the model is trained and tested, it's time to deploy it back to the device. For this, navigate to Deployment > Build Firmware. Select Arduino Nano 33 BLE Sense and Build. It will generate the model and download it to your computer as a Zip file. Add the downloaded Zip file in the Arduino libraries, and you are good to go.
There are several choices for NN classifier optimizations in Edge Impulse. We can improve device performance by choosing the best option, and Edge Impulse will suggest the option that is optimal for our needs. We will achieve the same accuracy with less memory if we enable the EON Compiler.
Firebase is a platform for creating both mobile and online applications. Thanks to Firebase, developers can concentrate on creating amazing user experiences. No server management is necessary and there is no need to create APIs. Firebase is your server, API, and data storage, all of which are constructed in such a generic way that you can adapt them to the majority of demands. In our project, we are using the Firebase Realtime Database to instantly post and retrieve data.
To find your Firebase Config:
Go to Firebase
Then go to Settings > Project Settings
Under Your Apps > SDK Setup and Configuration > Config (Sample given below)
The web interface is designed in such a way that it can reflect all the events updated in the Firebase database. The insecure handling events are updated in the Firebase directly from the Nano 33 BLE Sense and other shipping updates can be updated using an API.
The heart of the Package Tracker is an Arduino Nano 33 BLE Sense. We opted for this board due to its tiny form factor and its high capability. It comes with a series of embedded sensors, and here we use the LSM9DS1, a 9-axis inertial module. The board also has one extra hardware serial port, which is useful here for connecting to the GSM module.
The power supply for the device is a 3.7V 18650 Li-ion cell with a capacity of 2000 mAh.
This cell can be charged via a micro-USB port. The charge controller used in this project is TP4056.
The voltage coming out of the TP4056 module is actually not enough for the Vin of the Arduino, so we used this tiny boost converter module.
For sending the notification to Firebase we use a SIM800L GSM module. This module is from SIMCom, and gives any microcontroller GSM functionality, meaning it can connect to the mobile network to receive calls and send and receive text messages, and also connect to the internet over GPRS using TCP/IP. The communication between the Arduino and the GSM module is serial.
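The firmware itself isn't reproduced here, but the idea is to drive the SIM800L with AT commands over the Nano's hardware serial port and POST a small JSON record to the Firebase REST endpoint. The sketch below outlines that flow under a few assumptions: the APN and database URL are placeholders, the fixed delays stand in for proper parsing of the module's responses, and HTTPS support depends on the SIM800L firmware.

```cpp
// Outline of SIM800L -> Firebase logging over the Nano 33 BLE Sense's
// hardware serial port (Serial1). APN and database URL are placeholders.
const char APN[]          = "your-apn";
const char FIREBASE_URL[] = "https://your-project.firebaseio.com/events.json";

void sendAT(String cmd, unsigned long waitMs) {
  Serial1.println(cmd);
  delay(waitMs);
  while (Serial1.available()) Serial.write(Serial1.read());  // echo reply for debugging
}

void logEvent(const char *label) {
  String body = String("{\"event\":\"") + label + "\",\"millis\":" + millis() + "}";

  sendAT("AT+SAPBR=3,1,\"CONTYPE\",\"GPRS\"", 1000);
  sendAT(String("AT+SAPBR=3,1,\"APN\",\"") + APN + "\"", 1000);
  sendAT("AT+SAPBR=1,1", 3000);                        // open the GPRS bearer
  sendAT("AT+HTTPINIT", 1000);
  sendAT("AT+HTTPSSL=1", 1000);                        // TLS for https URLs (firmware-dependent)
  sendAT(String("AT+HTTPPARA=\"URL\",\"") + FIREBASE_URL + "\"", 1000);
  sendAT("AT+HTTPPARA=\"CONTENT\",\"application/json\"", 1000);
  sendAT(String("AT+HTTPDATA=") + body.length() + ",10000", 1000);
  Serial1.print(body);                                 // payload follows the HTTPDATA prompt
  delay(1000);
  sendAT("AT+HTTPACTION=1", 5000);                     // 1 = POST
  sendAT("AT+HTTPTERM", 1000);
  sendAT("AT+SAPBR=0,1", 1000);                        // close the bearer
}

void setup() {
  Serial.begin(115200);
  Serial1.begin(9600);   // SIM800L default baud rate
}

void loop() {
  // logEvent("hard_fall");  // call when the model flags an insecure handling event
  delay(10000);
}
```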
We designed and 3D printed this tiny case to secure all the hardware elements. It can be easily attached to any package using cable ties.
Then we secured all the elements one by one in the box and tied it to the package.
This project demonstrates a low-cost way to monitor a package through the shipment process, detecting 3 different classes of insecure handling.
A client/server device to detect and analyze worker falls with machine learning and an Arduino Nano 33 BLE Sense.
Created By: Roni Bandini
Public Project Link:
A fall could be dangerous in any situation, but for certain working scenarios, consequences can be very harmful. Therefore, the idea of developing a Machine Learning fall detection and reporting system could be quite useful in some industries.
Each worker has a small TinyML device in charge of detecting falls via the onboard accelerometer data, and reporting to a server through Bluetooth. The server is a Raspberry Pi running a Python script that scans for specific BT announcements, parses the fall alert information, and stores it in an SQLite database for reports and alerts.
The electronics part of the client build is easy: just a battery, a TP4056, and the Arduino Nano 33 BLE Sense. The board has an onboard accelerometer, an onboard RGB LED, and enough processing power to run an Edge Impulse library for inferencing locally.
If you want to train your own fall model, go to the Edge Impulse Studio and log in, click on Data Acquisition, WebUSB, and choose the Inertial sensor. Obtain 5 minutes of data: samples of Standing normally and Falling Down.
Design an Impulse with a 1500ms window size, 150ms window increase, and 100 Hz frequency. Add Spectral Analysis with just 3 axes: accx, accy, accz. Choose Keras classification and 2 output features: Stand and Fall. For the Neural Network training, 50 training cycles with a 0.0005 learning rate, auto-balancing the dataset, and 20% validation worked fine.
After model testing, go to the Deployment page and export an Arduino Library (which will contain your Machine Learning Model). Then import this library (Zip file) inside the Arduino IDE Sketch by selecting Include, Add Zip.
Once running, every fall is advertised with this format:
advertiseFall("Fall-"+worker+"-"+String(myCounter));
For example: Fall-Smith-1922
The device will change its RGB LED from green to red whenever a fall is detected.
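The full client sketch is in the repository; a rough reconstruction of what advertiseFall() could look like with the ArduinoBLE library is shown below. Broadcasting the string as the BLE local name is an assumption for illustration, not necessarily the repo's exact approach.

```cpp
// Rough reconstruction of advertiseFall() with the ArduinoBLE library
// (assumes BLE.begin() was already called in setup()).
#include <ArduinoBLE.h>

void advertiseFall(const String &message) {
  static String name;                  // keep the name alive while advertising
  name = message;
  BLE.stopAdvertise();                 // restart advertising with the new name
  BLE.setLocalName(name.c_str());      // e.g. "Fall-Smith-1922"
  BLE.advertise();                     // now visible to the Raspberry Pi scanner
}
```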
The other component we need to build next is the Python and database server, listening for Bluetooth data coming from the Arduino. A Raspberry Pi will run the code fine, so simply install Raspberry Pi OS Lite on an SD card, boot up, and upload the Python files linked above from the GitHub repo.
Next, create a database structure with:
$ sudo python3 databaseSetup.py
Start scanning for bluetooth packets from the Arduino with:
$ sudo python3 scan.py
Other scripts included are: clearDatabase.py (removes all database records), and chart.py (creates a chart rendered from all of the database records).
In this project, we have demonstrated a simple method for Fall Detection using a client / server system running on an Arduino Nano 33 BLE Sense turned into a wearable device, along with a listening server running on a Raspberry Pi.
Take an existing gesture recognition model built for the Thunderboard Sense 2, and prepare it for use on the SiLabs xG24 board.
Created By: Mithun Das
Public Project (to Clone):
In this project I am not going to explore or research a new TinyML use-case, rather I'll focus on how we can reuse or extend Edge Impulse Public projects for a different microcontroller.
In this project, I am going to walk you through how you can clone his Public Edge Impulse project, deploy to a SiLabs Thunderboard Sense 2 first, test it out, and then build and deploy to the newer SiLabs xG24 device instead.
Before you proceed further, there are few other software packages you need to install.
LightBlue - This is a mobile application. Install from either Apple Store or Android / Google Play. This will be required to connect the board wirelessly over Bluetooth.
Click on the "Clone" button at top-right corner of the page.
That will bring you to the below popup modal. Enter a name for your project, and click on the "Clone project" button.
The project will be duplicated from Mani, into your own Edge Impulse Studio. You can verify by looking at the project name you entered earlier. Now if you navigate to "Create impulse" from the left menu, you will see how the model was created originally.
As you can see, the model was created based on 3-axis accelerometer data. The Window size was set as 4s and window increase was set to 4s as well, which ensures there is no overlap. That means if the input data is of 20s, there will be 5 samples from that data. Spectral Analysis was selected as Digital Signal Processing and Keras was selected for the Learning block.
Next, navigate to "Retrain model" from the left menu and click on "Start training". Alternatively, you can also collect more data and train the model with your own gesture movement, or add additional gestures.
When you are done retraining, navigate to the "Deployment" page from the left menu, select "SiLabs Thunderboard Sense 2" under "Build firmware", then click on the "Build" button, which will build your model and download a .bin file used to flash to the board.
If not already connected, go ahead and connect the Thunderboard Sense 2 to your computer via a USB cable. You should see a drive named TB004 appear. Drag and drop the .bin file downloaded in the previous step to the drive. If you see any errors like below, you'll need to use "Simplicity Studio 5" to flash the application, instead.
The LEDs on the Thunderboard will flash, indicating the firmware has been updated on the board. Open the LightBlue app, and scan for devices. You should see a device named "Edge Impulse", go ahead and connect to that. Then tap on the "0x2A56" characteristic, then "Listen for notifications". Change the format from "Hex" to "UTF-8 String".
Then, back on the computer, open a Terminal and run the below command, which will start inferencing on the board.
Alternatively, you can write a boolean 1 to the characteristic to start the inference on the board. Check out the .gif for how to do it on an xG24:
To test if the model is working accurately, move your finger to recreate the gesture pattern that you (or Manivannan!) trained, and you should see a notification on your phone.
At this point, you have learned how to clone a Public Edge Impulse project, capture more data, and deploy to a Thunderboard Sense 2 directly.
Now, we will explore how we can deploy the same model to the newer SiLabs xG24 hardware instead. Keep in mind that there are some upgrades and differences between the Thunderboard Sense 2 and the xG24, and not all of the sensors are identical (some are, though). For this project specifically, the original work done by Manivannan recorded data from the Thunderboard's IMU, but the xG24 has a 6-axis IMU. So, we should recollect new data to take advantage of this and ensure our data is applicable. If your use-case is simple enough that new data won't be needed, the sensor you are using is identical between the boards, or your data collection and model creation steps built a model that is still reliable, you might be able to skip this.
If your data is indeed simple enough, you can deploy the model straight to an xG24 without making any changes to the model itself. You only need to revisit the "Deployment" tab in the Edge Impulse Studio, select "SiLabs xG24 Dev Kit" under Build firmware, and Build. For the sake of demonstration we will give it a try in this project; as mentioned, the upgrade from 3-axis to 6-axis data really should be investigated, but the sensor itself is identical between the boards, so our data should still be valid.
This will download a .zip file containing a .hex file and instructions.
Once the flashing is done, again use the LightBlue app to connect to your board, and test gestures once again as you did for the Thunderboard Sense 2.
Once deployed, your same gestures should be recognized and inferencing is performed on the xG24 in the same manner as the Thunderboard Sense 2, with no additional model training or dataset manipulation needed! This makes upgrading existing projects from the Thunderboard to the xG24 extremely simple when the data and sensor output are comparable and you confirm the sensor is the same part number. Again, you'll need to cross check the datasheets, but you may find you can take a Thunderboard Sense 2 project and deploy it directly to the xG24.
One final note is that in this project, the xG24 is roughly twice as fast as the Thunderboard Sense 2 running the same model:
Hopefully this makes upgrading your SiLabs projects easier!
A wearable for continuous gait analysis, aiming to detect gait abnormalities indicative of potential medical conditions.
Created By: Samuel Alexander
Public Project Link:
Subtle changes in gait can be early indicators of various medical conditions, including neurodegenerative diseases like Parkinson's, multiple sclerosis, balance disorders, and even other injuries with far-reaching health consequences. Early detection often relies on subtle changes in how a person walks, such as reduced speed, shuffling steps, or unsteadiness. Unfortunately, current assessments primarily rely on periodic, in-clinic observations by healthcare professionals, potentially missing subtle yet significant changes occurring between visits. Moreover, subjective self-assessments of gait are often unreliable. This lack of continuous, objective monitoring hinders timely diagnoses, limits the effectiveness of treatment plans, and makes it difficult to track the progression of gait-related conditions. A proactive, data-driven solution is needed to ensure individuals and their healthcare providers have the information necessary for informed decision-making.
Image Credit: Can Tunca, "Human Gait Cycle", 2017, via mdpi.com
This project aims to develop a wearable device for early gait disorder detection. We'll begin by collecting data representing normal gait patterns during walking, running, and standing. Next, we'll extract relevant features using Edge Impulse tools, focusing on characteristics like leg swing acceleration, stride length, and foot placement (supination/pronation). Employing Edge Impulse's K-means anomaly detection block and feature importance analysis, the device will learn to distinguish healthy gait patterns (based on the individual's established baseline) from potential anomalies. Initially, inference results will be displayed on a smartphone app. This proof-of-concept can be expanded into a wearable device that alerts users of gait abnormalities and trends, recommending healthcare consultations when appropriate. Ultimately, our goal is to provide a proactive tool for early disorder identification, enabling timely intervention and improved outcomes.
The Nordic Thingy:53 leverages the nRF5340 Arm Cortex-M33 SoC, providing the computational resources necessary for on-device AI inference. It also includes a built-in accelerometer to capture detailed gait data and Bluetooth 5.4 for wireless communication. Importantly, the same nRF5340 chip powers the nRF5340 Development Kit, providing a consistent hardware platform throughout the project's development cycle. This means we can easily prototype on the Thingy:53, refine algorithms and sensor selections on the Development Kit, and ultimately transition to a custom wearable design for mass production – all using the same core chip. This approach ensures a smooth and efficient development process.
Nordic Thingy:53
3D printer
Edge Impulse CLI
nRF Programmer App (iPhone/Android)
nRF Connect Desktop
The Thingy:53 was used for collecting a dataset for training the AI model to establish a baseline of normal, healthy gait patterns. This dataset includes three types of movement: standing, walking, and running. To capture realistic data, the user wore the device while performing these activities. The dataset's variety helps the model accurately classify different gait patterns and detect potential abnormalities in various situations.
Collect data for each label (standing, walking, running) using the nRF Connect app. Choose:
Sensor: Accelerometer
Sample Length (ms): 20000
Frequency (Hz): 20
For each label, we collected 13 repetitions of 20000 ms, which equals 260 seconds per label. This seems to be plenty for our testing; however, more data may be necessary if the gait patterns are performed on a larger variety of terrains.
Split the 20000 ms sample into 4 sections of 5000 ms windows.
Perform a train/test split if needed, or try to aim for approximately an 80/20 ratio.
After thorough testing, including using the EON Tuner, the optimal settings for our time series data were determined. We employ both a classifier and K-means anomaly detection to enable both gait pattern classification and anomaly scoring.
Spectral analysis transforms raw accelerometer data from the time domain into the frequency domain. This reveals hidden patterns in gait data, such as stride frequency, step regularity, and harmonic components of movement patterns. These extracted spectral features can provide a richer representation of gait characteristics for the neural network, often leading to improved classification accuracy and a clearer understanding of potential gait abnormalities.
As mentioned, these parameter settings are already using optimized values from the EON Tuner.
These are our results before using the EON Tuner (default parameter values and settings).
The EON Tuner is a valuable tool for finding the best parameter settings and model architecture to maximize accuracy. While it can also optimize for performance or memory usage, our project has sufficient resources in these areas. Therefore, we prioritize accuracy as the primary optimization goal.
K-means clustering is chosen for gait anomaly detection due to its computational efficiency and ability to robustly identify distinct clusters. While Gaussian Mixture Models (GMMs) can model more complex data distributions, in our testing K-means excels in identifying distinct clusters like normal walking, running, and standing.
We chose L1 Root Mean Square (RMS) for anomaly detection with accelerometer data (accX, accY, accZ) due to its sensitivity to outliers and interpretability. L1 RMS emphasizes large deviations, which helps identify significant gait abnormalities and provides insights into the specific directions of those anomalies. It's also more robust to noisy accelerometer data compared to L2 RMS.
Now the AI model is ready to be deployed to the edge. The Nordic Thingy:53 is selected as our deployment option. For this project we chose the unoptimized (float32) model to preserve accuracy, since our hardware has enough performance and memory headroom.
This project successfully demonstrates the potential for wearable AI solutions in early detection of gait disorders. By harnessing the Thingy:53's capabilities and Edge Impulse's streamlined workflow, we developed a device capable of identifying gait anomalies. This tool offers proactive health monitoring, with the potential to alert users to subtle changes that may foreshadow underlying medical conditions. Future work could expand the dataset for greater robustness, explore additional sensor modalities, and conduct clinical trials to thoroughly validate the system for diagnostic use.
See this project in action:
In order to collect accelerometer data directly from the Arduino Nano 33 BLE Sense, we should connect the device to the Edge Impulse Studio first. Follow the steps given to connect the device with the Studio.
The entire code and assets are available at:
To install the Edge Impulse firmware on the Nano 33, simply download the firmware from this link. Unzip the contents, connect the Arduino to your computer with a microUSB cable, double-click the Reset button on the Arduino, and run flash_window.bat from inside the folder (or the Mac or Linux commands if you are on one of those platforms).
All of the code for this project, including both the Client script file and the Python server files can be downloaded from .
The device will work without a case of course, but, to make it more convenient to wear and to hold all the pieces in place, two parts should be 3D printed and a strap should be attached. The Gcode files for this particular design can be .
Manivannan had created a project, which is a wearable device running a TinyML model to recognize gesture patterns and send a signal to a mobile application via BLE. Check out his work for more information.
Edge Impulse CLI - Follow the instructions to install the necessary tooling to interact with the Edge Impulse Studio.
Simplicity Studio 5 - Follow the instructions to install the IDE.
Simplicity Commander - Follow the instructions to install the software. This will be required to flash firmware to the xG24 board.
If you don't have an Edge Impulse account, sign up for free and log in. Then visit the link below to get started.
If you are going to add new data, you can follow this guide to connect your board to the Edge Impulse Studio and capture data. Once you are done collecting additional data, you will need to retrain the model, of course.
Now you can use Simplicity Studio 5 to flash the .hex file to the xG24 as shown before, or use Simplicity Commander. You can read more about Commander in its documentation.
A 3D printed shoe clip-on case modification is made for attaching the Thingy:53 to a shoe. You can download the .stl files here:
This project assumes basic familiarity with connecting the Thingy:53 to Edge Impulse via the nRF Connect app. If needed, refer to this guide for assistance:
After building our model, we'll get the new firmware. Follow this guide to flash the firmware:
Use a Nicla Sense ME attached to the sleeve of a K-way jacket for gesture recognition and bad weather prediction.
Created By: Justin Lutz
Public Project Link: https://studio.edgeimpulse.com/public/181395/latest
GitHub Repo:
https://github.com/jlutzwpi/K-way-Nicla-Smart-Jacket
With microcontrollers getting smaller, more powerful, and more energy efficient, Artificial Intelligence (AI) is finding itself deployed more and more at the edge: on sensors, cameras, and even clothing!
For this project, K-way, a maker of jackets, clothing and accessories, teamed up with Arduino to see how K-way's products could be made smarter. I was fortunate enough to be sent a K-way jacket with an Arduino Nicla Sense ME, a custom case and battery, and a lanyard.
When I was brainstorming ideas for the jacket, I looked on the web to see if there was even such a thing as a smart jacket. There were only a couple of examples, but one concept mentioned gesture recognition as a potential capability of the jacket. Given the active nature of the brand, I figured that some sort of gesture recognition and environmental data collection would be beneficial to the wearer of the jacket out on a hike.
That gave me the idea of the Nicla Sense ME being mounted to the arm sleeve of the jacket, so gestures could be used to send commands to a smartphone, and atmospheric pressure data could also be sent to the smartphone app to alert the hiker of any upcoming bad weather.
In this proof of concept, I picture a hiker being out on the trail enjoying their leisurely walk. They see an object of interest (an owl's nest, some old ruins?), and they draw the letter "C" in the air, for checkpoint. This gesture then uses the phone's GPS to mark the checkpoint on the map of a custom app that is connected to the Nicla Sense ME via Bluetooth Low Energy (BLE), so the hiker knows exactly where that object of interest is. The Nicla also monitors the barometric pressure, which can be used as an indicator of bad weather. If a low-pressure storm starts to move in, the hiker can be alerted that bad weather is on the way and head back to their car before they get caught in a storm.
Ideally, there would be a pouch on the arm sleeve to slide the Nicla Sense ME into, but for this demo I used the included lanyard to strap it to my wrist. I have an LED illuminated there to indicate whether the Nicla Sense ME is connected via BLE to the app that I made with MIT App Inventor 2.
To complete this project I used my go-to source, Edge Impulse, to ingest raw data, develop a model, and export it as an Arduino library. I followed this tutorial on the Nicla Sense ME from Edge Impulse to get up and running. The Edge Impulse-provided nicla_sense_ingestion.ino sketch was used to collect the raw accelerometer data. I created 3 classes: idle (no movement), walking, and checkpoint. The Checkpoint class was essentially me drawing the letter "C" in the air to tell the app to mark a checkpoint on the map while out on a hike.
You of course could add additional gestures if you wanted to expand the functionality of the jacket and Nicla Sense ME (an "S" for "selfie" maybe?). Even with just 15 minutes of data (split between Training and Test), there was great clustering of class data:
Default parameters were used throughout the Edge Impulse pipeline, and training concluded with great results:
I then ran my model through the Test set and exported the model as an Arduino library .zip file. The Public Project version of my model can be found here in the Edge Impulse Studio.
Once it was exported to an Arduino library, I used the sample nicla_sensor_fusion.ino sketch from the library to start my project. Given that the inference code was already in there, I only had to add in the BLE code and some logic to send data over BLE to the app. Edge Impulse really does make it simple.
Once I had my code in place, I went over to MIT App Inventor to create the interface to the Nicla Sense ME BLE data.
This is where I ran into a small issue: the Nicla Sense ME has limited (64 kB) SRAM. The Nicla can run an Edge Impulse Model fine, and it can connect to BLE just fine, but if you try to do both, you get an Out of Memory error and your software crashes. After a couple hours of debugging, troubleshooting, and searching, I found a project by Nick Bild where he made a small change to the Arduino_BHY2 library to free up some memory, and allow both the BLE library and the Edge Impulse model to be run concurrently. Details can be found in this thread on the Edge Impulse forum. Once I was free of the memory limitations, I was able to connect to my app and test it out!
Aside from markers on the map, the Nicla Sense ME also continuously monitors the atmospheric pressure, which can be an indicator of upcoming bad weather. A drop in pressure means that clouds, wind, and precipitation may be coming. There are several factors that determine what that threshold is (location, altitude, etc) so I set my threshold to 29.8 in Hg, which would be an indicator of precipitation moving in to my area. Rather than waiting for bad weather to arrive, I simulated a lower pressure reading in my Arduino code and sent it to the app via BLE:
In summary, with the gesture recognition and mapping in place, combined with the weather prediction functionality, the Arduino x K-way collaboration makes for a great experience for outdoor enthusiasts. I hope you enjoy!
A SiLabs Thunderboard Sense 2 TinyML-based wearable belt for manufacturing workers, to detect correct / incorrect posture.
Created By: Manivannan Sivan
Public Project Link: https://studio.edgeimpulse.com/public/148375/latest
Working in manufacturing can put a lot of stress on a worker's body. Depending on the worker's role in the production process, they might experience issues related to cramped working conditions, heavy lifting, or repetitive stress.
Poor posture is another issue that can cause problems for the health of those who work in manufacturing. Along with that, research suggests that making efforts to improve posture among manufacturing employees can lead to significant increases in production. Workers can improve their posture by physical therapy, or simply by being more mindful during their work day.
Major postures include:
Posture while sitting
Posture while lifting
Many manufacturing employees spend much of their day sitting in a workstation, performing a set of tasks. While the ergonomics of the workstation will make a significant difference, it is important for employees to be mindful of their sitting posture.
Lifting can be another issue affecting the posture of those who work in manufacturing. If you are not careful, an improper lifting posture can lead to a back injury. For lifting objects off the ground, the correct posture is a "squat" type, whereas the incorrect posture is a "bent down" type.
I have created a wearable device using a SiLabs Thunderboard Sense 2 which can be fitted to a worker's waist. The worker can do their normal activities, and the TinyML model running on the hardware will predict the posture and communicate to the worker through BLE communication. The worker can get notified in the Light Blue App on their phone or smartwatch.
I have trained a model with several different postures, so that it can classify correct and incorrect movement postures while lifting and sitting. The model will predict results in these 5 categories:
Correct Lift Posture - Squat
Incorrect Lift Posture - Bent Down
Correct Sitting Posture
Incorrect Sitting Posture
Walking
Now let's see how I trained the model and tested on real hardware in detail.
Connect the Thunderboard Sense 2 board to your system and flash the firmware from the link below.
https://docs.edgeimpulse.com/docs/development-platforms/officially-supported-mcu-targets/silabs-thunderboard-sense-2
Once it is flashed, run the below command.
edge-impulse-daemon
Now your board is connected to your Edge Impulse account.
To start collecting data for the desired postures (correct and incorrect), I wore the belt with the Thunderboard Sense 2 attached and started recording accelerometer data. The data classes acquired were: walking, correct sitting posture, incorrect sitting posture, correct lifting posture (squat), and incorrect lifting posture (bent down).
I have recorded data from the Thunderboard by sitting in the correct sitting posture.
I have collected 1 minute of data of correct sitting posture for model training and 20 seconds of data for testing.
For the incorrect sitting posture, I bent towards the laptop, so my back was not resting on the chair. If an employee works in this position for long hours, it can create back pain or other problems in the future. I have collected 1 minute of data of improper sitting posture for model training, and 20 seconds of data for model testing.
For lifting objects off the ground, the correct posture is to squat down to the object to lift it.
I have collected the "squat" type data for around two minutes for model training, and 20 seconds of data for model testing.
For incorrect lifting ("bent over") data, I have collected 2 minutes 30 seconds of data for model training, and another 30 seconds of data for model testing.
In Edge Impulse, in the Create Impulse section, set the window size to 4000 ms and the window increase to 4000 ms as well. "Raw data" is selected as the processing block.
For model training, I have used a sequential dense neural network, with the learning rate set to 0.005 and the training cycles set to 200 epochs.
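For readers who want a feel for what such a configuration corresponds to, below is a rough Keras equivalent, not the code exported by Edge Impulse. The input_length and num_classes values are assumptions for illustration: a 4000 ms raw window of 3-axis data flattened into one vector, and the five posture classes listed above.

```python
import tensorflow as tf
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Dense

input_length = 750   # e.g. 250 samples x 3 axes; the real value comes from the impulse
num_classes = 5      # the five posture classes listed above

model = Sequential([
    Dense(20, activation="relu", input_shape=(input_length,)),
    Dense(10, activation="relu"),
    Dense(num_classes, activation="softmax"),
])
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.005),
              loss="categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(X_train, y_train, epochs=200, validation_data=(X_val, y_val))
```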
After training was complete, the model achieved 100% accuracy and the F1 score is listed below.
The inference time is 7ms, and flash usage is only 78.4K.
In model testing, I have used the data that we collected and set aside earlier to test the model. Here, the model achieved 87% accuracy. This data is completely new and was not used during training, so it is unseen up to now. A decrease in model accuracy like this does sometimes occur; here it looks like the improper sitting position is being incorrectly classified as "squat" data.
Go to the Deployment section in Edge Impulse, and select the firmware option for the Thunderboard Sense 2. This will generate and download the firmware files to your system.
Once the firmware is downloaded, copy the .bin file and paste it into the TB004 drive (or whatever drive label appears when the Thunderboard is connected to your computer). This will flash the software onto the Thunderboard.
Once it is flashed, reset the board, and connect a 3V battery to it.
To test it in a real scenario, download the LightBlue application from the Apple App Store or Google Play Store. This application will be used to communicate to the Thunderboard Sense 2 over Bluetooth.
Open the App and connect to the Edge Impulse service (Make sure board is powered up).
Some settings in the App might need to be changed to the following values:
Subscribe to the 2A56 characteristic.
Decode the message as UTF8 (click on HEX in the top right corner in LightBlue to switch).
Connect the wearable belt and start doing different movements.
Enable the "Listening" option in the App. You will be notified only when the previous prediction result differs from the current prediction result.
You can see the results of the predictions displayed in the App:
Then open up a Terminal and run the below command to see the model inference in realtime:
edge-impulse-run-impulse
This TinyML-based wearable can be used in manufacturing warehouses, shipping, or other situations where employees lift objects on a regular basis. While this is a proof-of-concept, this type of approach could help them to correct their posture via local notifications from the mobile application.
A TinyML-based wearable device which can be fitted on a patient’s finger for communicating with caretakers.
Created By: Manivannan Sivan
Public Project Link: https://studio.edgeimpulse.com/public/147925/latest
Some hospital patients, elderly people, or patients requiring constant monitoring might need support at any time. However, they may have difficulty communicating due to injuries, mental ability, energy level, glucose level, or other reasons. It can also be challenging for caretakers to attend to all patients.
I have created a wearable using a SiLabs Thunderboard Sense 2, which can be fitted to the patient’s finger. The patient can call the caretakers by tapping their finger, or rotating it, and the tinyML model running on the hardware will predict the gesture and communicate to the caretakers through BLE communication. The caretakers can get notified in the Light Blue Application on their devices.
I have trained a model with different tap actions and normal hand movements, so that it can classify the normal movements and emergency tap options.
The model will predict an action in any of these categories:
Help
Emergency
Water
Idle
Random Movements
Now let’s see how I trained the model and tested it on real hardware in detail.
Connect the Thunderboard Sense 2 board to your system and flash the firmware from this link:
Once it is flashed, run the below command:
edge-impulse-daemon
Now your board is connected to your Edge Impulse account. I have used a cloth finger cover and attached the SiLabs board with a rubber band.
To get any support from caretakers, the patient can use this gesture.
For the "Help" gesture, I tapped my hand gently on a flat surface with a 1-second delay between taps, roughly one tap per second. I have collected 2 minutes of "Help" data for training and 20 seconds of data for testing.
For the "Emergency" action, I tapped the finger wearing the SiLabs board on a flat surface five times in a row without any delay. I repeated this process for about 2 minutes to collect enough training data, and another 20 seconds for testing data.
For basic needs like water, food etc. the patient can use this gesture to communicate. This helps caretakers understand the needs in advance and bring water to them.
Lift the hand slightly from the surface, and move it sideways left and right a few times. Again, I have collected 2 minutes of data for model training, and 20 seconds of data for testing.
Idle action is when the patient is sleeping or keeping their hands idle for some time. This data is collected and trained so that it is differentiated from other actions.
The model is trained on other movement data like walking, combing hair, getting a drink, etc. Again this is for differentiation.
Now we have 10 minutes and 20 seconds of data, split into an 84:16 ratio of training and testing data.
In the Create Impulse section inside the Edge Impulse Studio, the window size is set to 4000 ms and the window increase is also set to 4000 ms.
I have selected spectral features as a preprocessing block, and the Generated Features are shown below:
In Model training, I have used sequential dense neural networks, and the learning rate is set to 0.0005 and the training cycle is 100.
After training, the model achieved 100% accuracy, and the F1 score is listed below:
In Model testing, I have used the Testing data that we set aside earlier.
The model achieved 96% accuracy on the testing data. This data was completely new and not used during training. The decrease in model accuracy that we noticed does sometimes happen; in this case it looks like some of the "Random" movements were identified as "Help" actions.
Go to the Deployment section and select Firmware option - Thunderboard Sense 2. This will download the firmware to your system.
Once the firmware file is downloaded, copy the .bin file and paste it into the TB004 drive. This will flash the software onto the Thunderboard Sense 2 board. Once it is flashed, reset the board and connect the 3V battery to it.
To test out a real scenario, download the LightBlue App from the Apple App Store or Google Play Store. This app will be used to communicate with the Thunderboard Sense 2.
Open the App and connect to the Edge Impulse service (Make sure board is powered up).
Change a few settings in the app:
Subscribe to the 2A56 characteristic.
Decode the message as UTF8 (click on HEX in the top right corner in LightBlue to switch).
Connect the wearable to your finger and start performing the different gesture actions.
Enable the "listening" option in the app, as well. You will be notified only when the previous prediction result differs from the current prediction result.
Sample results in the LightBlue app are shown below:
For every action, the predicted result with a timestamp is displayed in the app.
This TinyML-based wearable can be used by patients who can't easily communicate with caretakers for various reasons.
Use machine learning and an Arduino Nano BLE Sense to monitor bed occupancy in hospitals or care facilities.
Created By: Adam Milton-Barker
Public Project Link: https://studio.edgeimpulse.com/public/181529/latest
Hospitals can benefit greatly from the use of modern technologies and automation. As hospitals continue to struggle through lack of staff, they need to explore ways that tasks can be automated to free up their valuable human resources. AI is one technology that can play a huge role in the automation of hospitals, and with platforms such as Edge Impulse and low cost embedded devices, automation can be implemented easily, and at low cost.
Using the built-in sensors of the Arduino Nano 33 BLE Sense and the Edge Impulse platform, beds can be monitored to see if they are occupied or not, meaning that hospital staff can know in real-time whether or not they have room for a patient, and exactly where the free beds are. This project is a proof of concept to show how Edge Impulse can be used to train a custom neural network, which can be deployed to an Arduino Nano 33 BLE Sense.
Arduino Nano 33 BLE Sense Buy
Edge Impulse Visit
Head over to Edge Impulse and create your account or login. Once logged in you will be taken to the project selection/creation page.
Your first step is to create a new project. From the project selection/creation you can create a new project.
Enter a project name, select Developer or Enterprise and click Create new project.
We are going to be creating a project that uses the built in accelerometer, gyroscope and magnetometer sensors, so now we need to select Accelerometer Data as the project type.
You need to install the required dependencies that will allow you to connect your device to the Edge Impulse platform.
This process is documented on the Edge Impulse Website and includes installing:
Once the dependencies are installed, connect your device to your computer and press the RESET button twice to enter into bootloader mode, the yellow LED should now be flashing.
Now download the latest Edge Impulse firmware and unzip it, then double-click on the relevant script for your OS: flash_windows.bat, flash_mac.command, or flash_linux.sh.
Once the firmware has been flashed you should see the output above; hit Enter to close the command prompt/terminal.
Open a new command prompt/terminal, and enter the following command:
If you are already connected to an Edge Impulse project, use the following command:
Follow the instructions to log in to your Edge Impulse account.
Once complete head over to the Devices tab of your project and you should see the connected device.
We are going to create our own dataset, using the built-in sensors on the Arduino Nano 33 BLE Sense. We are going to collect data that will allow us to train a machine learning model that can detect sitting down on a bed, standing up, and idle state.
We will use the Record new data feature on Edge Impulse to record around 35 samples of each class.
Connect your Arduino Nano 33 BLE Sense to the Edge Impulse platform, and connect it to the side of your bed. Once you have this in place, set the label to Vacant and record yourself standing up from the bed around 35 times.
Now you need to do the same for sitting down on the bed, repeat the steps above and change the label to Occupied before recording your samples.
Finally, record the Idle data, change the sample length to 1000 (1 second), and record your data, ensuring no movement is detected during sampling.
As the samples we took for the Vacant and Occupied classes were taken with a sample length of 5 seconds, we will need to trim them down to around 1 second.
To do this, click the three dots at the side of the samples and reduce the length by pulling the bars at the edges of the samples closer together. Make sure you get the correct data in the sample. For Vacant, you should have the part of the sample where the signals go up, and vice versa for the Occupied samples.
We now need to split the data into training and test data. To do so head to the project dashboard and scroll down to the bottom. Click on the Perform train/test split button and follow the instructions.
Now we are going to create our network and train our model. Head to the Create Impulse tab. Next click Add processing block and select Spectral Analysis.
Now click Add learning block and select Classification.
Now click Save impulse.
Head over to the Spectral Features tab and click on the Save parameters button to save the parameters.
If you are not automatically redirected to the Generate features tab, click on the Spectral Features tab and then click on Generate features and finally click on the Generate features button.
Your data should be nicely clustered and there should be as little mixing of the classes as possible. You should inspect the clusters and look for any data that is clustered incorrectly. If you find any data out of place, you can relabel or remove it. If you make any changes click Generate features again.
Now we are going to train our model. Click on the Classifier tab then click Start training.
Once training has completed, you will see the results displayed at the bottom of the page. Here we see that we have 100% accuracy. Let's test our model and see how it works on our test data.
Head over to the Model testing tab where you will see all of the unseen test data available. Click on Classify all and sit back as we test our model.
You will see the output of the testing in the output window, and once testing is complete you will see the results. In our case we can see that we have achieved 100% accuracy on the unseen data.
Before we deploy the software to the Nano 33 BLE Sense, let's test it using the Edge Impulse platform whilst connected to the board. For this to work, make sure your device is currently connected.
Use the Live classification feature to record some samples for classification from the Nano BLE Sense. Your model should correctly identify the class for each sample.
Now we will deploy an Arduino library to our device that will allow us to run the model directly on our Arduino Nano 33 BLE Sense.
Head to the deployment tab and select Arduino Library then scroll to the bottom and click Build.
Note that the EON Compiler is selected by default which will reduce the amount of memory required for our model.
Once the library is built, you will be able to download it to a location of your choice.
Once you have downloaded the library, open the Arduino IDE, click Sketch -> Include Library -> Add .ZIP Library..., navigate to the location of your library, add it, and then restart the IDE.
Open the IDE again and go to File -> Examples, scroll to the bottom of the list, go to Hospital_Bed_Occupancy_Detection_inferencing -> nano_ble33_sense -> nano_ble33_sense_fusion.
Once the script opens you will see:
You need to click on each of the library links and install them before you can compile the program and upload. Once you have done this, upload the script to the board, open up Serial Monitor, and you will see the output from the program.
Now you can test your program by staying idle, sitting down and standing up. Check the output to see how your program is doing. It should correctly identify each action.
We can use the versioning feature to save a copy of the existing network. To do so head over to the Versioning tab and click on the Create first version button.
This will create a snapshot of your existing model that we can come back to at any time.
Here we have shown how the Edge Impulse platform combined with the power of the Arduino Nano 33 BLE Sense can be used to create a simple solution that could help hospitals become more efficient. Further work could include more recognized motions or movements, toggling of LEDs or lights via a pin on the Nano 33, or notification systems leveraging bluetooth to talk to an application or dashboard.
A TinyML model using Edge Impulse and a SiLabs Thunderboard Sense 2 to monitor warehouse package handling with BLE.
Created By: Manivannan Sivan
Public Project Link:
Monitoring fragile objects during shipment requires more time, care, labor effort, and infrastructure support.
In an ideal scenario, shipments would travel from the warehouse directly into the hands of the customer. However, that is not always the case. In between, a shipment / package will be handled many times, and you can’t always expect the people doing the transportation to handle them in a manner that will not lead to damage.
Here are a few challenges with logistics of fragile items:
In a warehouse, storing fragile objects in the wrong position can damage the product.
No real time monitoring of the shipments inside a warehouse about placement of products.
Handling the shipment in the wrong position during transferring to the destination might cause damage to the product.
Computer Vision-based shipment monitoring requires infrastructure to support it. For example, poor lighting conditions might affect the prediction.
I have created a TinyML-powered device which can be attached to a fragile package; it tracks and predicts the position of the package, with the results communicated to a mobile application through BLE.
The predicted result will be one of four categories:
Correct position
Incorrect position - Upside-down
Incorrect position - Tilted forward
Incorrect position - Tilted backward
Connect the Thunderboard Sense 2 board to the system and flash the firmware from this link:
Once it is flashed, run the below command:
edge-impulse-daemon
Now the board is connected to your Edge Impulse account. Then place the hardware in a case, and attach it to the shipment package. In the below picture, you'll notice that the Thunderboard is placed inside the case and mounted on top of the package. Now collect the accelerometer data.
Now it’s time to collect the dataset for each of the categories.
Let's assume the package with the fragile object is oriented correctly. Place the Thunderboard on top of it for data acquisition. Now start moving the package in this position, while data acquisition is running. Collect a dataset of around 2 minutes 30 seconds.
Turn the package over so that it is upside down, with the hardware still on it. Now start moving the package in this position, while data acquisition is running. Collect a dataset of around 2 minutes 30 seconds.
Turn the package on its side, with the hardware still on it, facing forward. Now start moving the package in this position, while data acquisition is running. Collect a dataset of around 2 minutes 30 seconds.
Turn the package on its side, with the hardware still on it, facing backward. Now start moving the package in this position, while data acquisition is running. Collect a dataset of around 2 minutes 30 seconds.
After data collection is completed, go to the Impulse settings and select Raw data as the processing block. A window size of 3000 ms and a window increase of 500 ms work well.
The generated Feature map from the raw data should look similar to this:
In the neural network training, I have used sequential layers with a dense neural network layer and a dropout of 0.1 to avoid overfitting. I have chosen a learning rate of 0.0005 and 200 training cycles.
Upon completion, the training model Accuracy is 100%:
In the Model testing section, I have tested the trained model with the 30-second dataset that we set aside earlier for each category.
The trained model achieved 100% accuracy on the unseen testing data, which confirms the model performs well. The next stage is to test it in a real use-case scenario.
For this, we need to deploy this model directly to the SiLabs Thunderboard Sense 2.
Go to the Deployment section, and select Firmware option "Thunderboard Sense 2". This will generate and download the firmware to your local system.
Once the firmware is downloaded, copy the .bin file and paste it into the TB004 drive attached to your PC. This will flash the software onto the Thunderboard Sense 2 board. Once it is flashed, reset the board and connect the 3V battery to it.
Open the App and connect to the Edge Impulse service (Make sure board is powered on).
Input settings in the application:
Subscribe to the 2A56 characteristic.
Decode the message as UTF-8 (click on HEX in the top right corner in LightBlue to switch).
Place the package in different positions and start moving the package.
The Thunderboard will send its predictions to the app as notifications.
Enable the "Listening" option in the app. You will be notified only when the previous prediction result differs from current prediction result.
For the different positions of a shipment package, the predicted results are shown below.
We have seen that a TinyML-based model using accelerometer input is able to predict the placement / orientation of fragile shipments, to help avoid damage or monitor packages in a warehouse.
Train an ML model that can perform gesture recognition using computer vision on the OpenMV Cam H7.
Created By: Wamiq Raza
Public Project Link:
The vision-based technology of hand gesture recognition is an important part of human-computer interaction (HCI). Technologies such as speech recognition and gesture recognition receive great attention in this field. The problem was originally tackled by the computer vision community by means of images and videos. More recently, the introduction of low-cost consumer depth cameras has opened the way to several different approaches. These new approaches exploit the depth information acquired by these devices, which improves gesture recognition performance.
The literature study gives insight into the many strategies that may be considered and executed to achieve hand gesture recognition. It also assists in comprehending the benefits and drawbacks of the various strategies. The literature review is separated into two parts: the detection module and the camera module.
In the literature, data gloves, hand belts, and cameras have been shown to be the most often utilized techniques for gathering user input. In many research articles, gesture recognition employs input extraction using data gloves, a hand belt equipped with an accelerometer, and Bluetooth to read hand motions. For pre-processing the image, a variety of approaches were used: algorithms and techniques for noise removal, edge identification, and smoothing, followed by several segmentation techniques for boundary extraction, such as separating the foreground from the background. Here, a standard 2D camera was used for gesture recognition. It was once thought that a single camera may not be as effective as stereo or depth-aware cameras, but some companies are challenging this theory. For this reason, using the Edge Impulse [1] framework, a software-based gesture recognition system that can robustly detect hand gestures with a standard 2D camera was built. Additionally, the variety of image-based gesture recognition systems may raise concerns about the technology's practicality for broad application; for example, an algorithm calibrated for one camera may not function with another device or camera. To cope with this challenge, a dataset was created for each class, images for one class were taken from the dataset in [2], and the approach was validated using the FOMO [3] algorithm.
The objective was to develop an integrated architecture that could be implemented on microcontrollers and was able to recognize hand gestures while optimizing the model's performance. As the TinyML platform, we chose an OpenMV [4] microcontroller, which acted as the decision unit. The OpenMV (shown in Figure 1) is a small, low-power microcontroller that enables easy and intuitive implementation of image processing applications. It can be programmed using high-level Python scripts (MicroPython) and is driven by an STM32H743VI Arm Cortex-M7 processor running at 480 MHz, suitable for most machine vision applications. The OpenMV was particularly suitable for our proposed approach due to its low power consumption and simple algorithms that run at 25-50 FPS on QVGA (320x240) resolutions and below. It is equipped with a high-performance camera that we used to collect data for this project.
This project is built from scratch and uses an OpenMV microcontroller as pictured above in Figure 1. The first step consisted of creating the data needed to run the training model. For this step, the OpenMV board with its built-in camera and the OpenMV IDE were used for dataset creation. In total, 30 images were captured of my hand, showing three different gestures, which were then split into three folders, each folder with its unique class name. All the prepared training images were then stored in dataset folders. In addition, images for a fourth class, "superb", were taken from [2] to compare results with the created dataset and testing data, which was not captured through the microcontroller.
HORNS, contains 30 images of hand pointing horn
INDEX, contains 30 images of index finger
TWO, contains 30 images of hand showing two fingers
SUPERB, contains 30 images of superb
To create a dataset using OpenMV IDE, firstly connect OpenMV to your laptop using the USB cable. Click on the connect button in order to connect to the default data acquisition program. Once successfully connected, you can start taking images of the object that will be saved in the defined class folder.
Figure 2 represents the steps to follow to create a dataset directory. Figure 3 represents the dataset folder with each class consisting of images with a unique ID.
Once the dataset is created, all images would be uploaded to Edge Impulse for labeling. Figure 4 represents the Edge Impulse platform on how to upload the data for labelling before it processes. Figure 5 represents the labeled image for class horns.
The images in the dataset are now labeled. To train a model, the FOMO algorithm was then used. FOMO (Faster Objects, More Objects) is a unique machine learning approach that extends object detection to devices with limited processing power: it allows you to count things, locate objects in an image, and track numerous objects in real time while consuming up to 30x less computing power and memory than MobileNet SSD or YOLOv5. Dataset visualization and separability of the classes is presented in Figure 6. Even after rescaling and color conversions, image features have a high dimensionality that prevents suitable visualization. Each image was resized to 48 x 48 pixels, and in addition a data augmentation technique was applied.
The number of epochs is the number of times the entire dataset is passed through the neural network during training. There is no ideal number for this; it depends on the data. The model was run for 60 epochs with a learning rate of 0.001, with the dataset split into training, validation, and testing sets.
After introducing dynamic quantization from 32-bit floating point to 8-bit integer, the resulting optimized model showed a significant reduction in size (75.9 KB). The on-board inference time was reduced to 70 ms and RAM usage was limited to 63.9 KB, with an accuracy after post-training validation of 87.8%. The model's confusion matrix and its performance on a mobile device can be seen in Figure 7.
Figure 8 shows the block diagram for deploying a model on a microcontroller. The red bounding box marks the steps where the model is first trained on the given data, then converted to a .tflite file, and finally deployed to the microcontroller.
In our case, we build firmware using the Edge Impulse platform; Figure 9 shows the steps for OpenMV with red bounding boxes. Impulses can also be deployed as a C++ library, which you can include in your own application to run the impulse locally. The zip folder from Edge Impulse contains three files: a MicroPython script, a label file (.txt), and a .tflite model file. Once we have the .tflite file we can deploy it on our microcontroller, in our case the OpenMV. Copy the .tflite and label files from the folder, paste them onto the OpenMV drive, then open the MicroPython script in the OpenMV IDE and start inference. For further details on OpenMV development, refer to [5].
To test the model, images of hand gestures were set aside during the processing steps. Through live testing on the Edge Impulse website, the model takes an input image and predicts the class it belongs to. Before passing the image, we need to ensure that we are using the same dimensions as during the training phase; here, the image is by default the same dimension. Figure 10 shows the results of live testing on different classes. Figures 11 and 12 show on-device testing results for two different classes.
In this project, we have built a gesture recognition model based on the FOMO algorithm. The results show that the accuracy of the proposed algorithm on a TinyML device is up to 87.8%. However, the proposed method's effectiveness is limited because the gesture dataset is small. With more data and more classes, the accuracy of the recognition and detection steps can be improved.
A. Memo, L. Minto, P. Zanuttigh, "Exploiting Silhouette Descriptors and Synthetic Data for Hand Gesture Recognition", STAG: Smart Tools & Apps for Graphics, 2015.
For hardware, I have used the SiLabs Thunderboard Sense 2, with Edge Impulse used to train the model and deploy it to the board.
Back in the Studio, move approximately 30 seconds of data from each category to Test data. The ratio should be split about 80:20.
To test it out in a real scenario, download the app from the iOS App Store or Google Play Store. This app will be used to communicate with the Thunderboard Sense 2.
A TinyML project using an Arduino Portenta H7 and Edge Impulse to start a baby swing when crying is heard.
Created By: Manivannan Sivan
Public Project Link: https://studio.edgeimpulse.com/public/134216/latest
On a typical baby swing rocker, only a manual ON/OFF switch or a swing timer function can be chosen. But when a baby starts to cry in the middle of the night, the baby swing rocker is normally in the "Off" position, so it cannot calm the baby.
I have automated the baby swing movement with a TinyML model. An Arduino is running audio inferencing and will classify the baby crying sound from room noise and activate the baby swing rocker motor, so that it can calm the baby and help them fall asleep again.
The above block diagram explains the overall architecture of the project. The TinyML model is trained in Edge Impulse, and then deployed back to an Arduino Portenta H7.
I have collected baby crying sounds and background room noise for TinyML model training, trained the model with enough data to provide good accuracy, and finally integrated the system with the swing hardware.
Arduino Portenta H7
Portenta Vision Shield
Baby swing rocker
For data acquisition, I have collected real baby crying sounds and room noise. I have used an Arduino Portenta H7 with the Vision Shield to collect audio samples. The Arduino Portenta Vision Shield is used because it contains two MP34DT05 microphones which run on 16 MHz.
For initial setup of the Portenta, follow the steps mentioned in this link
Once the Portenta firmware is flashed to the Arduino Portenta H7 hardware, then open the command window in your system and run the below command:
edge-impulse-daemon
The Portenta is now connected to Edge Impulse. I have collected a dataset of real baby crying sounds and some normal room noise.
I have collected a dataset of around 2 minutes and 41 seconds. This dataset is used for the model training.
Inside your Edge Impulse account, in the Create Impulse section, the window size is set to 2500 ms and the window increase to 500 ms. I have configured Spectrogram as the preprocessing block.
I have used a sequential neural network. The 2-dimensional input data is reshaped into 1 dimension using a reshape layer, then a 1-dimensional convolution layer with max pooling is used.
The below diagram demonstrates the 1D convolution layer and max pooling filtering.
The 1D max pooling block moves a pool (window) of a set size over the incoming data with a set stride, computing the maximum in each specific window. The below diagram demonstrates the max pooling technique in 1D input data.
In our network, the max pooling is configured as MaxPooling1D(pool_size=2, strides=2, padding='same'). This means the pool size is 2, so it takes 2 values at a time and outputs the maximum of them, and the stride length is 2, so the pool window moves 2 steps at a time along the input.
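To make the pooling arithmetic concrete, here is a tiny standalone example (not part of the project code) that applies exactly this configuration to a toy signal of 8 values:

```python
import numpy as np
import tensorflow as tf

# A toy 1-D signal: batch of 1, 8 time steps, 1 channel
x = np.array([1., 3., 2., 5., 4., 0., 7., 6.], dtype="float32").reshape(1, 8, 1)

pool = tf.keras.layers.MaxPooling1D(pool_size=2, strides=2, padding="same")
print(pool(x).numpy().flatten())   # -> [3. 5. 4. 7.], the maximum of each pair
```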
I have configured the training cycles as 100 with a learning rate of 0.005, and achieved good accuracy.
Upon completion of training, we can see that we have achieved 98.2% accuracy.
After the model is trained with good accuracy, I have tested with new data. I have used two datasets for each category (baby cry and room noise) for testing. In testing with unseen data, the model achieved 100% accuracy.
The next step is deployment to the hardware.
Once the testing is complete, go to the "Deployment" option and select Build firmware -> Arduino Portenta H7 to create a downloadable firmware to flash to the board. I have chosen Quantized (Int8). In Edge Impulse, there is also an option to use the EON Compiler, which reduces resource usage and lowers latency.
Once the build process is completed, the firmware will be packaged in a Zip file and downloaded.
Press the Reset button on the Portenta H7 twice to set it to "Flash mode" and then open the .bat file (if you are using Windows) or run the Mac version if that is your platform, to flash the firmware.
Once the flash is completed, open a new terminal window and run the below command to start inference on the device: edge-impulse-run-impulse
The above step will tell us whether the model is able to run smoothly on real hardware. After this step, now comes the real challenge! We need to integrate the Portenta H7 into the baby swing rocker!
For this, we need to deploy this model as source code and add our application on top of it.
So, we click the Create Library section in the Studio and select Arduino, then download the source code.
Open the Arduino IDE and select Sketch -> Include library and Add .Zip Library. Then select the downloaded Zip file on your machine.
After including the library, go to Examples and select portenta_h7_microphone_continuous.
I have written the application code on top of the default code in the example.
In the application code, I wrote the logic to activate a relay which is connected to the motor in the baby swing rocker. The below flowchart explains the logic of the application code.
Also, I have added my application code at the GitHub link below, which you can directly copy and paste into your Arduino IDE.
https://github.com/Manivannan-maker/smartbabyswingrocker
The application code will activate the baby swing rocker for 20 seconds, whenever it detects the baby crying sound.
The Arduino Portenta is connected to the 5v DC Relay module. The Common pin in the relay is connected to the Gnd of the battery and NO pin in the relay is connected to the Gnd of the motor in the baby swing rocker whereas the Vcc of the motor is connected directly to the Battery positive terminal.
After the wiring is complete, you can test it in real time. I played the baby crying sound on my phone and after a few seconds, the baby swing rocker started to swing. The YouTube video embedded above demonstrates the working demo of this prototype.
A primer on using a Transformer-based model on a low-powered, resource-constrained microcontroller-based wearable that detects falls.
Created By: Naveen Kumar
Public Project Link:
Falls are a major health concern for older people. The number of fall-related deaths has increased significantly in recent years, and around 80% of those involved are age 65 or older. Falls can result in physical and psychological trauma, especially for the elderly. To improve the quality of life of our seniors, this project presents the development of a fall-detection wearable device. The main aim of this project is to showcase a working demo of an edge AI device that uses a Transformer-based model.
A Transformer is a deep learning model that adopts the mechanism of self-attention, differentially weighting the significance of each part of the input data. Self-attention, sometimes called intra-attention is an attention mechanism relating different positions of a single sequence to compute a representation of the sequence. Like recurrent neural networks (RNNs), transformers are designed to process sequential input data with applications for tasks such as translation and text summarization. However, unlike RNNs, transformers process the entire input all at once. ChatGPT, a large language model, also uses Transformer blocks in its architecture. In this project, the Transformer model is applied to time-series data instead of natural language.
This project requires a low-powered yet capable MCU to run a Transformer model with a reasonable inferencing rate. The Arduino Giga R1 is a good fit for our purpose since it has a powerful MCU with plenty of memory. Also, we will be using the SeeedStudio Grove 3-axis accelerometer (ADXL345) and a proto-board shield to connect the accelerometer firmly to the development board.
Collecting data for different kinds of activities of daily living (ADL) and falls is a time-consuming and laborious task. It needs many people from different age groups and requires a lot of man-hours to curate the datasets. Fortunately, there are many high-quality public datasets available for similar kinds of data. We have used the SisFall: A Fall and Movement Dataset, which is a dataset of falls and ADL acquired with an accelerometer. The dataset contains 19 types of ADLs and 15 types of falls. It includes acceleration and rotation data from 38 volunteers divided into two groups: 23 adults between 19 and 30 years old, and 15 elderly people between 60 and 75 years old. Data was acquired with three sensors (2 accelerometers and 1 gyroscope) at a frequency sample of 200 Hz. For this project, We are using acceleration data from one of the sensors. Also, I am using the same accelerometer (ADXL345) with the same configuration which was used for data collection. The datasets are available in the raw format and can be downloaded from the link given in the paper below.
A sample of the data is shown below. Only the first 3 columns are used, which are 3-axis accelerometer data from the ADXL345 sensor.
Each 3-axis accelerometer data (x, y, z) are converted to gravity using the following conversion equation.
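The equation itself is a simple scaling of the raw counts by the sensor range and resolution. A small sketch of that conversion, assuming the ADXL345 settings used in the SisFall dataset (a range of +/-16 g and 13-bit resolution), could look like this:

```python
RANGE_G = 16.0        # ADXL345 range used in SisFall: +/-16 g
RESOLUTION_BITS = 13  # 13-bit resolution

def raw_to_g(ad_value):
    """Convert a raw ADXL345 count to acceleration in g."""
    return (2.0 * RANGE_G / 2 ** RESOLUTION_BITS) * ad_value

def raw_to_ms2(ad_value):
    """Convert a raw count to m/s^2, the unit used when uploading to the Studio."""
    return raw_to_g(ad_value) * 9.80665

print(raw_to_ms2(256))   # 256 counts ~= 1 g ~= 9.81 m/s^2
```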
We need to create a new project to upload data to Edge Impulse Studio.
Also, we need the API and HMAC keys for the Edge Impulse Studio project to generate signatures for the data acquisition format. We can copy the keys from the Dashboard > Keys [tab] in the Edge Impulse Studio dashboard.
The accelerometer data is divided into two classes, ADL and FALL, and are converted to m/s^2 before uploading to the Edge Impulse Studio. Below is the Python script that converts the raw accelerometer data into the data acquisition JSON format required by the Edge Impulse Studio.
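The full script is not reproduced here, but a condensed sketch of such a converter is shown below. It assumes the SisFall text files hold the ADXL345 x, y, z counts in the first three comma-separated columns, downsamples 200 Hz to 50 Hz by keeping every fourth row, and writes the documented Edge Impulse data-acquisition JSON format signed with the project's HMAC key. The file names and the key value are placeholders.

```python
import hashlib, hmac, json, time

HMAC_KEY = "your-project-hmac-key"   # from Dashboard > Keys

def convert(in_path, out_path):
    values = []
    with open(in_path) as f:
        for i, line in enumerate(f):
            if not line.strip():
                continue
            if i % 4:                               # keep every 4th row: 200 Hz -> 50 Hz
                continue
            x, y, z = [float(v) for v in line.replace(";", "").split(",")[:3]]
            scale = (2.0 * 16 / 2 ** 13) * 9.80665  # ADXL345 counts -> m/s^2
            values.append([round(v * scale, 4) for v in (x, y, z)])

    data = {
        "protected": {"ver": "v1", "alg": "HS256", "iat": int(time.time())},
        "signature": "0" * 64,                      # placeholder, replaced below
        "payload": {
            "device_name": "ADXL345",
            "device_type": "ADXL345",
            "interval_ms": 20,                      # 50 Hz
            "sensors": [{"name": n, "units": "m/s2"} for n in ("accX", "accY", "accZ")],
            "values": values,
        },
    }
    encoded = json.dumps(data)
    data["signature"] = hmac.new(HMAC_KEY.encode(), encoded.encode(), hashlib.sha256).hexdigest()
    with open(out_path, "w") as f:
        json.dump(data, f)

# The label ("FALL" or "ADL") is encoded in the output file name prefix, e.g.:
# convert("SisFall_dataset/SA07/F10_SA07_R01.txt", "FALL.F10_SA07_R01.json")
```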
To execute the script above, save it to format.py and run the commands below. It is assumed that the SisFall dataset has been downloaded to the SisFall_dataset directory.
The converted data acquisition JSON is shown below. The sample rate is reduced to 50 Hz, which is sufficient to predict fall events and also helps in reducing the model size; therefore interval_ms is set to 20 (ms).
The JSON files are prefixed with the label name (e.g. FALL.F10_SA07_R01.json) by the script above so that the label name is inferred automatically by the CLI. The command below is used to upload all JSON files to training datasets.
We could have used --category split to automatically split the data into training and testing datasets, but since the samples still need to be segmented, everything is uploaded to the training category for convenience. We can see the uploaded datasets on the Edge Impulse Studio's Data Acquisition page.
The uploaded FALL event data have mixed motion events before and after the fall event which are removed by splitting the segments. The ADL category data are used without any modifications.
We can do a split by selecting each sample and clicking on the Split sample from the drop-down menu, but it is time-consuming and tedious work. Fortunately, there is an Edge Impulse SDK API that can be used to automate the whole process. After some experimentation, we have chosen a 4000 ms segment length which is the optimal length for detecting falls.
To execute the script above, save it to a segments.py file and run the command below.
After segmenting the dataset we can split it into training and testing sets by clicking the Perform train / test split button on the Edge Impulse Studio dashboard.
Go to the Impulse Design > Create Impulse page, click Add a processing block, and then choose Raw Data, which uses the data without pre-processing and relies on deep learning to learn features. Also, on the same page, click Add a learning block, and choose Classification, which learns patterns from data and can apply these to new data. We have chosen a 4000ms Window size and a 4000ms Window increase, which means we are using a single frame. Now click on the Save Impulse button.
Next, go to the Impulse Design > Raw Data page and click the Save parameters button.
Clicking on the Save parameters button redirects to another page where we should click on the Generate Feature button. It usually takes a couple of minutes to complete Feature generation.
In the case of the Raw data processing block, feature generation does not change the data; it only divides it into the given window size. In the image below, we can see that the Raw features and the Processed features are the same.
We can see the complete view of all data on the Dashboard > Data Explorer page.
To define the neural network architecture, go to the Impulse Design > Classifier page and click on the Switch to Keras(expert) mode as shown below.
The 4000 ms of 3-axis accelerometer raw time-series data is fed into the Input layer. We have added a Normalization layer with pre-calculated mean and variance for each channel from the training datasets. The Transformer model is capable of learning features from the raw time-series data while training.
Below is the final model summary.
The complete training code is given below.
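The project's exact code is not reproduced here; the sketch below gives a condensed idea of what such a Transformer classifier can look like in Edge Impulse's Keras (expert) mode. The layer sizes, normalization statistics, epoch count, and the two-class setup are illustrative assumptions, and in the Studio the training call would use the template-provided train_dataset, validation_dataset, and callbacks variables rather than the commented line shown.

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

TIME_STEPS, CHANNELS = 200, 3          # 4000 ms window at 50 Hz, 3-axis accelerometer
input_length = TIME_STEPS * CHANNELS   # flattened raw features
classes = 2                            # ADL and FALL

def transformer_encoder(x_in, head_size, num_heads, ff_dim, dropout=0.1):
    # Self-attention sub-block with residual connection
    x = layers.LayerNormalization(epsilon=1e-6)(x_in)
    x = layers.MultiHeadAttention(key_dim=head_size, num_heads=num_heads, dropout=dropout)(x, x)
    x = layers.Dropout(dropout)(x)
    res = x + x_in
    # Position-wise feed-forward sub-block with residual connection
    x = layers.LayerNormalization(epsilon=1e-6)(res)
    x = layers.Conv1D(filters=ff_dim, kernel_size=1, activation="relu")(x)
    x = layers.Dropout(dropout)(x)
    x = layers.Conv1D(filters=res.shape[-1], kernel_size=1)(x)
    return x + res

inputs = tf.keras.Input(shape=(input_length,))
x = layers.Reshape((TIME_STEPS, CHANNELS))(inputs)
# Per-channel normalization with pre-computed statistics (placeholder numbers)
x = layers.Normalization(mean=[0.0, -9.8, 0.0], variance=[15.0, 15.0, 15.0])(x)
for _ in range(2):
    x = transformer_encoder(x, head_size=64, num_heads=2, ff_dim=64)
x = layers.GlobalAveragePooling1D()(x)
x = layers.Dense(32, activation="relu")(x)
outputs = layers.Dense(classes, activation="softmax")(x)

model = Model(inputs, outputs)
model.compile(optimizer=tf.keras.optimizers.Adam(0.001),
              loss="categorical_crossentropy", metrics=["accuracy"])
model.summary()
# In expert mode the training step would look something like:
# model.fit(train_dataset, validation_data=validation_dataset, epochs=30,
#           callbacks=callbacks, verbose=2)
```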
Now click the Start Training button and wait a few minutes until the training is completed. We can see the training results below. The quantized (int8) model has 96.4% accuracy.
We can test the model on the test datasets by going to the Model testing page and clicking on the Classify All button. The model has 97.32% accuracy on the test datasets, so we are confident that the model should work on new data.
At the Deployment page, we will choose the Create Library > Arduino library option.
For the Select optimizations option, we will choose Enable EON Compiler, which reduces the memory usage of the model. Also, we will opt for the Quantized (Int8) model.
Now click the Build button, and in a few seconds the library bundle will be downloaded to your local computer.
After the board package installation is completed, choose the Arduino Giga R1 from Tools > Board > Arduino Mbed OS Giga boards menu and select the serial port of the connected board from the Tools > Port menu.
Below is the Arduino sketch for inferencing. For continuous motion event detection, the application uses two threads on the MCU's main core: one for inferencing and another for data sampling, so that no events are missed.
To run the inferencing sketch, import the downloaded library bundle using the menu Sketch > Include Library > Add.ZIP Library in the Arduino IDE. Create a new Sketch with the code above and compile/upload the firmware to the connected Arduino Giga R1 board. We can monitor the inferencing output using Tools > Serial Monitor with a baud rate of 115200 bps. The inferencing rate is 142ms which is pretty impressive.
Although the Arduino Giga R1 WiFi has an onboard WiFi and Bluetooth chip which can be used to send out alert notifications, for demo purposes, whenever a fall event is detected the onboard red LED turns on. The device is mounted on a belt and worn at the waist. The accelerometer orientation is kept the same (Y-axis downward and Z-axis coming out from the wearer) as when the training data was collected.
This project presents a proof-of-concept device that is easy to use for elderly people. This project also showcases that a Transformer-based neural network can be used to solve complex problems without any signal processing, and can be run on inexpensive, low-powered, and resource-constrained devices.
Teach your smartwatch to recognize different movements and motions of your watch hand.
Created By: Thomas Vikström
Public Project Link:
In this tutorial you will learn how to get started with Machine Learning on your Bangle.js smartwatch. Specifically you will build and train a model learning to recognize different movements of your watch hand. The steps include how to collect data, how to use Edge Impulse for the machine learning part, and how to finally upload the learned model back to the watch and utilize it there.
[Bangle.js, version 1 or 2](https://shop.espruino.com/banglejs2)
Theoretically the Bangle Emulator might work as well, but you can’t of course collect real accelerometer or heart rate data with an emulator!
Computer with Bluetooth (BLE)
used to split a file with samples into separate .CSV-files for importing into Edge Impulse
not strictly necessary, but very useful if you want to collect lots of samples
Notepad, Notepad++, Excel etc. can also be used to manually split files, not feasible with lots of samples
This part will guide you how to use your watch to collect multiple samples for one gesture type at a time.
Pair your computer with the watch using Espruino Web IDE
the code will create a text file in the watch memory
Name the event you are going to collect samples for by changing the line event="left";
Use e.g. event="left"; for twitching your watch hand left and later on event="right"; for the opposite direction
upload the code to RAM. Do not upload this code to flash or storage; in the worst case you might need to reset the watch completely.
Perform the gesture
repeat the gesture many times, the more the merrier!
wait a second between each
the gesture collecting code will append each sample to the .CSV-file
a graph will also be shown on your watch screen
Repeat steps 3-4 above, remembering to change event="<gesture>"; where <gesture> is the hand movement you will collect
The devil is in the details, so do not e.g. remove the seemingly insignificant semi-colon ;!
This part will guide you how to transfer the .CSV-files from your watch to your computer.
In Espruino Web IDE, click the Storage icon (4 discs) in the middle of the screen
Search for your file/files, they start with the event name you provided in earlier steps e.g. left.1.csv (StorageFile)
Click on Save (the floppy disc icon) for one file at a time and save the files to a folder of your choice, e.g. c:\temp
This part will guide you through splitting the .CSV-files you've downloaded from your watch into separate .CSV-files. The reason for this is that Edge Impulse requires one .CSV-file per sample.
Replace the path on the second line (starting with PATENTS = ...) with the full path and filename of the first file you want to split, i.e. the file you downloaded in the previous steps.
Run the code in your Python editor
The program will search the original file for the string 'timestamp, x, y, z' and, for each occurrence it finds (one per sample), create a new file.
If you don't use Python, you'll need to split the file for each sample using some other method, manual or automatic; see the sketch after this list for one possibility. Remember that the samples aren't all the same size, so the number of rows will vary.
You should now have several .CSV-files in the folder you chose. The files will be named like left.1.csv (StorageFile)-15.csv, where -15 at the end is a running number.
Repeat steps 2-3 above for each file you downloaded from your watch.
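If you'd rather not use Python at all, the same splitting logic can be implemented in pretty much any language. Below is a hedged sketch of one alternative using Node.js; the input path is just a placeholder for whichever file you downloaded from the watch, and it reproduces the behaviour described above (one output file per 'timestamp, x, y, z' header, with a running number appended to the name).

```
// Node.js alternative to the Python splitter (illustrative sketch)
const fs = require("fs");

const INPUT = "c:/temp/left.1.csv (StorageFile)"; // placeholder: the file you downloaded
const HEADER = "timestamp, x, y, z";

// everything before the first header is discarded; each remaining chunk is one sample
const samples = fs.readFileSync(INPUT, "utf8").split(HEADER).slice(1);

samples.forEach((body, i) => {
  const out = INPUT + "-" + i + ".csv";
  fs.writeFileSync(out, HEADER + body.trimEnd() + "\n");
  console.log("wrote", out);
});
```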
Create a new project and give it a name, why not Bangle.js
Select Accelerometer data when asked for the type of data you are dealing with.
Click Let's get started
Select Data acquisition from the left hand menu
Click on the icon labeled Upload existing data
Click on Choose files
Navigate to the folder you used to store the .CSV-files (e.g. c:\temp)
Select all the sample files that were created earlier, but not the original files you downloaded from your watch, i.e. select only the .CSV-files with a number at the end of the file name, e.g. left.1.csv (StorageFile)-0.csv.
You can also upload smaller batches at a time
Automatically split between training and testing and Infer from filename should both be selected
Click Begin upload - this will now quickly upload the files to your project.
The upload process is shown on the right side; if everything goes well, you should at the end see a message like this: Done. Files uploaded successful: 85. Files that failed to upload: 0. Job completed
Take a look at a sample by selecting any row
Notice that the labels (left and right in this example) were automatically inferred from the filenames you used.
Always strive to get a roughly similar amount of samples for each gesture. You can see the balance in the pie graph on the left.
Also notice that Edge Impulse split the sample files so that approximately 80 % will be used for training and 20 % for testing purposes.
Through the four small icons you can filter your data, select multiple items, upload more data or see a slightly more detailed list view. With the help of these you can e.g. mass delete many files at a time.
An impulse takes raw data, uses signal processing to extract features, and then uses a learning block to classify new data. These steps will create an impulse.
Click Create impulse
Change the Window size and Window increase settings according to the screenshot below.
Add the Raw Data processing block
Add the Classification (Keras) learning block
Click Save Impulse
Note that you often need to tweak one or several of the settings, depending on what you want to achieve and on the quality and quantity of your data.
Click Raw data from the left hand menu
You will see a graph of one of the samples as well as the raw features.
In this case you don't need to change anything, so click Save parameters, which will take you to the second tab.
Click Generate features
This processes the samples
After a while you will see a graph in the Feature explorer. This gives you a 3D view of how well your data can be clustered into different groups. In an ideal situation all similar samples should be clustered into the same group with a clear distinction between groups. If that's not the case, no worries at this point: the neural network algorithm will in many cases still be able to do a very good job!
Here you will train the neural network and analyse its performance.
Click NN Classifier from the left hand menu
Change the Number of training cycles to 100. This is another parameter to tweak: the higher the number, the longer the training will take, but also the better the network will perform, at least until it can't improve any further.
Click on Start training
Within a few minutes, depending on the number of labels and data quantity you have, the training will finish.
The graph shows the training performance and accuracy. While 100 % looks like a perfect score, it isn't necessarily so: the network might still perform poorly in real situations when confronted with sample data it hasn't seen before.
Here you will download the trained model to your computer.
Click Dashboard from the left hand menu
Scroll down to the section Download block output and click on the icon next to NN Classifier model TensorFlow Lite (int8 quantized)
The float32 model might sometimes perform slightly better than the int8 model, but it requires more memory and might cause Bangle.js to crash because of this.
Save the file to a folder of your choice
Transfer the trained model to Bangle.js from your computer
This part will guide you through transferring the model file from your computer to Bangle.js.
In Espruino Web IDE, click the Storage icon (4 discs) in the middle of the screen
Click Upload a file
Select the model file you downloaded from Edge Impulse
Change the filename to .tfmodel and click Ok
Create a text file, e.g. with Notepad
Write the event names in alphabetical order, separated by commas, e.g. left,right
Save the file to a folder of your choice
In Espruino Web IDE, click the Storage icon (4 discs) in the middle of the screen
Select the file you just created
Change the filename to .tfnames and click Ok
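Alternatively, you can create the names file directly from the left-hand console of the Espruino Web IDE instead of uploading a text file. A one-liner like the following (assuming your two events are left and right) writes it straight to the watch storage:

```
// assumes the labels are "left" and "right", listed in alphabetical order
require("Storage").write(".tfnames", "left,right");
```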
Finally you will be able to test how well the trained model performs in real life! Just a few steps left.
Paste the below code into the right side in Espruino Web IDE
Upload the code to RAM
This short program will trigger your watch to sense movements and try to recognise which movement it was.
The recognised movement, e.g. left or right, will be shown in the left window in Espruino Web IDE as well as on your watch display.
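If you just want to experiment before pasting the full program, a minimal recognition snippet could look like the sketch below. It is illustrative only, and relies on the fact that once a .tfmodel file (and optionally a .tfnames file) is present in storage, Bangle.js emits an aiGesture event with the decoded label whenever a gesture is detected.

```
// Illustrative sketch - with .tfmodel and .tfnames in storage, Bangle.js
// fires 'aiGesture' with the decoded label whenever a gesture is detected
Bangle.on('aiGesture', function (gesture, raw) {
  print("Detected gesture:", gesture); // e.g. "left" or "right", shown in the Web IDE console
  g.clear();
  g.setFont("6x8", 3);
  g.drawString(gesture, 20, 40);
});
```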
Hopefully this short tutorial helped you successfully train and recognize gesture events on your Bangle.js. Hopefully it also inspires you to try to improve the performance, e.g. by collecting more samples, adding more event types, or tweaking the different parameters and settings in Edge Impulse.
We will be using Edge Impulse Studio for model creation and training. You'll need to sign up for a free account at and create a project to get started.
The data is uploaded using the Edge Impulse CLI. Please follow the instructions to install the CLI here: .
The key building block of a Transformer model is the Keras layer. As part of a recent release the Edge Impulse SDK now supports this layer. The Transformer based models are usually large models. The Arduino Giga R1 WiFi has 1 MB RAM divided into 2 cores (M7/M4). The main core (M7) has 512 KB RAM. To fit the model into the available memory with other overheads we needed to slim down the architecture by defining 1 transformer block with 2 attention heads (size = 64). Also, reducing the dimension (units) of the penultimate Dense layer helps in keeping the model size within the limits. The aforementioned hyperparameters have been chosen after many training trials and keeping the optimal model size, without losing much accuracy.
Please follow the instructions to download and install the Arduino IDE. After installation, open the Arduino IDE and install the board package for the Arduino Giga R1 WiFi by going to Tools > Board > Boards Manager. Search the board package as shown below and install it.
Get the watch up and running by following these … and connected by these
Create an Edge Impulse account for free
for information about how to install or use Python, check e.g.
Install the app Gesture Test
on your watch from the
Paste the below Gesture collection code into the right side in Espruino Web IDE (adapted from )
Copy the below Python code (shamelessly copied from ) into your favourite Python editor.
In this part you will learn how to upload the sample files you've created earlier, create a machine learning model, train and finally analyse it. This tutorial will only cover the essential steps needed for Bangle.js. To learn more about Edge Impulse, see e.g. and .
Log in to , using the credentials for the free account you created in the beginning.
Take an existing accelerometer model built for the Thunderboard Sense 2, and prepare it for use on the SiLabs xG24 board.
Created By: Salman Faris
Public Project: https://studio.edgeimpulse.com/public/188507/latest
In this project I'm going to walk through how to port an existing project, developed on the SiLabs Thunderboard Sense 2, to SiLabs' newer and more powerful xG24 development board.
The original project was developed by Manivannan Sivan to detect correct / incorrect posture of manufacturing workers using a wearable belt.
I will walk you through how you can clone his Public Edge Impulse project, deploy to a SiLabs Thunderboard Sense 2, test it out, and then build and deploy to the newer SiLabs xG24 device instead.
You can find more about the project here in the original project documentation, Worker Safety Posture Detection.
The project is intended to help workers in manufacturing. They work in conditions that can put a lot of stress on their bodies. Depending on the worker's role in the production process, they might experience issues related to cramped working conditions, heavy lifting, or repetitive stress.
Poor posture can cause problems for the health of those who work in manufacturing. Along with that, research suggests that making efforts to improve posture among manufacturing employees can lead to significant increases in production. Workers can improve their posture by physical therapy, or simply by being more mindful during their work day.
Manivannan Sivan created a wearable device using a SiLabs Thunderboard Sense 2 which can be fitted to a worker's waist. The worker can go about their normal activities, and the TinyML model running on the hardware will predict the posture and communicate with the worker over BLE. The worker can get notified in the LightBlue app on their phone or smartwatch.
Before porting, we need to run the project on the existing platform to understand how it runs and familiarize ourselves with its parameters. So let's get started.
Before you proceed further, there are a few other software packages you need to install.
Edge Impulse CLI - Follow this link to install necessary tooling to interact with the Edge Impulse Studio.
LightBlue - This is a mobile application. Install from either Apple Store or Android / Google Play. This will be required to connect the board wirelessly over Bluetooth. The Android version can be found here: https://play.google.com/store/apps/details?id=com.punchthrough.lightblueexplorer&hl=en_IN&gl=US&pli=1. Apple / iOS users can download the App here: https://apps.apple.com/us/app/lightblue/id557428110.
Go to the Edge Impulse project page using the link here, and clone it.
Click Clone on the right corner button to create a copy of the project.
Provide a project name in the text field, and click on the Clone project button.
Done, the project is successfully cloned into your Edge Impulse account:
As we clone the project, it will be loaded with the dataset collected by Manivannan.
We can try to deploy the project on a Thunderboard Sense 2:
Connect the development board to your computer: Use a micro-USB cable to connect the development board to your computer. The development board should mount as a USB mass-storage device (like a USB flash drive), with the name TB004. Make sure you can see this drive.
Update the firmware: The development board does not come with the right firmware yet. To update the firmware:
2.1 Download the latest Edge Impulse firmware.
2.2 Drag the silabs-thunderboard-sense2.bin file to the TB004 drive.
2.3 Wait 30 seconds.
Next, open the CLI and run edge-impulse-daemon
From here, log in with your Edge Impulse credentials and choose the cloned project from the project list that follows.
Then provide a name for the device that is connected to the computer.
After completing these steps, you will see that the device is connected to the Edge Impulse Studio via the CLI.
Choose "Deployment" from the left toolbar and then choose the target device as "SiLabs Thunderboard Sense 2", and click "Build" to start the process.
After the build completes, a .bin file will be automatically generated and downloaded. You need to drag and drop it onto the Thunderboard Sense 2 device drive.
Done, we can now open the LightBlue mobile app to run and see the inference:
Alternatively you can run it on a computer, if you don't have access to a phone. Run the command below to see if the tinyML model is inferencing.
edge-impulse-run-impulse
Nice, so far we have cloned and deployed the project. Now we are going to port it to the new board!
SiLabs has launched the new EFR32MG24, also known as the xG24, a family of wireless SoCs full of interesting sensors and features. This makes them very well suited for mesh IoT wireless connectivity using the Matter, OpenThread, and Zigbee protocols in smart home, lighting, and building automation products, or any other use case that fits this combination of sensors and connectivity.
The sensors present onboard are an accelerometer, a microphone, environmental sensors comprising temperature, humidity, and air pressure, a Hall sensor, an inertial and an interactional sensor.
Compared to the Thunderboard Sense 2, the xG24 requires a few different steps for connecting to Edge Impulse and uploading the firmware.
First, download the base firmware image: download the latest Edge Impulse firmware and unzip it to obtain the firmware-xg24.hex file, which we will be using in the following steps.
Connect the xG24 Dev Kit to your computer - Use a micro-USB cable to connect the xG24 Dev Kit to your development computer, and download and install Simplicity Commander.
Load the base firmware image with Simplicity Commander - You can use Simplicity Commander to flash your xG24 Dev Kit with the Edge Impulse base firmware image. To do this, first select your board from the dropdown list on the top left corner:
Then go to the "Flash" section on the left sidebar, and select the base firmware image file you downloaded in the first step above (i.e., the file named firmware-xg24.hex). You can now press the Flash button to load the base firmware image onto the xG24 Dev Kit.
After this, we can follow the usual method to select the device in the Edge Impulse Studio via the CLI.
Next, open the CLI and run edge-impulse-daemon
From here, log in with your Edge Impulse credentials and choose the cloned project from the project list that follows.
Then provide a name for the device that is connected to the computer.
After doing these steps, you can see the device is connected to the Edge Impulse Studio via the CLI.
I collected a bit of additional data using the xG24, and it looks good.
Now, we need to retrain the model on the dataset, but before that we need to set the target as the SiLabs EFR32MG24.
Great, we got nice accuracy in model training.
Now, we can build the model and deploy to the xG24. For the build, we need to choose the correct device first, then click "Build".
After the build generates the firmware image file, we need to use Simplicity Commander to flash your xG24 Dev Kit with this firmware. To do this, first select your board from the dropdown list on the top left corner:
Then go to the "Flash" section on the left sidebar, and select the generated firmware image file you downloaded after the build process. You can now press the Flash button to load the generated .hex firmware image onto the xG24 Dev Kit.
Next, we can use the LightBlue mobile app to run and see the inference.
Alternatively, we can run it on a computer as we did for the Thunderboard Sense 2, if you don't have access to a phone. Run the command below to see if the tinyML model is inferencing.
edge-impulse-run-impulse
Awesome, we have now successfully ported a project from Thunderboard Sense 2 to the xG24 Dev Kit!
We can see here that the xG24 performs faster classification on these tinyML datasets without compromising accuracy.
Here you can see the comparison data: a 91.1765% increase in inferencing speed in the NN Classifier, while RAM and flash usage remain the same.
Similar results are achieved in the field when inferencing on the live data stream. Here we can see a 92.3077% increase in classification speed, which is even more than what was calculated in the model optimization.
To conclude this porting project, we can confirm that it's worth upgrading products and projects from the Thunderboard Sense 2 to the new and more efficient SiLabs xG24 Dev Kit.