Accelerometer and Activity Projects

Arduin-Row, a TinyML Rowing Machine Coach - Arduino Nicla Sense ME

Use an Arduino Nicla Sense ME and Edge Impulse model to determine your rowing cadence and provide feedback via the IoT Remote app.


Last updated 1 year ago


Created By: Justin Lutz

Public Project Link: https://studio.edgeimpulse.com/public/90788/latest

Project Demo

GitHub Repo

Story

The sport of rowing, even on a rowing machine, is a technical one. Most people can hop on a rowing machine and start rowing, but they may not be generating as much power as possible. Many think that rowing involves pulling the handle with your hands, arms, and back, but to generate the most power you actually have to push with your legs. That is how you achieve the fastest times on the rowing machine.

However, knowing whether you are doing it correctly often requires a one-on-one session with a coach who can evaluate your form and offer suggestions.

Using the power of accelerometer data on an Arduino board and Edge Impulse, I was able to make a virtual "Rowing Coach" that, based on the rower's tempo (and acceleration from the start of the stroke), can offer feedback. It can also offer feedback based on how the rowing handle moves through the stroke. This is to ensure the rower is keeping the handle level, and isn't raising or lowering the handle too much during the stroke, which can waste energy and reduce power. This feedback is offered through a chat-based feature of the Arduino IoT Remote app. Since I am using the Nicla Sense ME board, I am also reading and plotting estimated CO2 (eCO2) values to show how the CO2 levels in the air change while you work out.

/*
  Edge Impulse Data Forwarder - Nicla Sense ME data.
  Streams accelerometer readings over serial, one sample per line,
  tab-separated, for the Edge Impulse data forwarder.
*/
#include "Arduino_BHY2.h"
#include "Nicla_System.h"

#define CONVERT_G_TO_MS2    9.80665f
#define FREQUENCY_HZ        100
#define INTERVAL_MS         (1000 / (FREQUENCY_HZ + 1))

static unsigned long last_interval_ms = 0;

SensorXYZ accelerometerRaw(SENSOR_ID_ACC_RAW);

void setup() {
  Serial.begin(115200);

  BHY2.begin();
  accelerometerRaw.begin();

  delay(2000);
}

void loop() {
  short accX, accY, accZ;

  BHY2.update();

  if (millis() > last_interval_ms + INTERVAL_MS) {
    last_interval_ms = millis();

    accX = accelerometerRaw.x();
    accY = accelerometerRaw.y();
    accZ = accelerometerRaw.z();

    // scaled X, Y, Z values, tab-separated, one sample per line
    Serial.print(accX * CONVERT_G_TO_MS2);
    Serial.print('\t');
    Serial.print(accY * CONVERT_G_TO_MS2);
    Serial.print('\t');
    Serial.println(accZ * CONVERT_G_TO_MS2);
  }
}

So, a quick word of caution here: I started this project intending to use just the Nicla Sense ME and develop a BLE app using MIT App Inventor. However, I found that, with the Nicla's 64 kB RAM limitation, I was running out of memory running my 20 kB model (I believe additional packages loaded into RAM on the Nicla reduce the available memory). Using the Nicla Sense ME as a shield on the MKR WiFi 1010, I was able to run my model without memory issues. BUUUT, when used as a shield, the accelerometer frequency maxes out at 10 Hz, so I had to downsample all of the data I had collected on the standalone Nicla Sense ME from 100 Hz to 10 Hz (and also ensure that the orientation of the Nicla remained the same). This was frankly a nightmare that took me a while to figure out.

I collected about 18 minutes of data for three states: easy, low strokes per minute (low-spm), and high strokes per minute (hi-spm), divided between training and test data:

Once the data was collected, I set up my impulse:

I downsampled the collected data to 10 Hz to match the output of the Nicla Sense ME as a shield. I kept the window at 2000 ms and didn't change the window step size. On the Spectral Features tab, I changed Scale Axes to 0.001, as I seemed to get better results with that (it was also recommended by the Edge Impulse team on a prior project).

Next, it was on to training the model:

With the data that I had, there was pretty good clustering between the classes. Once I had the model trained, I went to the Anomaly Detection tab and selected the X-axis to determine whether the rowing handle remains level throughout the stroke.

I then added the zipped Arduino library to my application code in the Arduino IDE by going to Sketch > Include Library > Add .ZIP Library...

That being said, I really like the ease of use of the IoT Cloud interface. I defined a couple of variables, air_quality and inference, and the default sketch was auto-populated. In the Dashboards tab of the IoT Cloud, I created a chat-like interface that I called "Coach's Orders". This gives you feedback based on what your rowing stroke indicates. I also created a graph that shows the CO2 levels being read from the Nicla, to show how working out affects the CO2 levels in the room. If you are working hard in a confined space and the CO2 levels rise to a dangerous level, you might want to get some ventilation or take a break.

Once I had the Dashboard set up and the variables defined, it was really just a matter of adding in the Edge Impulse inference logic to my Arduino sketch. Here is the main loop; the full code base can be seen in the code section.

void loop() {
  ArduinoCloud.update();

  BHY2Host.update();

  if (millis() > last_interval_ms + INTERVAL_MS) {
    last_interval_ms = millis();

    // get CO2 readings (air quality)
    air_quality = bsec.co2_eq();
    Serial.println(String("CO2 reading: ") + String(air_quality));

    // get accel data
    short accX, accY, accZ;
    accX = accel.x();
    accY = accel.y();
    accZ = accel.z();

    // fill the features buffer
    features[feature_ix++] = accX * CONVERT_G_TO_MS2;
    features[feature_ix++] = accY * CONVERT_G_TO_MS2;
    features[feature_ix++] = accZ * CONVERT_G_TO_MS2;

    // features buffer full? then classify!
    if (feature_ix == EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE) {
      ei_impulse_result_t result;

      // create signal from features frame
      signal_t signal;
      numpy::signal_from_buffer(features, EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE, &signal);

      // run classifier
      EI_IMPULSE_ERROR res = run_classifier(&signal, &result, false);
      ei_printf("run_classifier returned: %d\n", res);
      if (res != 0) {
        feature_ix = 0;
        return;
      }

      // print timing
      ei_printf("(DSP: %d ms., Classification: %d ms., Anomaly: %d ms.)",
        result.timing.dsp, result.timing.classification, result.timing.anomaly);

      // find the index of the highest-scoring label
      int maxLabel = 0;
      float maxValue = 0.0;
      for (size_t ix = 0; ix < EI_CLASSIFIER_LABEL_COUNT; ix++) {
        if (result.classification[ix].value > maxValue) {
          maxLabel = ix;
          maxValue = result.classification[ix].value;
        }
      }
      ei_printf("%.2f\n", result.anomaly);

      // map the winning label to a coaching message
      String inf = String(result.classification[maxLabel].label);
      if (inf == "easy") {
        inference = "Push with your legs!";
        Serial.println("Push with your legs!");
      }
      else if (inf == "low-spm") {
        inference = "Good power at low strokes per minute!";
        Serial.println("Good power at low strokes per minute!");
      }
      else if (inf == "hi-spm") {
        inference = "Keep the pace up.  High strokes per minute!";
        Serial.println("Keep the pace up.  High strokes per minute!");
      }

      // flag handle-height anomalies
      if (result.anomaly < -1.0 || result.anomaly > 2.0) {
        inference = inference + "  Keep the handle level!";
        Serial.print("Anomaly triggered!");
      }

      // reset features frame
      feature_ix = 0;
    }
  }
}

Once the coding was complete, I loaded the sketch onto the MKR WiFi 1010. I then put the board into a breadboard and taped it to the rowing handle:

The board can be powered either through USB or via the JST connector with a 3.7V LiPo battery. I then hopped on the rowing machine and varied the pace. This is what I saw on the app:

And a video with an overlay of the app with a little commentary from yours truly is up at the top of this post!

The model inference mapped pretty closely to what I was doing. If I was going light and not putting in much effort, I would get a "Push with your legs!" command. If I started to push harder but keep the tempo low, I would get a "Good power at low strokes per minute!" and if I went all out, it would say "Keep the pace up. High strokes per minute!" If I altered the height of the handle on the pull or on the return, "Keep the handle level!" would be added to the command. You can see that as I picked up the pace the CO2 values rose as well. I would be interested to see what values it reads if I'm rowing for more than just a minute.

This was a good project, and like the others I've done, it had hiccups along the road that I had to overcome. I completely pivoted from a Nicla Sense ME / BLE app solution to using the Nicla as a shield on the MKR WiFi 1010, with the Arduino IoT Cloud app as the final implementation. I spent a lot of hours combing through message boards to figure out why I couldn't run the Edge Impulse model on the Nicla (out of memory), and why my model wasn't working with the Nicla as a shield (when used as a shield, the accelerometer frequency drops from 100 Hz to 10 Hz). Hopefully this project helps you avoid some of the traps I fell into. Happy hacking!

Image source: British Rowing Technique - British Rowing

This project went through multiple variations, but I ultimately settled on using the Nicla Sense ME as a shield on the Arduino MKR WiFi 1010. The Nicla Sense only comes with BLE, so using it as a shield gave me access to the WiFi of the MKR board as well as the Arduino IoT Remote dashboard, which offers a quick, easy, and slick app to link to the board. In order to turn the Nicla Sense into a shield, some soldering is involved; Arduino has a good tutorial on it here.

I used Edge Impulse to generate my TinyML model to predict the type of rowing I was doing, as well as anomaly detection to determine whether the rowing handle placement was correct.

Using a quick Arduino sketch loaded onto the Nicla Sense ME, I used the Edge Impulse command-line interface (CLI) to read the accelerometer data directly into my Edge Impulse project, following their Data Forwarder example. The first sketch above shows how I read the accelerometer data into Edge Impulse from the Nicla.

After that, I deployed the model as an Arduino library. You can find my public project via the link at the top of this article if you want to look at the data.

I started my application sketch in the Arduino Web Editor, since I would be linking my project to the Arduino IoT Remote app. However, the IAQ and eCO2 values won't be read correctly unless you make some changes to the Nicla library (I've documented that here), so I had to export from the web editor to my local Arduino IDE in order to use the edited Arduino_BHY2Host library.

Links:
  • GitHub repo: https://github.com/jlutzwpi/Arduin-Row/tree/main
  • Public Edge Impulse project: https://studio.edgeimpulse.com/public/90788/latest
  • Image source: British Rowing Technique - British Rowing

Image captions:
  • Proper form involves pushing with your legs before pulling with your arms.
  • Data collection
  • The IoT Cloud Dashboard is easy to use and makes it easy to visualize your data.
  • Nicla Sense ME as a shield on the Arduino MKR WiFi 1010, on a breadboard taped to the rowing handle
  • Output of the rowing inference to the Arduino IoT Remote app