Continuous Gait Monitor (Anomaly Detection) - Nordic Thingy:53

A wearable for continuous gait analysis, aiming to detect gait abnormalities indicative of potential medical conditions.

Created By: Samuel Alexander

Public Project Link: https://studio.edgeimpulse.com/public/366723/live

Introduction

Subtle changes in gait can be early indicators of various medical conditions, including neurodegenerative diseases such as Parkinson's and multiple sclerosis, balance disorders, and injuries with far-reaching health consequences. Early detection often relies on subtle changes in how a person walks, such as reduced speed, shuffling steps, or unsteadiness. Unfortunately, current assessments primarily rely on periodic, in-clinic observations by healthcare professionals, potentially missing subtle yet significant changes occurring between visits. Moreover, subjective self-assessments of gait are often unreliable. This lack of continuous, objective monitoring hinders timely diagnoses, limits the effectiveness of treatment plans, and makes it difficult to track the progression of gait-related conditions. A proactive, data-driven solution is needed to ensure individuals and their healthcare providers have the information necessary for informed decision-making.

Image Credit: Can Tunca, "Human Gait Cycle", 2017, via mdpi.com

Project Overview

This project aims to develop a wearable device for early gait disorder detection. We'll begin by collecting data representing normal gait patterns during walking, running, and standing. Next, we'll extract relevant features using Edge Impulse tools, focusing on characteristics like leg swing acceleration, stride length, and foot placement (supination/pronation). Employing Edge Impulse's K-means anomaly detection block and feature importance analysis, the device will learn to distinguish healthy gait patterns (based on the individual's established baseline) from potential anomalies. Initially, inference results will be displayed on a smartphone app. This proof-of-concept can be expanded into a wearable device that alerts users of gait abnormalities and trends, recommending healthcare consultations when appropriate. Ultimately, our goal is to provide a proactive tool for early disorder identification, enabling timely intervention and improved outcomes.

Why the Nordic Thingy:53? Platform Continuity.

The Nordic Thingy:53 leverages the nRF5340 Arm Cortex-M33 SoC, providing the computational resources necessary for on-device AI inference. It also includes a built-in accelerometer to capture detailed gait data and Bluetooth 5.4 for wireless communication. Importantly, the same nRF5340 chip powers the nRF5340 Development Kit, providing a consistent hardware platform throughout the project's development cycle. This means we can easily prototype on the Thingy:53, refine algorithms and sensor selections on the Development Kit, and ultimately transition to a custom wearable design for mass production – all using the same core chip. This approach ensures a smooth and efficient development process.

Hardware Requirements

  • Nordic Thingy:53

  • 3D printer

Software Requirements

  • Edge Impulse CLI

  • nRF Programmer App (iPhone/Android)

  • nRF Connect Desktop

Dataset Collection

The Thingy:53 was used to collect a dataset for training the AI model and establishing a baseline of normal, healthy gait patterns. This dataset includes three types of movement: standing, walking, and running. To capture realistic data, the user wore the device while performing these activities. The dataset's variety helps the model accurately classify different gait patterns and detect potential abnormalities in various situations.

Collect data for each label (standing, walking, running) using the nRF Connect app. Choose:

  • Sensor: Accelerometer

  • Sample Length (ms): 20000

  • Frequency (Hz): 20

For each label, we collected 13 repetitions of 20000 ms, which equals 260 seconds per label. This proved sufficient for our testing; however, more data may be necessary if gait patterns are captured across a wider variety of terrains.
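In this project the samples are forwarded to the Studio by the phone app, but the same accelerometer data can also be uploaded programmatically. Below is a minimal Python sketch based on the Edge Impulse data acquisition format and ingestion API; the API key, HMAC key, device fields, and readings are placeholders.

```python
import hashlib
import hmac
import json
import time

import requests

API_KEY = "ei_..."   # placeholder: project API key from the Edge Impulse dashboard
HMAC_KEY = "..."     # placeholder: project HMAC key, used to sign the payload

# 20 Hz accelerometer data -> 50 ms between samples; a 20000 ms sample is 400 rows.
values = [[0.1, 0.2, 9.8]] * 400  # placeholder accX/accY/accZ readings in m/s2

data = {
    "protected": {"ver": "v1", "alg": "HS256", "iat": int(time.time())},
    "signature": "0" * 64,  # placeholder signature, replaced after signing below
    "payload": {
        "device_name": "thingy53-gait-01",   # placeholder device identifier
        "device_type": "NORDIC_THINGY53",    # placeholder device type string
        "interval_ms": 50,
        "sensors": [
            {"name": "accX", "units": "m/s2"},
            {"name": "accY", "units": "m/s2"},
            {"name": "accZ", "units": "m/s2"},
        ],
        "values": values,
    },
}

# Sign the encoded message with the project HMAC key, then swap in the signature.
encoded = json.dumps(data)
data["signature"] = hmac.new(HMAC_KEY.encode(), encoded.encode(), hashlib.sha256).hexdigest()

res = requests.post(
    "https://ingestion.edgeimpulse.com/api/training/data",
    data=json.dumps(data),
    headers={
        "Content-Type": "application/json",
        "x-file-name": "walking.01",
        "x-label": "walking",
        "x-api-key": API_KEY,
    },
)
res.raise_for_status()
```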

Split each 20000 ms sample into four 5000 ms windows.

Perform a train/test split if needed, aiming for approximately an 80/20 ratio.
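If you prefer to prepare windows offline, the same splitting is straightforward to sketch in Python (NumPy assumed; the recording array below is a placeholder). At 20 Hz, a 20000 ms recording holds 400 rows and each 5000 ms window holds 100:

```python
import numpy as np

SAMPLE_RATE_HZ = 20
WINDOW_MS = 5000
ROWS_PER_WINDOW = WINDOW_MS * SAMPLE_RATE_HZ // 1000  # 100 rows per 5 s window

def split_into_windows(recording: np.ndarray) -> list[np.ndarray]:
    """Split one (N, 3) accelerometer recording into non-overlapping 5 s windows."""
    n_windows = len(recording) // ROWS_PER_WINDOW
    return [recording[i * ROWS_PER_WINDOW:(i + 1) * ROWS_PER_WINDOW] for i in range(n_windows)]

# Example: one 20 s recording (400 rows of accX/accY/accZ) becomes four windows.
recording = np.random.randn(400, 3)  # placeholder data
windows = split_into_windows(recording)

# Approximate 80/20 train/test split over the shuffled windows.
rng = np.random.default_rng(seed=42)
indices = rng.permutation(len(windows))
split = int(0.8 * len(windows))
train = [windows[i] for i in indices[:split]]
test = [windows[i] for i in indices[split:]]
print(len(train), "training windows,", len(test), "test windows")
```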

Impulse Design

After thorough testing, including using the EON Tuner, the optimal settings for our time series data were determined. We employ both a classifier and K-means anomaly detection, enabling gait pattern classification as well as anomaly scoring.

Spectral Features

Spectral analysis transforms raw accelerometer data from the time domain into the frequency domain. This reveals hidden patterns in gait data, such as stride frequency, step regularity, and harmonic components of movement patterns. These extracted spectral features can provide a richer representation of gait characteristics for the neural network, often leading to improved classification accuracy and a clearer understanding of potential gait abnormalities.
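As a concrete illustration, the sketch below computes a few frequency-domain features (per-axis RMS, dominant frequency, and power in coarse bands) for a single 5000 ms window using NumPy. It approximates the idea behind the Spectral Analysis block rather than reproducing its exact implementation.

```python
import numpy as np

SAMPLE_RATE_HZ = 20

def spectral_features(window: np.ndarray) -> np.ndarray:
    """Compute simple spectral features for an (N, 3) accX/accY/accZ window."""
    features = []
    for axis in range(window.shape[1]):
        signal = window[:, axis] - window[:, axis].mean()   # remove DC offset
        rms = np.sqrt(np.mean(signal ** 2))

        spectrum = np.abs(np.fft.rfft(signal))
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / SAMPLE_RATE_HZ)
        dominant_freq = freqs[np.argmax(spectrum[1:]) + 1]  # skip the 0 Hz bin

        # Power in three coarse bands: roughly stride-rate, step-rate, and higher harmonics.
        power = spectrum ** 2
        bands = [(0.5, 2.0), (2.0, 4.0), (4.0, 10.0)]
        band_power = [power[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands]

        features.extend([rms, dominant_freq, *band_power])
    return np.array(features)

window = np.random.randn(100, 3)  # placeholder 5 s window sampled at 20 Hz
print(spectral_features(window))
```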

Classifier

As mentioned, these parameter settings are already using optimized values from the EON Tuner.

These are our results before using the EON Tuner (default parameter values and settings).

The EON Tuner is a valuable tool for finding the best parameter settings and model architecture to maximize accuracy. While it can also optimize for performance or memory usage, our project has sufficient resources in these areas. Therefore, we prioritize accuracy as the primary optimization goal.
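For context, Edge Impulse's typical motion classifier is a small fully connected network over the spectral features. The sketch below (Keras assumed) mirrors that common dense architecture; the feature count is a placeholder, and the exact layer sizes selected by the EON Tuner for this project may differ.

```python
import tensorflow as tf

NUM_FEATURES = 33   # placeholder: depends on the spectral-analysis settings
NUM_CLASSES = 3     # standing, walking, running

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(NUM_FEATURES,)),
    tf.keras.layers.Dense(20, activation="relu"),
    tf.keras.layers.Dense(10, activation="relu"),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=0.0005),
    loss="categorical_crossentropy",
    metrics=["accuracy"],
)
model.summary()
```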

Anomaly Detection (K-means)

K-means clustering is chosen for gait anomaly detection due to its computational efficiency and ability to robustly identify distinct clusters. While Gaussian Mixture Models (GMMs) can model more complex data distributions, in our testing K-means excels in identifying distinct clusters like normal walking, running, and standing.

We chose L1 Root Mean Square (RMS) for anomaly detection with accelerometer data (accX, accY, accZ) due to its sensitivity to outliers and interpretability. L1 RMS emphasizes large deviations, which helps identify significant gait abnormalities and provides insights into the specific directions of those anomalies. It's also more robust to noisy accelerometer data compared to L2 RMS.
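To illustrate how K-means scoring works, here is a minimal sketch assuming scikit-learn: clusters are fit on features from normal gait only, and a new window's anomaly score is its distance to the nearest cluster centre. The feature vectors, cluster count, and threshold below are placeholders, not the values used in the Studio.

```python
import numpy as np
from sklearn.cluster import KMeans

# Feature vectors (e.g. RMS of accX/accY/accZ per window) from normal gait only.
normal_features = np.random.rand(200, 3)  # placeholder training features
kmeans = KMeans(n_clusters=32, n_init=10, random_state=0).fit(normal_features)

def anomaly_score(features: np.ndarray) -> np.ndarray:
    """Distance from each window's features to the nearest learned cluster centre."""
    distances = kmeans.transform(features)  # shape: (n_windows, n_clusters)
    return distances.min(axis=1)

new_windows = np.random.rand(5, 3)  # placeholder incoming windows
scores = anomaly_score(new_windows)
THRESHOLD = 0.3                     # placeholder; tune on held-out normal data
print(["anomaly" if s > THRESHOLD else "normal" for s in scores])
```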

Deployment

Now the AI model is ready to be deployed to the edge. The Nordic Thingy:53 is selected as the deployment option. For this project we chose the unoptimized (float32) model to preserve accuracy, since our hardware has enough performance and memory headroom.

Conclusion

This project successfully demonstrates the potential for wearable AI solutions in early detection of gait disorders. By harnessing the Thingy:53's capabilities and Edge Impulse's streamlined workflow, we developed a device capable of identifying gait anomalies. This tool offers proactive health monitoring, with the potential to alert users to subtle changes that may foreshadow underlying medical conditions. Future work could expand the dataset for greater robustness, explore additional sensor modalities, and conduct clinical trials to thoroughly validate the system for diagnostic use.

See this project in action:

A 3D printed shoe clip-on case modification is made for attaching the Thingy:53 to a shoe. You can download the .stl files here: https://www.thingiverse.com/thing:6558382

This project assumes basic familiarity with connecting the Thingy:53 to Edge Impulse via the nRF Connect app. If needed, refer to this guide for assistance: https://docs.edgeimpulse.com/docs/edge-ai-hardware/mcu/nordic-semi-thingy53

After building our model, we'll get the new firmware. Follow this guide to flash the firmware: https://docs.edgeimpulse.com/docs/edge-ai-hardware/mcu/nordic-semi-thingy53#updating-the-firmware