
Classifying Exercise Activities on a BangleJS Smartwatch

Build a customized exercise classification ML model and deploy it to the Bangle.js smartwatch.



Created By: Thomas Vikström

Public Project Link: https://studio.edgeimpulse.com/public/544247/latest

Introduction

This tutorial demonstrates how to use Edge Impulse to classify various exercises or daily activities. This project uses the Bangle.js 2 programmable smartwatch, but any programmable smartwatch with an accelerometer and TensorFlow Lite support can be used if you replicate the Bangle app accordingly.

Use-case explanation

While some of us exercise intuitively without technology, others, like me, find motivation in tracking statistics and trends with devices like smartwatches or smart rings. Any decent smartwatch today uses GPS, an accelerometer, a gyroscope, and other sensors to collect different types of data, and consolidates this data into activity summaries. They perform exceptionally well at correctly classifying long, repetitive exercises like running and walking (outdoors or on a treadmill), skiing, cycling, rowing, etc. Where several of them still have room for improvement, though, is in correctly classifying the many different activities you might perform in a gym. I've been a Garmin fan for over 10 years and am currently on my fourth Garmin sport watch, but none of them can consistently classify gym activities correctly, even though all of them are considered premium sport watches.

Last winter I started working out with my three-decades-old Kettler Sport Variant home gym, and it bothered me that I constantly needed to switch activities on my Garmin watch when I changed from one exercise to another, especially as I like to avoid long, monotonous repetitions. This made me wonder if I could come up with my own solution using machine learning. Since I have experience with using TensorFlow Lite on the affordable Bangle.js smartwatch, I thought I could at least try.

The result is an app where you first collect exercise training data for export to Edge Impulse. After uploading a trained ML model back to the watch, you just tap Exercise to let the watch classify the different exercises you're performing and how long each one lasts. Afterwards you can export the collected data to e.g. Excel for further analysis or storage.

Components and Hardware/Software Configuration

Components Needed

  • A programmable smartwatch supporting TensorFlow Lite. In this case I used a Bangle.js 2, which costs slightly under €100 (with taxes).

  • A computer supporting Bluetooth Low Energy (BLE). More or less any computer manufactured in the last decade is equipped with BLE, but BLE adapters with a USB connector are also available for older computers.

  • Depending on the activities you plan to do, you might need shoes for walking/running outdoors, gym equipment, kettlebells, etc. If you go to a gym, it probably has all the exercise equipment you need.

Hardware and Software Configuration

Hardware Configuration

There's practically nothing to configure hardware-wise! Although the Bangle isn't in the same premium class as Apple or Samsung watches, the initial experience is quite similar: everything usually works seamlessly out of the box, making it easy to get started. Do read the Bangle.js Getting Started guide, though.

Software Configuration

By following the Bangle.js getting started steps you'll learn how to develop apps on the smartwatch.

For this tutorial, it's enough to open the Espruino IDE, connect to your Bangle according to the instructions in the guide above, and paste the app's program (all code is available in the project's GitHub repository) into the right side of the IDE.

  • Click on the RAM button to upload the program to the watch. The Bangle has both volatile RAM as well as flash memory for long-term storage. RAM content disappears after power-down, while content in flash remains. When testing and developing, it is safer to upload just to RAM, as a serious program crash won't mess up the watch as much as if you had saved it to flash. That said, it is close to impossible to completely brick the watch with a buggy program; a factory reset should help in almost all cases.

  • You'll be presented with a simple menu with three options:

    • Collect Data - collect data for different activities

    • Inference - run inference to test the current ML-model without storing any further data

    • Exercise - run inference and also collect what activities were performed and the length of them into a CSV-file

Data Collection Process

  • Strap the watch to your non-dominant hand.

  • Select Collect Data on the watch, and select one of the predefined activities, scroll down to see more.

  • When you're ready to do the activity, select Start Recording.

  • Start performing the activity, e.g. walking.

    • Don't change from one activity to another while you are collecting data.

    • There's no need, however, to try to perform the activity like a robot; just do it naturally. E.g., when walking, walk as you normally do, take a few turns every now and then, and vary the speed a bit.

    • To start with, collect a minute or so of data for each activity.

    • Stop recording by quickly pressing the physical button. This will take you back to the main menu.

  • Repeat the above for each activity.

  • Also collect data for a "non-activity", like sitting. (A sketch of how this kind of logging could be implemented on the watch follows below.)
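Under the hood, the data collection above amounts to subscribing to the Bangle's accelerometer and appending each reading, together with the selected activity label, to a StorageFile. The snippet below is a minimal, illustrative sketch of that idea using standard Espruino/Bangle.js APIs; the file-name pattern, helper names, and exact column layout are assumptions based on the columns used later in the CSV Wizard, not the project's actual code.

```javascript
// Sketch: log accelerometer readings for one activity to a StorageFile.
// Assumes the default 12.5 Hz accelerometer rate (one reading every ~80 ms).
var storage = require("Storage");

function startRecording(activity) {
  // File name follows the tutorial's "acti_" + timestamp convention.
  var name = "acti_" + new Date().toISOString().replace(/[-:T]/g, "_").slice(0, 16);
  var file = storage.open(name, "a");
  var row = 0;
  var t0 = getTime();

  // Header matching the columns the CSV Wizard expects later on.
  file.write("Row #,timestamp,x,y,z,activity\n");

  function onAccel(a) {
    var ms = Math.round((getTime() - t0) * 1000);
    file.write([row++, ms, a.x, a.y, a.z, activity].join(",") + "\n");
  }
  Bangle.on("accel", onAccel);

  // Stop recording when the physical button is pressed.
  setWatch(function () {
    Bangle.removeListener("accel", onAccel);
    E.showMessage("Saved " + name);
  }, BTN1, { repeat: false, edge: "rising" });
}

startRecording("walking");
```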

Building and Training the Model

Building and training an ML model in this project consists of the following major steps:

  1. Downloading the recorded activity files to your computer

  2. Creating an Edge Impulse project

  3. Uploading the activity files to Edge Impulse

  4. Creating and training the model

1. Download the Activity Files

  • In the Espruino IDE, click on the Storage icon

  • Your activity files start with acti_ followed by a timestamp of when the file was created.

  • Click the Save icon for each activity file individually.

  • Files will be stored in your Downloads folder on your computer

  • Files will be in CSV format, but have a " (StorageFile)" suffix that you'll need to remove. So, rename acti_2024_10_29_20_03.csv (StorageFile) to acti_2024_10_29_20_03.csv.

2. Create an Edge Impulse Project

Head over to Edge Impulse and create a new project. If you are completely new to the platform, check out their getting started tutorial.

3. Upload the Activity Files

This consists of two steps:

  • 3.1 Configure the CSV-Wizard

  • 3.2 Uploading the activity files themselves

3.1 Configure the CSV-Wizard

Here you'll use one of your activity files as an example so the wizard can learn how the file is structured. This configuration only needs to be done once. Details are found in the Edge Impulse CSV Wizard documentation.

For this project I used the following steps:

  • Click on Data acquisition

  • Click on CSV-Wizard

  • Click on Choose File, select any of your activity files, and click Upload file

  • Check that you have the following columns:

    • Row #, timestamp, x, y, z, activity

    • Click Looks good, next

  • Fill in the screen like this:

    • Is this time-series data?: Yes, this is time-series data...

    • How is your time-series data formatted?: Each row contains a reading, and sensor values are columns.

    • Do you have a timestamp or time elapsed column?: Yes, it's <timestamp>

    • What type of data is in your timestamp column?: Time elapsed in milliseconds

    • Override timestamp difference?: 80 ms (the default accelerometer rate is 12.5 Hz, which means one sample is 80 ms long)

  • Click Great, let's look at your values

  • Now you are in Step 4; fill it in like this:

    • Do you have a column that contains the label (the value you want to predict)? Yes, it's <activity>

    • Which columns contain your values? <x, y, z>

  • Click Next, split up into samples

  • In Step 5, use these settings:

    • How long do you want your samples to be? Limit to <3040> ms

    • How should we deal with multiple labels in a sample? Use the last value of "activity" as the label for each sample...

  • Click Finish wizard

3.2 Upload Activity Files Using the CSV-Wizard

Now you can upload all activity files, including the one you used to configure the CSV Wizard, to Edge Impulse. See the Edge Impulse documentation for detailed steps.

4. Creating, Training, and Testing the Model in Edge Impulse

This consists of a few steps, all done within Edge Impulse:

  • 4.1 Create an impulse

  • 4.2 Generate features

  • 4.3 Train the model

  • 4.4 Test the model

4.1 Create an Impulse

In Edge Impulse, navigate to Create impulse and configure the following settings:

Time-series data:

  • Window size 1,000 ms.

  • Window increase 500 ms.

  • Frequency (Hz) 12.5

  • Zero-pad data [x]

Select Raw data as the Processing block and Classification as the Learning block. As the Bangle isn't one of the officially supported devices, the other available blocks for accelerometer data would need to be reimplemented in Espruino to work identically to the Edge Impulse ones. I actually tried to replicate the Spectral Analysis processing block, but was not successful.
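To put these settings in perspective on the watch side: at 12.5 Hz, a 1,000 ms window holds roughly 12-13 accelerometer readings, and each reading contributes three raw values (x, y, z), so the Raw data block hands the classifier on the order of 37-39 features per window, with a new window starting every 500 ms. Below is a hedged sketch of how such a sliding window could be maintained in the Bangle app; the constants and the classify() helper are illustrative assumptions, not the project's actual code (classify() is sketched in the Model Deployment section).

```javascript
// Sketch: maintain a sliding window of raw accelerometer readings.
// ~1,000 ms at 12.5 Hz is about 13 readings of [x, y, z],
// and a new classification is made every ~500 ms (the "window increase").
var WINDOW_SAMPLES = 13;   // ~1,000 ms at 12.5 Hz
var STEP_MS = 500;         // window increase
var buf = [];              // flat array: x0, y0, z0, x1, y1, z1, ...

Bangle.on("accel", function (a) {
  buf.push(a.x, a.y, a.z);
  // Drop the oldest reading once the window is full.
  if (buf.length > WINDOW_SAMPLES * 3) buf.splice(0, 3);
});

setInterval(function () {
  if (buf.length < WINDOW_SAMPLES * 3) return;  // not enough data yet
  classify(buf.slice());  // hypothetical helper that feeds the window to the model
}, STEP_MS);
```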

4.2 Generate Features

  • Select Raw data from the menu.

  • As we're using default settings:

    • click Save parameters

    • in next screen, click Generate features

  • After a couple of minutes you'll see how well the activities can be separated at this stage. As you can see, I got quite good separation; only sitting is scattered around, the reason being that I deliberately did not sit completely still the whole time, but made a few random hand movements.

4.3 Train the Model

  • Select Classifier from the menu.

  • I recommend trying different settings, but to start with, why not use mine:

    • Number of training cycles 700

    • 1st dense layer 30 neurons

    • 2nd dense layer 15 neurons

    • 3rd dense layer 8 neurons

  • Click Save & train to start training

  • My results were fairly good with an accuracy of 95%, but to improve the model even further, I'd collect more data.

4.4 Test the Model

This section tests the ML model on test data that was automatically set aside and not used during training. The objective is to see how well the model performs on data it has not encountered before, simulating real-life use.

  • Select Model testing from the menu.

  • Click Classify all

  • After a while you'll get the results.

  • My results were not as good as in training, but this is quite often the case. The final verdict on model performance comes when you try it in the field, or, as in this case, on your hand!

Model Deployment

As mentioned, the Bangle is not officially supported by Edge Impulse, so you can't use the Deployment menu option. This doesn't stop you, of course, as the only thing that really matters is that both the Bangle and Edge Impulse support TensorFlow Lite.

Do this in Edge Impulse:

  • Select Dashboard from the menu.

  • Click on the Save icon next to TensorFlow Lite (float32)

    • This downloads a model file to your computer's downloads folder.

Do this in the Espruino IDE:

  • Click on the Storage icon

  • Click Upload files

  • Find the model you downloaded from Edge Impulse

  • Type impulse1 as file name and click Ok.

    • FYI: When the Bangle program runs, it will convert the TensorFlow Lite file to base64 format and create an internal file named impulse4. (A sketch of how a model file can be loaded and used from Espruino follows below.)
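For reference, here is roughly what loading and running such a model looks like from Espruino code. This is an illustrative sketch based on Espruino's Tensorflow module, not the project's actual code: the file name impulse1 comes from the step above, while the arena size, label list, and the classify() helper name are assumptions.

```javascript
// Sketch: load the uploaded TensorFlow Lite model and classify one window.
var storage = require("Storage");
var model = storage.read("impulse1");   // the model file uploaded above

// Create a TensorFlow Lite Micro interpreter with a working-memory arena.
// The arena size is an assumption; it must be large enough for the model.
var tf = require("Tensorflow").create(4096, model);

// Label order must match the Edge Impulse project; these are just examples.
var LABELS = ["sitting", "walking", "rowing", "cycling"];

function classify(rawWindow) {
  tf.getInput().set(rawWindow);  // the raw x/y/z window from the Create Impulse step
  tf.invoke();
  var out = tf.getOutput();
  // Pick the label with the highest score.
  var best = 0;
  for (var i = 1; i < out.length; i++) if (out[i] > out[best]) best = i;
  return { label: LABELS[best], score: out[best] };
}
```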

Test the Model on the Watch!

Now it's time to test the model in real life!

Here you'll use the same program as earlier in Software Configuration, so unless the app is still running on your Bangle, just upload it to RAM once again.

  • When you want to test the app without storing any activity data, just select Inference.

  • This shows the current activity you are doing, in my case unfortunately sitting...

  • Tap anywhere on the display to go back.

  • Start exercising (yay!) by selecting Exercise.

  • This runs inference like the previous option, but also records the detected activities to a CSV file.

  • This app shows no bells and whistles on the watch display; feel free to learn JavaScript and enhance the Bangle app!

Download Exercise Data from the Bangle App

Access the Storage again from within the Espruino IDE, and download the exercise files (named exercise followed by a timestamp of when the file was created).

The file contains timestamps for when each inferred activity started and ended, as well as the activity itself. As you can see below, it classified several activities I did. I also simulated rowing, but as I could not use a real rowing machine, it was registered as sitting.

The program registers an activity only after detecting the same activity for at least 10 seconds. In a real scenario this threshold should probably be increased to 30 seconds or more, as you are unlikely to be switching between different activities very quickly all the time. A sketch of this registration logic follows below.
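The registration rule could be implemented roughly as follows. This is an illustrative sketch under the assumptions stated in the comments; the threshold constant, file name, and helper name are not taken from the project's actual code.

```javascript
// Sketch: register an activity only after it has been detected continuously
// for at least REGISTER_MS, then log start/end times to a CSV StorageFile.
var REGISTER_MS = 10000;   // 10 s; could be raised to 30 s or more in practice
var log = require("Storage").open("exercise_" + Math.floor(getTime()), "a");

var candidate = null;   // label currently being observed
var since = 0;          // when the candidate label was first seen
var current = null;     // activity that has been registered
var startedAt = 0;      // when the registered activity started

// Call this with the label returned by each classification (e.g. every 500 ms).
function onClassified(label) {
  var now = getTime() * 1000;   // ms
  if (label !== candidate) {    // classification changed: restart the timer
    candidate = label;
    since = now;
    return;
  }
  // Same label seen long enough, and different from the registered activity.
  if (now - since >= REGISTER_MS && label !== current) {
    if (current !== null)
      log.write([startedAt, now, current].join(",") + "\n");  // close previous activity
    current = label;
    startedAt = now;
  }
}
```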

Results

The results from the Edge Impulse part completely met the objectives and expectations I had, i.e. being able to classify which exercise was performed. The training result of 95% can be considered good, especially considering that raw data is used. I also tested the Spectral features block in Edge Impulse, and was not surprised to find that it consistently gave better results across different settings.

As for the Bangle app, the results only partially met my objectives. While the watch itself is excellent for its price point, I only have basic JavaScript skills and was not completely successful in getting the exercise-registration logic as good as I'd wanted. Testing the app during exercises is also somewhat challenging in practice, as you can't easily watch the inferred activity while performing it!

Improvement Suggestions

One thing that might improve the accuracy somewhat is to change the accelerometer frequency from 12.5 Hz to e.g. 100 Hz; I've verified that this is possible. With the help of ChatGPT I also tried to replicate Edge Impulse's spectral features on the watch, but in the end had to leave that out of scope. Even so, the app already outperforms my Garmin watch at switching from one activity to another without me taking any action on the watch. Another quite straightforward improvement is to turn the program into a real Bangle app, which can be done by following the steps in the Making an App section of the Bangle.js documentation. New features can of course be added to the app as well: heart rate data, steps, altitude, etc.

As a summary, the concept as such is working, and there's a prototype of an exercise app that can be improved upon. As my gym season begins during the dark winter months, I plan to collect extensive data to further improve the model.

Conclusion

This tutorial aimed to build a machine learning model and a corresponding Bangle app capable of accurately classifying activities, particularly those performed in a gym, where users frequently switch between different exercises. As has been demonstrated, this goal was largely met.

Extensive evidence suggests that lifelong exercise extends health span and delays the onset of 40 chronic conditions and diseases. The health benefits of regular exercise have been shown in many studies; for instance, one study highlights how physical activity reduces stress and anxiety, boosts mood, improves self-confidence, sharpens memory, and strengthens muscles and bones. It also helps in preventing and reducing heart disease, obesity, blood sugar fluctuations, cardiovascular disease, and cancer.

All the code and files used in this write-up can be found on GitHub, and the public Edge Impulse project is available at https://studio.edgeimpulse.com/public/544247/latest. Feel free to clone the project for your own use case.