Weight Scale Predictive Maintenance - Arduino Nano 33 BLE Sense

With some physics and a TinyML model, add weight prediction to a pallet-wrapping machine.


Last updated 1 year ago


Created By: Simone Salerno

Public Project Link:

Intro

In industrial settings, many factories need to handle pallets, a storage format that spans almost all sectors.

Pallet Example

To speed up the packaging process, a dedicated machine wraps the pallet contents in plastic film to keep them tight and secure.

That's the sole purpose of this machine in the factory or production facility. But with the help of machine learning, we can upgrade these existing dumb machines with a new feature: weighing the pallets.

It may not be obvious, but we don't need a weight or pressure sensor to do this. Nor do we need to modify the circuitry or retrofit the machine.

Instead, we can use a "plug-in" external device that consists only of an accelerometer and a microcontroller.

And as we'll see shortly, this external device can even add predictive maintenance capabilities to the machine, by proactively identifying malfunctions from the collected data and patterns.

The Rationale

The methodology behind this measurement technique is simple: at its core, the pallet machine has a rotating motor that wraps the plastic film around the pallet.

During its rotation, the motor is subject to friction proportional to the weight on the platform. We can capture slight variations in the rotation pattern by means of an IMU.

We'll then use the accelerometer and gyroscope data as a proxy for the friction on the motor. By modelling this relation through machine learning, we aim to predict the weight from the IMU readings.

This works because the machine always applies the same driving force to the motor: with a heavy load on the platform, it rotates more slowly than with an empty platform.

Predictive Maintenance

Once we've modelled the relation between IMU data and weight, we can use it another way too: if we know the true weight of the pallet that's on the platform, we can compare it with the predicted weight, and look for discrepancies.

If they differ by a large amount, something is not working as usual. If the predicted weight is much higher than the actual one, the motor may be subject to more friction than it should be, and that friction is not due to the pallet itself. Perhaps the motor needs oiling, or some other issue is adding strain to it.
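The discrepancy check described above can be sketched in a few lines of Python. The 20% tolerance, the function name, and the status strings are illustrative choices, not values from the original project:

```python
def check_motor_health(true_weight_kg, predicted_weight_kg, tolerance=0.2):
    """Compare the model's predicted weight with the known true weight
    and flag large relative discrepancies (tolerance is a fraction)."""
    if true_weight_kg <= 0:
        raise ValueError("true weight must be positive")
    error = (predicted_weight_kg - true_weight_kg) / true_weight_kg
    if error > tolerance:
        # model "sees" more friction than the load explains
        return "check motor: excess friction"
    if error < -tolerance:
        return "check model: prediction too low"
    return "ok"
```

In a deployment you would call this whenever a pallet of known weight is wrapped, and raise a maintenance alert on anything other than "ok".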

Hardware Requirements and Settings

The requirements are pretty simple: on the hardware side you only need an IMU and a microcontroller (or a board with an integrated IMU, such as the Arduino Nano 33 BLE Sense).

To avoid using cables that may interfere with the operation of the machine, it is advisable to choose a board that has either WiFi or Bluetooth radio, so you can stream data to your PC wirelessly.

The setup is simple too: assemble your board and a battery in a plastic box, and anchor it near the edge of the rotating platter (at the edge, linear velocity is greater than at the center, so the IMU can pick up pattern variations more easily).

This project consists of 3 steps:

  1. Collect training data

  2. Design the Impulse

  3. Deploy the model and use it

Collect Training Data

The first step is to collect training data for our model.

Code

If using the Arduino Nano 33 BLE Sense (or a similar board with an integrated IMU and BLE), you can use the following two code snippets: the first is flashed to the board to enable BLE data streaming, the second runs on your PC to receive the streamed data.

On the Arduino:

#include <Arduino_LSM9DS1.h>
#include <ArduinoBLE.h>


// data structure to hold 3 accelerometer + 3 gyroscope values
union imu_dtype {
  float values[6];
  uint8_t bytes[6 * sizeof(float)];
} imuReading;


BLEService imuService("9f0283a8-ffbb-44c2-87fc-f4133c1d1302");
BLECharacteristic imuCharacteristic("9f0283a8-ffbb-44c2-87fc-f4133c1d1305", BLERead | BLENotify, sizeof(imuReading.bytes));
BLEDevice central;


void setup() {
  Serial.begin(115200);
  delay(3000);
  Serial.println("Started");
  
  while (!IMU.begin()) {
    Serial.println("Failed to initialize IMU!");
    delay(1000);
  }

  while (!BLE.begin()) {
    Serial.println("Failed to initialize BLE!");
    delay(1000);
  }

  Serial.print("Accelerometer sample rate = ");
  Serial.print(IMU.accelerationSampleRate());
  Serial.println(" Hz");

  // configure BLE
  BLE.setDeviceName("Arduino BLE Sense");
  BLE.setLocalName("Arduino BLE Sense");
  BLE.setAdvertisedService(imuService);
  imuService.addCharacteristic(imuCharacteristic);
  BLE.addService(imuService);
  BLE.advertise();
}

void loop() {
  float ax, ay, az;
  float gx, gy, gz;

  if (IMU.accelerationAvailable() && IMU.gyroscopeAvailable()) {
    IMU.readAcceleration(ax, ay, az);
    IMU.readGyroscope(gx, gy, gz);
    
    Serial.print(ax);
    Serial.print('\t');
    Serial.print(ay);
    Serial.print('\t');
    Serial.print(az);
    Serial.print('\t');
    Serial.print(gx);
    Serial.print('\t');
    Serial.print(gy);
    Serial.print('\t');
    Serial.println(gz);
    
    BLE.advertise();

    // try to connect to PC
    if (!central || !central.connected())
      central = BLE.central();

    // if connected, stream data
    if (central && central.connected()) {
      Serial.println("streaming...");
      
      imuReading.values[0] = ax;
      imuReading.values[1] = ay;
      imuReading.values[2] = az;
      imuReading.values[3] = gx;
      imuReading.values[4] = gy;
      imuReading.values[5] = gz;
      
      imuCharacteristic.writeValue(imuReading.bytes, sizeof(imuReading.bytes));
    }
  } 
}

On your PC, you need Python to run the following script, which connects to the microcontroller and saves the streamed data to a file. First install the bleak BLE library:

pip install bleak

Then run:

import asyncio
import csv
from time import sleep
from struct import unpack
from bleak import BleakScanner, BleakClient


readings = []
collect_time_in_seconds = 30


def on_notify(_, data):
    """
    To be run when new data comes from BLE
    :param _:
    :param data:
    :return:
    """
    global readings

    # packet is made of 6 floats (ax, ay, az, gx, gy, gz)
    parsed_data = unpack('ffffff', data)
    print(parsed_data)
    readings.append(parsed_data)


async def main():
    arduino = None
    imu_characteristic = '9f0283a8-ffbb-44c2-87fc-f4133c1d1305'
    devices = await BleakScanner.discover()

    # find Arduino device
    for device in devices:
        print('Found device', device.name, 'at address', device.address)

        if device.name and 'Arduino BLE Sense' in device.name:
            arduino = device
            break

    # no board found, abort
    if arduino is None:
        print('Cannot find Arduino board')
        return

    # connect to BLE characteristic
    async with BleakClient(arduino.address) as client:
        await client.start_notify(imu_characteristic, on_notify)
        print('Started collection...')
        await asyncio.sleep(collect_time_in_seconds)  # must not block the event loop, or notifications stall
        await client.stop_notify(imu_characteristic)

    # save to CSV
    filename = input('Which weight is this? ')

    with open('%s.csv' % filename, 'w', newline='', encoding='utf-8') as file:
        writer = csv.writer(file)
        writer.writerow(['ax', 'ay', 'az', 'gx', 'gy', 'gz'])
        writer.writerows(readings)


if __name__ == "__main__":
    asyncio.run(main())

Data Collection

To accurately model the IMU <-> weight relation, you need a few reference weights. How many you need, and at what increments, depends on your use case.

For this guide, I collected data at the following weights (in kg):

  • 0

  • 40

  • 80

  • 120

  • 160

  • 200

  • 240

  • 280

  • 320

  • 430

  • 600

  • 1000

At lower weights (up to 320 kg), I collected data at 40 kg intervals because I wanted to differentiate at a finer granularity. Then I increased the step to roughly 100, 200 and 400 kg, because at higher weights I only wanted to get a rough idea.

Feel free to customize your own scale as you see fit.

Warning: you can't expect to achieve very fine granularity (e.g. 1-5 kg) because the friction variation on the motor would be too small. Aim for steps of at least 40-50 kg.

As with all machine learning projects, the more data you collect, the better. I collected 30 seconds of data for each weight at a 26 Hz sampling rate. If your IMU supports higher rates (most allow up to 104 Hz), you can use those and test whether they increase your overall accuracy. The longer you collect data, the more robust your model will be.
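As a quick sanity check on dataset size, the scheme above can be tallied with plain arithmetic on the numbers just stated (30 s per weight, 26 Hz, 12 reference weights):

```python
def rows_collected(seconds=30, rate_hz=26, n_weights=12):
    """Return (IMU rows per weight, total rows across all weights)."""
    per_weight = seconds * rate_hz
    return per_weight, per_weight * n_weights

per_weight, total = rows_collected()  # 780 rows per weight, 9360 in total
```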

For each weight on the machine, follow these procedures:

  1. Put the microcontroller board on the platform and turn it on

  2. Put the weight on the platform

  3. Start the machine and let it go for a few seconds (so it reaches its normal speed)

  4. Run the Python script and wait for the data collection to complete

  5. Input a name for the CSV file that will contain data for the given weight

Repeat the process for each weight.

You will end up with a list of CSV files, one for each weight. This is an easy format to import into Edge Impulse.
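Timestamped rows make CSV import easier. Since our script saves raw rows at a known rate, a small helper can prepend a timestamp column; this is a sketch using only the standard library, and the `timestamp` column name is an assumption you may need to adapt to Edge Impulse's CSV import settings:

```python
import csv

SAMPLE_RATE_HZ = 26  # must match the rate used during collection

def add_timestamps(src_path, dst_path, rate_hz=SAMPLE_RATE_HZ):
    """Prepend a millisecond timestamp column, derived from the known
    sampling rate, to a CSV produced by the collection script."""
    step_ms = 1000.0 / rate_hz
    with open(src_path, newline='') as src, open(dst_path, 'w', newline='') as dst:
        reader = csv.reader(src)
        writer = csv.writer(dst)
        header = next(reader)                      # ax, ay, az, gx, gy, gz
        writer.writerow(['timestamp'] + header)
        for i, row in enumerate(reader):
            writer.writerow([round(i * step_ms, 2)] + row)
```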

Impulse Design

Edge Impulse allows for 3 different tasks:

  • Classification

  • Regression

  • Anomaly detection

In our case, we want to model a continuous relation between the input (IMU data) and the output (weight), so it is a 'regression' task.

More specifically, this is a time-series regression task, so we will need to window our data and extract spectral features from it. This is most often the case when working with time series data.

The window duration depends on the working speed of your machine. My advice here is to go with a large duration, because we expect the rotation to not be very fast: if your window is too short, it won't contain much variation in data.

Nevertheless, this is mostly a trial-and-error process. Since Edge Impulse makes it so easy to experiment with different configurations, start with a reasonable value of 3-5 seconds and then tune based on the accuracy feedback.

The model doesn't need to be overly complex: start with a 2-layer fully-connected network and see if it performs well for you. If not, increase the number of layers or neurons.
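Back-of-the-envelope arithmetic helps when picking the window: at 26 Hz with 6 axes, a window's sample count and raw feature count follow directly (the 4-second example below is just one point in the suggested 3-5 second range):

```python
def window_size(duration_s, rate_hz=26, axes=6):
    """Return (IMU readings per window, raw values fed to the DSP block)."""
    n_readings = int(duration_s * rate_hz)
    return n_readings, n_readings * axes

readings, raw_values = window_size(4)  # 104 readings -> 624 raw values
```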

Firmware Deployment

Once you're satisfied with the results, it is time to deploy the trained Neural Network back to your board.

Once again, we'll use BLE to stream the predicted weight wirelessly to a PC. On the Arduino, run this snippet:

#include <Arduino_LSM9DS1.h>
#include <ArduinoBLE.h>
#include <eloquent.h>

// replace this with the library downloaded from Edge Impulse
#include <palletizer_inferencing.h>
#include <eloquent/tinyml/edgeimpulse.h>

using namespace Eloquent::TinyML::EdgeImpulse;

Impulse impulse;
ImpulseBuffer buffer;

BLEService weightService("9f0283a8-ffbb-44c2-87fc-f4133c1e1302");
BLEFloatCharacteristic weightCharacteristic("9f0283a8-ffbb-44c2-87fc-f4133c1e1305", BLERead | BLENotify);
BLEDevice central;


void setup() {
  Serial.begin(115200);
  delay(3000);

  while (!IMU.begin()) {
    Serial.println("Failed to initialize IMU!");
    delay(1000);
  }

  while (!BLE.begin()) {
    Serial.println("Failed to initialize BLE!");
    delay(1000);
  }

  // configure BLE
  BLE.setDeviceName("Arduino BLE Sense");
  BLE.setLocalName("Arduino BLE Sense");
  BLE.setAdvertisedService(weightService);
  weightService.addCharacteristic(weightCharacteristic);
  BLE.addService(weightService);
  BLE.advertise();

  Serial.println("Start collecting data...");
}

void loop() {
  float ax, ay, az;
  float gx, gy, gz;

  // read IMU data, if available
  if (!IMU.accelerationAvailable() || !IMU.gyroscopeAvailable())
    return;
    
  IMU.readAcceleration(ax, ay, az);
  IMU.readGyroscope(gx, gy, gz);

  float features[6] = {ax, ay, az, gx, gy, gz};

  if (!buffer.push(features, 6))
    // Queue is not full yet
    return;

  // we are ready to perform inference
  float prediction = impulse.regression(buffer.values);

  Serial.print("Predicted weight: ");
  Serial.println(prediction);

  // stream predicted weight over BLE
  BLE.advertise();

  // try to connect to PC
  if (!central || !central.connected())
    central = BLE.central();

  // if connected, stream data
  if (central && central.connected()) {
    Serial.println("streaming...");
    
    weightCharacteristic.writeValue(prediction);
  }
}

The ImpulseBuffer is a data structure that holds an array where you can push new values. When the buffer is full, it shifts the old elements out to make room for the new ones. This way, you have an "infinite" buffer that mimics the windowing scheme of Edge Impulse.
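The same shifting-window behavior can be sketched in a few lines of Python using `collections.deque`; this is a conceptual model of how such a buffer works, not the Eloquent library's actual implementation:

```python
from collections import deque

class SlidingBuffer:
    """Fixed-capacity buffer: once full, pushing new readings silently
    drops the oldest ones, mimicking a sliding window over the stream."""

    def __init__(self, capacity):
        self.values = deque(maxlen=capacity)

    def push(self, readings):
        """Append readings; return True once the window is full."""
        self.values.extend(readings)
        return len(self.values) == self.values.maxlen
```

As in the Arduino snippet above, the caller only runs inference when `push` returns True, i.e. when a full window of data is available.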

To perform the prediction over the window of collected data, you only need to call impulse.regression(buffer.values) and use the result as per your project needs.

In this example, we stream the value over BLE. In your own project, you could also use the value to control an actuator or raise an alarm when certain weights are detected.

Real-world Deployment Example

To give you a real-world example of how to use this project, we'll pretend we have an LED display near the stretch-film machine, where we want to see the predicted weight in real time.

Since we're already streaming the data over BLE, we need a receiver device connected to the display. For the sake of the example, we'll use another Arduino Nano 33 BLE Sense.

On this device, run the following snippet:

/**
 * Display weight from BLE on TM1637 display
 * Download library from https://github.com/avishorp/TM1637
 */
#include <ArduinoBLE.h>
#include <TM1637Display.h>


BLEDevice peripheral;
TM1637Display display = TM1637Display(2, 3);

union weight_dtype {
  float weight;
  uint8_t bytes[sizeof(float)];
} weightPayload;


void setup() {
  Serial.begin(115200);
  delay(3000);

  while (!BLE.begin()) {
    Serial.println("Failed to initialize BLE!");
    delay(1000);
  }

  // set display brightness (0 = dimmest, 7 = brightest) before writing
  display.setBrightness(7);
}


void loop() {
  // connect to peripheral
  if (!peripheral) {
    BLE.scanForName("Arduino BLE Sense");
    peripheral = BLE.available();
  }

  if (!peripheral)
    return;

  if (!peripheral.connected()) {
    // attributes must be discovered before characteristics can be read
    if (peripheral.connect() && !peripheral.discoverAttributes()) {
      peripheral.disconnect();
      return;
    }
  }

  if (!peripheral.connected())
    return;

  BLE.stopScan();
  BLECharacteristic weightCharacteristic = peripheral.characteristic("9f0283a8-ffbb-44c2-87fc-f4133c1e1305");

  // read value
  if (!weightCharacteristic.readValue(weightPayload.bytes, sizeof(weightPayload.bytes)))
    return;
  
  Serial.print("Got: ");
  Serial.println(weightPayload.weight);
  display.showNumberDec((uint16_t) weightPayload.weight);
}

This should then render the predicted weight on the 7-segment display.

Conclusion

This project adds machine learning to a traditional industrial machine, making it smarter and giving it predictive maintenance capabilities. Using only a microcontroller and an IMU, we were able to add weight estimation for pallets, and can identify when the rotational speed of the motor is inconsistent with predicted values.

(Figures: Pallet Stretch-film Machine, Impulse Design, DSP Results, Topology)

If using the Eloquent Arduino library, this part is very straightforward.

Public Project Link: https://studio.edgeimpulse.com/public/136188/latest