Brushless DC Motor Anomaly Detection

Sample data from a BLDC motor controller and apply machine learning to receive predictive maintenance alerts.

Created By: Avi Brown

Public Project Link: https://studio.edgeimpulse.com/public/102584/latest

Background

Brushless DC (BLDC) motors' high performance and efficiency have made them one of the most popular options for industrial and robotics applications. Driving a BLDC motor requires a dedicated motor controller, and there are many controller manufacturers to choose from, ODrive and Roboteq to name a couple.

In addition to offering precise motor control, these controllers often expose a number of performance properties to the engineer, including motor velocity, torque, power consumption, phase current, temperature, and more. This presents an excellent opportunity to use non-intrusive embedded machine learning to add an extra layer of sophistication to the system.

In this project we will learn how to:

- Collect data from an ODrive motor controller (though any motor controller that allows querying power data can be used)
- Import our data into the Edge Impulse studio using the Data forwarder
- Discover how to create a K-means anomaly detection model that is small enough to run on a Raspberry Pi Pico

Finally, we will see how to use the Arduino library generated by Edge Impulse and combine it with our own custom code. Let's go!

Most of the magic is going to happen in the Edge Impulse studio, so if you're following along be sure to open a free account! For those who don't know, Edge Impulse is an engineer-focused platform designed to streamline the process of building machine learning models for edge / embedded devices.

Equipment and software

Next, for our motor setup we will be using an ODrive V3.6 (24V) controller with a brushless motor (D5065) from the same company. ODrive manufactures affordable yet robust motor control hardware that can be interfaced with via a USB or UART serial connection. That said, any motor controller that allows querying power data over a serial connection can be used!

Finally, for our main computing unit we will be using a Raspberry Pi Pico. This tiny board packs an RP2040 (Arm Cortex-M0+) microcontroller. Be sure to check out Edge Impulse's official getting-started guide for this board.

Before we get started...

...let's define our goal here. This project is meant to serve as a "get started" reference for leveraging Edge Impulse and TinyML to perform industrial motor predictive maintenance (PdM). From a quick glance at the ODrive developer reference document it is plain to see that there are plenty of data elements we could choose from to create features and build our machine learning model, but regardless of what you choose for your specific use case the steps should be similar.

For this tutorial we'll be using motor power as the driving parameter for our predictive maintenance model. Without using any additional sensors, and relying on motor power data alone, we can create an anomaly detection model that alerts us when faults are detected.

In future tutorials we will explore the use of classification models to recognize specific faults such as bearing faults, axle misalignments, axle disjoints, and so on. For now we will train our model using nominal, intact data only, and use anomaly detection to flag behavior that falls outside of the norm.

What can motor power data tell us?

Since electrical power = current * voltage, tracking a motor's power consumption allows us to consider the behavior of both current and voltage concurrently. A popular method used to monitor motor behavior is "Instantaneous Power Signature Analysis", or IPSA: essentially, a motor's power is analyzed in the frequency domain in order to uncover external interference, whether mechanical or electrical. You can read more about IPSA in this academic article [page 500]: Bonaldi, E.L., Oliveira, L., Borges da Silva, J., Lambert-Torres, G. & Silva, L.E. (2012). Predictive Maintenance by Electrical Signature Analysis to Induction Motors. 10.5772/48045.

In this tutorial we will be using spectral analysis to generate features for our anomaly detection model.

Data forwarder

One of Edge Impulse's most convenient tools is the data forwarder. It allows you to stream data directly to the Edge Impulse platform regardless of the data source. We will use an Arduino script to create a data stream that the data forwarder can listen to.

It's recommended to check out the official guide on Edge Impulse's data forwarder, and to take a look at the ODrive-specific data forwarding Arduino script attached to this tutorial ("odrive_data_forwarding.ino")! We'll run the attached code on our Arduino while it's connected via UART to our ODrive board; the ODrive UART interface is described in the ODrive documentation.

We don't need to use any external sensors (no accelerometers or microphones here!) - we can use the motor controller's built-in parameters and circuitry to gather powerful data.
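To make that concrete, here is a minimal sketch of the kind of thing the attached "odrive_data_forwarding.ino" script does: poll the controller for a power reading at a fixed rate and print one value per line over USB serial, which is the format the data forwarder listens for. Treat it as a sketch under assumptions rather than drop-in code: the ODrive ASCII property names, baud rates, and blocking reads below are simplifications, so check your controller's developer reference and adjust to your firmware and wiring.

// Hypothetical data-forwarding sketch: stream motor power readings over USB serial.
// Assumes the ODrive ASCII protocol on Serial1; property paths vary by firmware.
#include <Arduino.h>

static const uint32_t SAMPLE_INTERVAL_MS = 10;  // 100 Hz; must match the frequency set in your impulse

// Query a single float property from the ODrive ASCII protocol ("r <property>").
// parseFloat() blocks while waiting for the reply, which is fine for a simple demo.
float readOdriveFloat(const char *property) {
  Serial1.print("r ");
  Serial1.println(property);
  return Serial1.parseFloat();
}

void setup() {
  Serial.begin(115200);   // USB serial, read by the Edge Impulse data forwarder
  Serial1.begin(115200);  // UART link to the ODrive
}

void loop() {
  static uint32_t lastSample = 0;
  if (millis() - lastSample >= SAMPLE_INTERVAL_MS) {
    lastSample += SAMPLE_INTERVAL_MS;

    // Instantaneous power ~ bus voltage * measured motor current (assumed property names)
    float voltage = readOdriveFloat("vbus_voltage");
    float current = readOdriveFloat("axis0.motor.current_control.Iq_measured");

    Serial.println(voltage * current, 3);  // one sample per line, single axis
  }
}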

Collecting data

We need to collect data to train our machine learning model. For this tutorial we will be recording only nominal performance. That means there is no need to perform fault simulation in order to collect "faulty" data, which can be an intrusive and dangerous process. The anomaly detection model will tell us when the power signal from the motor is behaving in a way the model hasn't seen before.

There are plenty of ways to upload training data to Edge Impulse. For example, it's possible to upload .CSV files with the relevant data, or you can use the convenient Edge Impulse Data Forwarder to stream data live over serial, which is what we'll be doing in this tutorial.

Let's say we set a motor rotating with random changes in velocity in order to simulate the motion of a robotic arm or something of the like. Here's a video of what that might look like:

Undisturbed pseudo-random motor movement

Let's take a look at 1 second of power consumption data from this motion:

Since the commanded motor velocity is constantly changing, so is the motor power signal.

Streaming our data to the Edge Impulse studio

Assuming you've gone through the Getting Started guide and have made an account with Edge Impulse, go ahead and create a new project.

Following the guide for using Edge Impulse's data forwarder, we should see our virtual device appear when we click Data acquisition:

Let's set our motor in motion (as shown in the video above) and click "Start sampling" to begin sending data to the Edge Impulse platform. Feel free to leave this running for a long time; the more data the better!

Creating an impulse

It's time to create an impulse, which comprises a series of configuration and processing blocks. First up is setting the type of data we're using, which in this case is "Time series data". This block will split our long samples into shorter windows. Next we decide what type of processing we want performed on our data; in this tutorial we'll be using "Spectral analysis", which looks at the behavior of the signals in the frequency domain. Finally we decide what sort of learning block we want to feed the results of the spectral analysis to. For this we will select "Anomaly detection (K-means)":

This impulse is the heart of our model. Once we click "Save Impulse" we can move to the "Spectral analysis" screen (selected from the menu on the left).

Spectral analysis

When we train our machine learning model we're not actually feeding it raw, signal-level samples; rather, we feed it features generated by digital signal processing (DSP). Using spectral analysis we can create a compact set of numbers describing how our signal behaves in the frequency domain.
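To give a feel for what those features are, the toy program below computes two simple frequency-domain descriptors (the window RMS and the magnitude of the dominant frequency bin) from a one-second window of power samples using a naive DFT. This is only an illustration of the kind of numbers a spectral analysis block derives from each window, not Edge Impulse's actual DSP implementation or feature set.

// Illustrative only: RMS and dominant-frequency magnitude for one window of samples.
#include <cmath>
#include <cstdio>
#include <vector>

int main() {
  const float PI_F = 3.14159265f;
  const float fs = 100.0f;   // assumed sampling frequency [Hz]
  const int   n  = 100;      // one second of samples

  // Fake "motor power" signal: a 5 Hz ripple riding on a DC offset
  std::vector<float> window(n);
  for (int i = 0; i < n; i++)
    window[i] = 1.0f + 0.2f * std::sin(2.0f * PI_F * 5.0f * i / fs);

  // Feature 1: RMS of the window
  float sumSq = 0.0f;
  for (float v : window) sumSq += v * v;
  float rms = std::sqrt(sumSq / n);

  // Feature 2: strongest non-DC frequency bin, found with a naive DFT
  int bestBin = 1;
  float bestMag = 0.0f;
  for (int k = 1; k < n / 2; k++) {
    float re = 0.0f, im = 0.0f;
    for (int i = 0; i < n; i++) {
      re += window[i] * std::cos(2.0f * PI_F * k * i / n);
      im -= window[i] * std::sin(2.0f * PI_F * k * i / n);
    }
    float mag = std::sqrt(re * re + im * im);
    if (mag > bestMag) { bestMag = mag; bestBin = k; }
  }

  std::printf("RMS: %.3f, peak at %.1f Hz (magnitude %.3f)\n",
              rms, bestBin * fs / n, bestMag);
  return 0;
}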

After we click "Save Impulse", let's navigate to the "Spectral analysis" window. Here we can make adjustments to the DSP block in our impulse. Among other things we can set filter types and immediately view the filtered data. The default filter setting is a low pass filter, but this can and should be adjusted according to the type of anomalies the engineer is trying to detect.

Once we're happy with our DSP block settings, we can click "Save parameters" and then navigate to the next screen - "Generate features".

Generating features

Now we're ready to apply the signal processing to our data. In the "Generate features" screen, click "Generate features" and wait a few moments for Edge Impulse to create spectral features for each one of the samples:

Anomaly Detection

We're ready to move on to the next block, where we create our machine learning model. We're almost done! Once we've generated the DSP features we can navigate to the next screen, "Anomaly detection", from the menu on the left.

On this screen we can set the number of clusters, as well as select the axes according to which our data will be clustered. For this example all axes were selected, but if you know that certain axes are more or less important it's best to select them accordingly. (This can be determined by using samples where the motor is experiencing faulty behavior together with the Calculate feature importance option in the Generate features section; see the Edge Impulse documentation for more on this.)

In the graph above, each sample is represented by a dot, and the surrounding circles represent the clusters. These clusters can be thought of as regions of typical behavior. Our model will notify us not only when a new sample falls outside of the clusters, but by how much!
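Conceptually, the anomaly score for a new window of features is driven by how far it lands from the nearest learned cluster. The toy function below shows one way such a score could be computed (distance to the nearest cluster centre, normalised by that cluster's radius); it illustrates the idea only and is not Edge Impulse's actual scoring code, which ships inside the exported library.

// Toy K-means anomaly scoring: normalised distance to the nearest cluster centre.
// Scores well above 1.0 suggest the sample sits outside the regions of typical behavior.
#include <cmath>
#include <cstdio>
#include <vector>

struct Cluster {
  std::vector<float> centre;  // cluster centre learned from nominal data
  float radius;               // rough size of the cluster
};

float anomalyScore(const std::vector<float>& features, const std::vector<Cluster>& clusters) {
  float best = INFINITY;
  for (const Cluster& c : clusters) {
    float distSq = 0.0f;
    for (size_t i = 0; i < features.size(); i++) {
      float d = features[i] - c.centre[i];
      distSq += d * d;
    }
    best = std::fmin(best, std::sqrt(distSq) / c.radius);
  }
  return best;
}

int main() {
  // Two made-up clusters (2-D features for brevity)
  std::vector<Cluster> clusters = { {{1.0f, 0.5f}, 0.3f}, {{2.0f, 1.5f}, 0.4f} };
  std::printf("nominal-ish sample: %.2f\n", anomalyScore({1.1f, 0.6f}, clusters));
  std::printf("anomalous sample:   %.2f\n", anomalyScore({4.0f, 3.0f}, clusters));
  return 0;
}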

Deploying our model

Our model is ready for deployment in the format of our choosing! For this tutorial we'll export it as an Arduino library that we can invoke from within our own Arduino sketches. Navigate to the "Deployment" screen, select "Arduino library", and hit "Build" at the bottom of the screen.

Classifying data with Arduino

In the Arduino IDE we need to import the exported .ZIP file via Sketch -> Include Library -> Add .ZIP Library... Once we've added the .ZIP file from Edge Impulse we can import our inferencing library like this:

#include <your_project_inference.h>

(You can peek inside the src folder of the project .ZIP to see the exact name of the header file!)

It's recommended to follow Edge Impulse's Arduino library deployment guide to learn how to call your model from within Arduino code. The example in the guide is for use with an accelerometer, but the principle is the same!
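For reference, running inference from your own sketch looks roughly like the snippet below, which follows the pattern used in the Edge Impulse Arduino examples (fill a buffer, wrap it in a signal_t, call run_classifier, then read result.anomaly). The readMotorPower() helper is a hypothetical placeholder for your own controller-polling code, and the header name will match whatever your exported project is called.

// Sketch of on-device inference with the exported Arduino library.
#include <your_project_inference.h>

static float features[EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE];

// Placeholder: replace with your own ODrive (or other controller) power query
float readMotorPower() {
  return 0.0f;
}

void setup() {
  Serial.begin(115200);
}

void loop() {
  // Fill one window of samples at the interval the model was trained with
  for (size_t i = 0; i < EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE; i++) {
    features[i] = readMotorPower();
    delay(EI_CLASSIFIER_INTERVAL_MS);
  }

  // Wrap the raw buffer in a signal_t and run the impulse
  signal_t signal;
  numpy::signal_from_buffer(features, EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE, &signal);

  ei_impulse_result_t result = { 0 };
  if (run_classifier(&signal, &result, false) != EI_IMPULSE_OK) {
    return;  // something went wrong; try again on the next pass
  }

  // With a K-means anomaly block the interesting number is result.anomaly:
  // the higher it is, the less the window looks like the nominal training data.
  Serial.print("Anomaly score: ");
  Serial.println(result.anomaly, 3);
}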

Putting it all together

Combining the example code from the guide referenced above with custom code for gathering data from our ODrive (or whatever motor controller you happen to be using!), we'll end up with something like this. Please feel free to ask questions about the code if something is unclear.

Thanks for reading! Take a look at these videos showing the anomaly detection model in action:

- Video of anomaly detection (only)
- Full video walkthrough
