What is edge machine learning (edge ML)?

Edge machine learning is a fast-growing field. This article looks at the history of machine learning and how it is being applied to Internet of Things (IoT) devices to save power, reduce latency, and preserve user privacy.


Edge machine learning (edge ML) is the process of running machine learning algorithms on computing devices at the periphery of a network to make decisions and predictions as close as possible to the originating source of data. It is also referred to as edge artificial intelligence or edge AI.

In traditional machine learning, we often find large servers processing heaps of data collected from the Internet to provide some benefit, such as predicting which movie to watch next or automatically labeling a cat video. By running machine learning algorithms on edge devices like laptops, smartphones, and embedded systems (such as those found in smartwatches, washing machines, cars, manufacturing robots, etc.), we can produce such predictions faster and without the need to transmit large amounts of raw data across a network.

To accurately describe edge ML, we first need to understand the history of artificial intelligence (AI).

Artificial intelligence vs. machine learning

The name “artificial intelligence” originates from a proposal in 1956 by John McCarthy, Marvin Minsky, Nathaniel Rochester, and Claude Shannon to host a summer research conference exploring the possibility of programming computers to “simulate many of the higher functions of the human brain.”

Many years later, McCarthy would define AI as “the science and engineering of making intelligent machines, especially intelligent computer programs,” where the definition of intelligence is “the computational part of the ability to achieve goals in the world.” From this definition, we see that AI is an extremely broad field of study involving the use of computers to make decisions to achieve arbitrary goals.

AI researcher Arthur Samuel trained a computer to play checkers better than most humans by having the program play thousands of games against itself and learning from each iteration. He coined the term “machine learning” in his 1959 paper to mean any program that can learn from experience.

The term “deep learning” (DL) comes from a 1986 paper by the mathematician and computer scientist Rina Dechter. She used the term to describe ML models that can be trained to automatically learn features or representations. We often use the term deep learning to describe artificial neural networks with more than a few layers, but it can be used more broadly to refer to other forms of machine learning.

From these definitions, we can view deep learning as a subset of machine learning, which is a subset of artificial intelligence. As a result, all DL algorithms can be considered ML and AI. However, not all AI is ML.

Since the early days of AI, advances in algorithms, software, and hardware have allowed us to begin using machine learning in helpful and unique ways.

Modern machine learning

In the early 2010s, the Google Brain team worked to make deep learning more accessible, which resulted in the creation of the popular TensorFlow framework. The team made headlines in 2012 when they created a model that could accurately classify an image as “cat or not cat.”

Since then, AI has soared in popularity, mostly due to the research and development of complex deep neural networks. Powerful graphics cards and server clusters could be employed to speed up the training and inference processes required for deep learning.

These powerful algorithms are used every day to perform a variety of helpful tasks, such as:

  • Image and video labeling

  • Speech recognition and synthesis

  • Language translation

  • Product and content recommendations

  • Email spam filtering

  • Credit card fraud detection

  • Market and customer segmentation

  • Stock market trading

To train these complex machine learning models, we need enormous amounts of data. Thanks to the Internet, that data can be readily obtained from shared pre-made datasets or by actively collecting information in real time (e.g. usage statistics of a website). If we want to collect data from the world around us, we need to rely on sensors.

The Internet of Things

The Internet of Things (IoT) is the collection of sensors, hardware devices, and software that exchange information with other devices and computers across communication networks. We often think of IoT as a series of sensors with Wi-Fi or Bluetooth connectivity that can relay information about the environment to us.

In 1982, a few graduate students in the computer science department at Carnegie-Mellon connected a Coca-Cola vending machine to the Internet for fun. The machine would display its temperature and various soda stock in real time to a web page. This project is the first known instance of IoT.

For many years, IoT was known as “machine to machine” (M2M). It involved connecting sensors and automating control processes between various computing devices, and it saw wide adoption in industrial machines and processes.

Machine learning offers the ability to create further advancements in automation by introducing models that can make predictions or decisions without human intervention. Due to the complex nature of many machine learning algorithms, the traditional integration of IoT and ML involves sending raw sensor data to a central server, which performs the necessary inference calculations to generate a prediction.
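To illustrate this traditional pattern, here is a minimal sketch in Python in which the device only collects a reading and ships it to a central server that runs the model. The endpoint URL, JSON schema, and sensor function are hypothetical placeholders, not a real Edge Impulse API:

```python
import requests

def read_temperature_sensor() -> float:
    """Stand-in for a real sensor driver; returns a fake reading."""
    return 23.7

def classify_in_cloud(reading: float) -> str:
    # Every prediction costs a network round trip carrying raw data.
    response = requests.post(
        "https://example.com/api/predict",   # hypothetical inference server
        json={"sensor": "temperature", "value": reading},
        timeout=5,
    )
    response.raise_for_status()
    return response.json()["label"]          # e.g. "normal" or "anomaly"

print(classify_in_cloud(read_temperature_sensor()))
```

In a real system the server would host a much larger model, but the data flow is the same: raw readings go out over the network, and predictions come back.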

For low volumes of raw data and complex models, this configuration may be acceptable. However, there are several potential issues that arise:

  • Transmitting large amounts of sensor data, such as images, can hog network bandwidth

  • Transmitting data also requires power

  • The sensors require a constant connection to the server to provide near real-time ML computations

To counter the need to transmit large amounts of raw data across networks, data storage and some computations can be accomplished on devices closer to the user or sensor, known as the “edge.” Qualcomm’s Karim Arabi, in his 2014 IEEE DAC keynote and 2015 MIT MTL Seminar, defined edge computing as all computing happening outside of the cloud. Edge computing stands in contrast to cloud computing, where remote data and services are available on demand to users.

Edge computing includes personal computers and smartphones in addition to embedded systems (such as those that comprise the Internet of Things). To make all of these devices smarter and less reliant on backend servers, we turn to edge machine learning.

Edge and embedded machine learning

Edge ML includes personal computers, smartphones, and embedded systems. As a result, embedded ML, also known as tinyML, is a subset of edge ML that focuses on running machine learning algorithms on embedded systems, such as microcontrollers and headless single board computers.

Advances in hardware and machine learning have paved the way for running deep ML models efficiently on edge devices. Complex tasks, such as object detection, natural language processing, and model training, still require powerful computers. In these cases, raw data is often collected and sent to a server for processing.

However, performing ML on low-power devices offers a variety of benefits:

  • Less network bandwidth is spent on transmitting raw data

  • While some information may need to be transmitted over a network (e.g. inference results), less communication often means reduced power usage

  • Prediction results are available immediately without the need to send them across a network

  • Inference can be performed without a connection to a network

  • User privacy is ensured, as data is only stored long enough to perform inference (not including data collected for model training)

In most cases, training a machine learning model is more computationally intensive than performing inference.

Model: the mathematical formula that attempts to generalize information from a given set of data.

Training: the process of automatically updating the parameters in a model from data. The model “learns” to draw conclusions and make generalizations about the data.

Inference: the process of providing new, unseen data to a trained model to make a prediction, decision, or classification about the new data.
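To make these three terms concrete, here is a minimal, self-contained sketch using a toy linear model (an illustrative example, not an Edge Impulse workflow): training iterates over the whole dataset many times to update the parameters, while inference on a new reading is a single multiply and add.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Synthetic dataset: 10,000 noisy examples of y = 3x + 2.
x = rng.uniform(-1.0, 1.0, size=10_000)
y = 3.0 * x + 2.0 + rng.normal(scale=0.1, size=x.shape)

# Training: many passes over the full dataset, updating the model parameters
# with gradient descent. This is the computationally expensive part.
w, b = 0.0, 0.0
learning_rate = 0.1
for epoch in range(200):
    error = (w * x + b) - y
    w -= learning_rate * np.mean(error * x)   # gradient step for the weight
    b -= learning_rate * np.mean(error)       # gradient step for the bias

# Inference: applying the trained parameters to new, unseen data is cheap.
new_reading = 0.25
prediction = w * new_reading + b
print(f"w={w:.2f}, b={b:.2f}, prediction={prediction:.2f}")
```

Even in this toy example, training touches every example repeatedly, while inference runs in constant time on a single new input.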

As a result, we often rely on powerful server farms to train new models. This requires collecting data from the field (with sensors, scraping Internet images, etc.) to construct a dataset and using that dataset to train our machine learning model.

Note that in some cases, we can perform on-device training. However, this is often infeasible due to the memory and processing limitations of such edge devices.

Once we have a trained model, which is just a mathematical model (in the form of a software library), we can deploy it to our smart sensor or other edge device. We can write firmware or software using the model to gather new raw sensor readings, perform inference, and take some action based on those inference results.

Such actions might be autonomously driving a car, moving a robotic arm, or sending a notification of a faulty motor to a user. Because inference is performed locally on the edge device, the device does not need to maintain a network connection.
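As a minimal sketch of that on-device loop (with the trained model reduced to a hard-coded threshold, a simulated sensor, and a hypothetical notify function, rather than real Edge Impulse generated code):

```python
import math
import random
import time

# "Trained model": parameters learned offline and shipped with the firmware.
VIBRATION_THRESHOLD = 0.8   # RMS level above which we call the motor faulty

def read_vibration_samples(n: int = 64) -> list:
    """Stand-in for an accelerometer driver; returns simulated samples."""
    return [random.gauss(0.0, 0.5) for _ in range(n)]

def infer(samples) -> str:
    """Run inference locally: extract a feature and apply the model."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return "faulty" if rms > VIBRATION_THRESHOLD else "normal"

def notify_user(message: str) -> None:
    print(message)   # placeholder for an alert sent over MQTT, LoRa, etc.

while True:
    label = infer(read_vibration_samples())   # no network round trip required
    if label == "faulty":
        notify_user("Possible motor fault detected")
    time.sleep(1.0)   # sample roughly once per second
```

Only the result (or an alert) ever needs to leave the device, and only when there is something to report.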

ML models are not perfect. They provide a generalization of the training data. In other words, the model is only as good as the data used to train it. As a result, machine learning (and the subsequent field of data-driven engineering) will not replace traditional programming. However, it nicely complements other types of software engineering, and it opens new possibilities for solving difficult problems.

Edge ML use cases

The ability to run machine learning on edge devices without the need to maintain a connection to a more powerful computer allows for a variety of automation tools and smarter IoT systems. Here are a few examples where edge ML is enabling innovation in various industries.

Agriculture

  • Automatically identifying irrigation requirements

  • ML-powered robots for recording crop trials

Smart buildings

  • Security sensors that listen for the unique sound signature of glass breaking

  • Smart HVAC systems that can adapt to the number of people in a room

Environment conservation

  • Smart grid monitoring that looks for early faults in power lines

  • Wildlife tracking

Health and fitness

  • Portable medical devices that can identify diseases from images

See the Digital Health Solution Guide to learn more about this type of application.

Human-computer interaction (HCI)

  • Keyword spotting and wake word detection to control household appliances

  • Gesture control as assistive technology

Industry

  • Predictive maintenance that identifies faults in machinery before larger problems arise

  • Safety systems that automatically detect the presence of hard hats

The computational power required to perform machine learning at the edge is generally much higher than what is needed to simply poll a sensor and transmit raw data. However, performing such calculations locally often requires less electrical power than transmitting the raw data to a remote server.

The following chart offers some insights into the types of hardware required to perform machine learning inference at the edge depending on the desired application.

Edge ML is enabling technologies in new areas and allowing for novel solutions to problems. Some of these applications will be visible to consumers (such as keyword spotting on smart speakers) while others will be transforming our lives in invisible ways (such as smart grids delivering power more efficiently).

Learn more

Check out our Introduction to Edge AI course to learn more about edge computing, the difference between AI and machine learning, and edge MLOps.


Edge Impulse is the leading development platform for machine learning on edge devices. You can try creating your own wake word system in five minutes here.
