For ML practitioners


Welcome to Edge Impulse! Whether you are a machine learning engineer, MLOps engineer, data scientist, or researcher, we have developed professional tools to help you build and optimize models to run efficiently on any edge device.

In this guide, we'll explore how Edge Impulse empowers you to bring your expertise and your own models to the world of edge AI using either the Edge Impulse Studio, our visual interface, or the Edge Impulse Python SDK, available as a pip package.

Why Edge Impulse, for ML practitioners?

Flexibility: You can choose to work with the tools you are already familiar with and import your models, architectures, and feature processing algorithms into the platform. This means you can leverage your existing knowledge and workflows seamlessly. Or, if you prefer an all-in-one solution, Edge Impulse provides enterprise-grade tools for your entire machine learning pipeline.

Optimized for edge devices: Edge Impulse is designed specifically for deploying machine learning models on edge devices, which are typically resource-constrained, from low-power MCUs up to powerful edge GPUs. We provide tools to optimize your models for edge deployment, ensuring efficient resource usage and peak performance. Focus on developing the best models; we will provide feedback on whether they can run on your hardware target!

Data pipelines: We have developed strong expertise in complex data pipelines (including clinical data) while working with our customers. We support data coming from multiple sources, in any format, and provide tools to perform data alignment and validation checks, all in customizable multi-stage pipelines. This means you can build gold-standard labeled datasets that can then be imported into your project to train your models.

Getting started in a few steps

In this getting started guide, we'll walk you through the two different approaches to bringing your expertise to edge devices: starting from your dataset or from an existing model.

Start with existing data

First, start by creating your Edge Impulse account.

We currently accept various file types, including .cbor, .json, .csv, .wav, .jpg, .png, .mp4, and .avi.

You can import data using the Studio Uploader, the CLI Uploader, or our Ingestion API. These allow you to easily upload and manage your existing data samples and datasets in Edge Impulse Studio.

If you are working with image datasets, the Studio Uploader and the CLI Uploader currently handle these dataset annotation formats: Edge Impulse object detection, COCO JSON, Open Images CSV, Pascal VOC XML, Plain CSV, and YOLO TXT.
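As an illustration, a single labeled sample can be pushed to your project with one HTTP request to the Ingestion API. Here is a minimal sketch, assuming a hypothetical local file wave.01.wav and a placeholder API key (find yours under your project's dashboard):

import requests

API_KEY = "ei_dae27..."  # placeholder: your project API key

# Upload one labeled WAV file to the training set via the Ingestion API
with open("wave.01.wav", "rb") as f:
    res = requests.post(
        "https://ingestion.edgeimpulse.com/api/training/files",
        headers={"x-api-key": API_KEY, "x-label": "wave"},
        files={"data": ("wave.01.wav", f, "audio/wav")},
    )
res.raise_for_status()
print(res.status_code, res.text)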

Organization data

Since the creation of Edge Impulse, we have been helping our customers deal with complex data pipelines, complex data transformation methods, and complex clinical validation studies.

Organization data gives you tools to centralize, validate, and transform datasets so they can be easily imported into your projects. See the Organization data documentation.

To visualize how your labeled data items are clustered, use the data explorer feature, available for most dataset types, where we apply dimensionality reduction techniques (t-SNE or PCA) to your embeddings.

To extract features from your data items, either choose an available processing block (MFE, MFCC, spectral analysis using FFT or Wavelets, etc.) or create your own from your expertise. Custom processing blocks can be written in any language.

Similarly, to train your machine learning model, you can choose from different learning blocks (Classification, Anomaly Detection, Regression, Image or Audio Transfer Learning, Object Detection). In most of these blocks, we expose the Keras API in an expert mode, as sketched below. You can also bring your own architecture/training pipeline as a custom learning block.
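To give a flavor of expert mode, here is the kind of Keras definition you can edit there. This is an illustrative sketch, not the exact code the Studio generates; classes, train_dataset, and validation_dataset are provided by the surrounding training script:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout

# A small fully connected classifier; layers can be tweaked freely
model = Sequential([
    Dense(20, activation="relu"),
    Dropout(0.25),
    Dense(10, activation="relu"),
    Dense(classes, activation="softmax"),  # `classes`: number of labels
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_dataset, validation_data=validation_dataset, epochs=30)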

Each block will provide on-device performance information showing you the estimated RAM, flash, and latency.

Start with an existing model

If you have already been working on models for your edge AI applications, Edge Impulse offers an easy way to upload your models and profile them. This way, in just a few minutes, you will know whether your model can run on real devices, and what the on-device performance will be (RAM, flash usage, and latency).

You can do this directly from the Studio BYOM feature or using the Edge Impulse Python SDK.

The Edge Impulse Python SDK is available as a pip package:

python -m pip install edgeimpulse

From there, you can profile your existing models:

import edgeimpulse as ei

# API key from your project's dashboard; `model` is your trained model
# (e.g., a Keras model or a path to a saved .tflite/.onnx file)
ei.API_KEY = "ei_dae27..."

# Estimate RAM, flash usage, and latency for a given hardware target
profile = ei.model.profile(model=model, device='cortex-m4f-80mhz')
print(profile.summary())

# Convert and package the model as a deployable artifact (here, a .zip C++ library)
ei.model.deploy(model=model,
                model_output_type=ei.model.output_type.Classification(),
                deploy_target='zip')

Run the inference on a device

If you target MCU-based devices, you can generate ready-to-flash binaries for all the officially supported hardware targets. This method lets you test your model on real hardware very quickly. You can also directly generate a customizable library or any other supported deployment type.

If you target Linux-based devices, you can easily export your model in the .eim format, a Linux executable that contains your signal processing and ML code, compiled with optimizations for your processor or GPU. This executable can then be called with our Linux inferencing libraries; we have inferencing libraries and examples for Python, Node.js, C++, and Go.
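For example, here is a minimal sketch of calling an exported model from Python with the Linux Python SDK. The model file name is hypothetical, and the placeholder features must be replaced with real sensor values:

import os
from edge_impulse_linux.runner import ImpulseRunner

# Path to the .eim file exported from your project
runner = ImpulseRunner(os.path.abspath("modelfile.eim"))
try:
    model_info = runner.init()  # starts the .eim and returns its metadata
    print("Loaded", model_info["project"]["name"])

    # Placeholder input: replace the zeros with real values matching your
    # impulse's input frame (raw audio samples, accelerometer axes, pixels, ...)
    n = model_info["model_parameters"]["input_features_count"]
    features = [0.0] * n

    result = runner.classify(features)
    print(result["result"])
finally:
    runner.stop()  # always release the runner process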

In both cases, we provide profiling information about your models so you can make sure they will fit your edge device's constraints.

Tutorials and resources, for ML practitioners

End-to-end tutorials

If you want to get familiar with the full end-to-end flow using Edge Impulse Studio, please have a look at our end-to-end tutorials on continuous motion recognition, responding to your voice, recognizing sounds from audio, adding sight to your sensors, or object detection.

To understand the full potential of Edge Impulse, see our health reference design, which describes an end-to-end ML workflow for building a wearable health product using Edge Impulse. It handles data coming from multiple sources, data alignment, and a multi-stage pipeline before the data is imported into an Edge Impulse project.

Edge Impulse Python SDK tutorials

While the Edge Impulse Studio is a great interface for guiding you through the process of collecting data and training a model, the Python SDK allows you to programmatically Bring Your Own Model (BYOM), developed and trained on any platform:

• Using the Edge Impulse Python SDK with TensorFlow and Keras
• Using the Edge Impulse Python SDK with Hugging Face
• Using the Edge Impulse Python SDK with Weights & Biases
• Using the Edge Impulse Python SDK with SageMaker Studio

Other useful resources

• BYOM (Bring Your Own Model)
• Custom learning blocks (Bring Your Own Architecture)
• Expert mode (access Keras API in the studio)
• Generate synthetic datasets

Integrations

• Weights & Biases
• NVIDIA TAO Toolkit (access state-of-the-art pre-trained models)
• NVIDIA Omniverse