Multi-impulse vs multi-model vs sensor fusion

- **Multi-impulse** refers to running two separate projects (different data, different DSP blocks, and different models) on the same target. It requires modifying some files in the EI-generated SDKs; see the multi-impulse tutorial.
- **Multi-model** refers to running two different models (same data, same DSP block, but different TFLite models) on the same target. See how to run a motion classifier model and an anomaly detection model on the same device in this tutorial.
- **Sensor fusion** refers to combining data from different types of sensors to give more information to the neural network. To extract meaningful information from this data, you can use a single DSP block (as in this tutorial), multiple DSP blocks, or neural network embeddings, as in the sensor fusion using Embeddings tutorial.
1. Prerequisites
For this tutorial, you’ll need a supported device.

2. Building a dataset
For this demo, we’ll show you how to identify different environments using a fusion of temperature, humidity, pressure, and light data. In particular, I’ll have the Arduino board identify different rooms in my house as well as outside. Note that we assume the environment is static: if I turn out the lights or the outside temperature changes, the model will not work. However, it demonstrates how we can combine different sensor data with machine learning to do classification! As we will be collecting data from our Arduino board connected to a computer, it helps to have a laptop that you can move to different rooms.

Create a new project in Edge Impulse Studio. Connect the Arduino Nano 33 BLE to your computer. Follow the Arduino Nano 33 BLE Sense tutorial to upload the Edge Impulse firmware to the board and connect it to your project.
Arduino board connected to Edge Impulse project
Set the label to bedroom. Change Sensor to Environmental + Interactional, set the Sample length to 10000 ms and the Frequency to 12.5Hz.
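As a quick sanity check on these settings, each 10000 ms recording sampled at 12.5 Hz yields 125 readings per sensor axis:

```python
# Readings captured per recording at the settings above.
sample_length_ms = 10_000   # Sample length set in the studio
frequency_hz = 12.5         # Sampling frequency

readings_per_axis = int(sample_length_ms / 1000 * frequency_hz)
print(readings_per_axis)  # 125
```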

Record data from multiple sensors

Raw sensor readings
Variations: Try to stand in different parts of each room while collecting data.
- Bedroom
- Hallway
- Outside

Data split into training and testing sets
4. Design an Impulse
An impulse is a combination of preprocessing (DSP) blocks followed by machine learning blocks. It slices the data into smaller windows, uses signal processing to extract features, and then trains a machine learning model. Because we are using environmental and light data, which are slow-moving averages, we will use the Flatten block for preprocessing. Head to Create impulse and change the Window increase to 500 ms. Add a Flatten block. Notice that you can choose which environmental and interactional sensor data to include; deselect proximity and gesture, as we won’t need those to detect rooms. Finally, add a Classification (Keras) learning block.

Impulse designed to work with sensor fusion
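To see what the windowing step does conceptually, here is a minimal sketch of slicing a recording into overlapping windows. The 25-sample window size and 5-sample increase are made-up values for illustration; they are not the studio defaults.

```python
def sliding_windows(samples, window_size, window_increase):
    """Split a recording into overlapping windows, similar in spirit to
    how an impulse windows time-series data (illustrative only)."""
    windows = []
    start = 0
    while start + window_size <= len(samples):
        windows.append(samples[start:start + window_size])
        start += window_increase
    return windows

# A 125-reading recording (10 s at 12.5 Hz) with a hypothetical
# 25-sample window and a 5-sample (~400 ms) increase:
wins = sliding_windows(list(range(125)), window_size=25, window_increase=5)
print(len(wins))  # 21 overlapping windows
```

A smaller window increase produces more (and more overlapping) windows from the same recording, which is a cheap way to get more training examples from slow-moving signals.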
5. Configure the Flatten block
Head to Flatten. You can select different samples and move the window around to see what the DSP result will look like for each set of features to be sent to the learning block.
View processed features from one sample

View groupings of the most prominent features

View the most important features
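To build intuition for what the Flatten block sends to the learning block, here is an illustrative approximation: it reduces each window of each axis to a handful of summary statistics. This is a sketch of the idea, not the Edge Impulse implementation, and the exact set of statistics is configurable in the studio.

```python
import math

def flatten_features(window):
    """Per-axis summary statistics, similar in spirit to the Flatten
    block's outputs (an illustrative approximation, not the EI code)."""
    n = len(window)
    mean = sum(window) / n
    rms = math.sqrt(sum(x * x for x in window) / n)
    stdev = math.sqrt(sum((x - mean) ** 2 for x in window) / n)
    return {
        "average": mean,
        "minimum": min(window),
        "maximum": max(window),
        "rms": rms,
        "stdev": stdev,
    }

# One window of (made-up) temperature readings in deg C:
feats = flatten_features([22.1, 22.3, 22.2, 22.4])
```

Because room temperature, humidity, pressure, and light change slowly, these summary statistics capture most of the useful signal while shrinking each window to a few numbers per axis.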
6. Configure the neural network
With our dataset collected and features processed, we can train our machine learning model. Click on NN Classifier, change the Number of training cycles to 300, and click Start training. We will leave the neural network architecture as the default for this demo.

Neural network architecture

Confusion matrix of the validation set
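Under the hood, the classifier is a small fully connected network: the flattened features pass through one or more dense layers and end in a softmax that yields one probability per room class. The following toy forward pass illustrates that flow; the layer sizes and weights are invented for illustration, and real weights come from training.

```python
import math

def dense(x, weights, biases, activation):
    """One fully connected layer: weighted sums plus bias, then activation."""
    out = [sum(w_i * x_i for w_i, x_i in zip(w, x)) + b
           for w, b in zip(weights, biases)]
    return [activation(v) for v in out]

def relu(v):
    return max(0.0, v)

def softmax(xs):
    """Turn raw scores into probabilities that sum to 1."""
    m = max(xs)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

# Toy forward pass: 2 input features -> 3 hidden units -> 3 classes.
hidden = dense([0.5, -0.2],
               weights=[[0.1, 0.4], [-0.3, 0.2], [0.5, 0.5]],
               biases=[0.0, 0.1, -0.1],
               activation=relu)
logits = dense(hidden,
               weights=[[1.0, 0.0, 0.5], [0.0, 1.0, -0.5], [0.2, 0.2, 0.2]],
               biases=[0.0, 0.0, 0.0],
               activation=lambda v: v)
probs = softmax(logits)  # one probability per room class
```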
7. Model testing
Rather than simply assume that our model will work when deployed, we can run inference on our test dataset as well as on live data. First, head to Model testing, and click Classify all. After a few moments, you should see results from your test set.
Results from running inference on the test set

View detailed classification results from a test sample

Classify live data

Live classification results
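The test-set metrics the studio reports can be reproduced from first principles: accuracy is the fraction of correct predictions, and the confusion matrix counts predictions per (true, predicted) label pair. A minimal sketch, using made-up labels for the three classes in this demo:

```python
def confusion_matrix(y_true, y_pred, labels):
    """Count predictions for each (true label, predicted label) pair."""
    cm = {t: {p: 0 for p in labels} for t in labels}
    for t, p in zip(y_true, y_pred):
        cm[t][p] += 1
    return cm

def accuracy(y_true, y_pred):
    """Fraction of samples whose prediction matches the true label."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

labels = ["bedroom", "hallway", "outside"]
# Hypothetical test-set results, for illustration only:
y_true = ["bedroom", "bedroom", "hallway", "outside", "outside"]
y_pred = ["bedroom", "hallway", "hallway", "outside", "outside"]
cm = confusion_matrix(y_true, y_pred, labels)
print(accuracy(y_true, y_pred))  # 0.8
```

Off-diagonal counts (here, one bedroom sample classified as hallway) show which classes the model confuses, which often suggests collecting more data for those classes.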
8. Running the impulse on your device
Now that we have an impulse with a trained model and we’ve tested its functionality, we can deploy the model back to our device. This means the impulse can run locally without an internet connection to perform inference! Edge Impulse can package up the entire impulse (preprocessing block, neural network, and classification code) into a single library that you can include in your embedded software. Click on Deployment in the menu. Select the library that you would like to create, and click Build at the bottom of the page.
Deploy a trained machine learning model to any number of devices
Running your impulse locally
See the Deployments tutorials to learn how to deploy your impulse to a variety of platforms.