Export your trained impulse from Edge Impulse Studio for Linux aarch64 or Linux Arduino UNO Q (GPU). Then, import it on your UNO Q to run edge AI applications that use your custom model.

Arduino App Lab version 0.1.23
Prerequisites
Before you start, make sure you have completed the Arduino UNO Q setup using a monitor, keyboard, and mouse to enable WiFi and SSH access. Additionally, make sure you followed one of the following tutorials and have a trained impulse:
- Sound recognition
- Keyword spotting
- Image classification
- Motion recognition with anomaly detection (only for boards with built-in IMU sensor)
- Object detection with centroids (FOMO)
Deploying an example application
The Arduino App Lab launches automatically when the UNO Q starts. If you don’t see it, go to Applications (top left corner), navigate to Accessories, and click on Arduino App Lab.
Keyword spotting
First, we are going to deploy the Hey Arduino application from the App Lab examples onto the Arduino UNO Q.

Hey Arduino keyword spotting application from Arduino App Lab version 0.1.23
The Hey Arduino application is a keyword spotting application designed to trigger an LED matrix animation (a heart shape animation) when the phrase “Hey Arduino” is detected by the microphone.
This application is interesting as it takes advantage of both the CPU and the MCU available on the Arduino UNO Q. The CPU handles the keyword spotting, and the onboard microcontroller runs the Sketch to visualize the LED matrix animations once the keyword has been recognized.

Hey Arduino app.yaml file from Arduino App Lab version 0.1.23
The application's app.yaml file includes the keyword_spotting brick. A brick is like a code package that provides a specific functionality needed by the application. In this case, it adds the ability to detect sound patterns through a USB microphone and trigger an event when a match occurs.
The keyword_spotting brick uses a pre-trained Edge Impulse audio classification model that identifies “Hey Arduino”. It continuously monitors the audio and, when it detects the keyword, it triggers the microcontroller using the Bridge tool to activate the LED animation.
To deploy the application on your Arduino UNO Q, click the green button in the top right corner. The App Lab then starts deploying the application locally.

Hey Arduino application deploying to the Arduino UNO Q from the Arduino App Lab

Hey Arduino matrix animation on the Arduino UNO Q
Object detection
If you have connected a USB webcam to your Arduino UNO Q, you can test the Detect objects on Camera application.
This application uses two bricks: the video_object_detection brick and the WebUI - HTML brick, which hosts a web application and exposes APIs or WebSockets to be used in the application.

Detect Object on Camera application from Arduino App Lab version 0.1.23
The application uses the video_object_detection brick to find objects in a live video feed from the camera.
To access the UI of the application, open a browser and go to the local IP address of your Arduino UNO Q on port 7000.
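If you are unsure of the board's IP address, you can check it from a terminal on the UNO Q itself; the address shown here is only an example and yours will differ:

    hostname -I                    # prints the board's IP address, e.g. 192.168.1.50
    # then open http://192.168.1.50:7000 in a browser on the same network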

Testing the Video Generic Object Detection application after deploying the Detect Object on Camera application from Arduino App Lab into the Arduino UNO Q
Deploying a custom application
Now that you have tested the Arduino App Lab on your Arduino UNO Q, it’s time to create your own application using an impulse trained in your Edge Impulse account. We are going to replicate the Detect Objects on Camera application, but we will use a custom model instead of the pre-trained model that comes with the brick. For this tutorial, we will use a face detection model.
Creating a new app
Click on “My Apps” in the left menu of the Arduino App Lab and then click the button in the top right corner called Create new app +.

My Apps section in Arduino App Lab version 0.1.23
Give your new application a name and add the two bricks it needs (video_object_detection and web_ui).

New application created with bricks in Arduino App Lab version 0.1.23
This tutorial uses Arduino App Lab version 0.1.23.
Copying the impulse
Next, you will need the .eim file generated by Edge Impulse Studio for Linux aarch64. You can either download the model using the Edge Impulse Linux CLI tools or build and download it from the Deployment page in Edge Impulse Studio.
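As a sketch, assuming the Edge Impulse Linux CLI is installed on the board and you are logged into your project, the Linux runner can fetch the .eim build for you (run it on the UNO Q so the download matches the board's aarch64 architecture; the file name is just an example):

    edge-impulse-linux-runner --download modelfile.eim   # downloads the model file without starting inference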
You can manually copy the .eim file to the following folder on the Arduino UNO Q, or use VS Code (see below):

The video_object_detection brick will use it for inference instead of the default pre-trained model.
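If you prefer the command line to VS Code for the manual copy, a minimal sketch from your computer looks like this; the user name, IP address, and destination folder are placeholders for your own setup:

    scp modelfile.eim arduino@192.168.1.50:/path/to/your/app/folder/   # placeholders: adjust the user, IP, and the folder mentioned above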
Using VS Code
To copy the file from your local computer to the Arduino UNO Q using VS Code, follow these instructions:
- Connect to your Arduino UNO Q via SSH using the Remote SSH extension
- Open the target folder
- Drag and drop the .eim file from your local computer into that folder

VS Code access to the Arduino UNO Q to develop the new application
Building the application
Go to your application folder and edit the app.yaml file from SSH, adding a new variable for the video_object_detection brick. The custom model is selected through the EI_OBJ_DETECTION_MODEL variable, and you can update models for other bricks or use cases using similar variables.
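As an illustration only, you can make that change directly over SSH; the application folder name below is a placeholder, and the exact syntax for setting the variable inside app.yaml follows the video_object_detection brick's documentation:

    cd ~/my-face-detection        # placeholder: the folder where your application lives
    nano app.yaml                 # set EI_OBJ_DETECTION_MODEL so it points at your custom .eim file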
Copying additional assets
After you have saved your changes, follow the instructions below to copy the assets and WebUI files from the example to your application, along with the main Python script. Your new application behaves like the Detect objects on Camera example that we used before, except it uses your own custom model trained with Edge Impulse Studio.
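As a rough sketch of that copy step, assuming the example and your application sit side by side in the same apps directory (every path below is a placeholder):

    cp -r ~/apps/detect-objects-on-camera/assets  ~/apps/my-face-detection/    # example assets (placeholder paths)
    cp -r ~/apps/detect-objects-on-camera/web     ~/apps/my-face-detection/    # WebUI files (placeholder paths)
    cp    ~/apps/detect-objects-on-camera/main.py ~/apps/my-face-detection/    # main Python script (placeholder paths)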
Running the application
To run the application, type:
New application deployed in the Arduino UNO Q via Arduino App Lab with an Edge Impulse custom model