
Introduction
If you work with machine learning, you know how much time it takes to collect data and train a model. One of the great things about the AI ecosystem, though, is that ready-made datasets and models can help speed up your projects. Edge Impulse has a new feature called Bring Your Own Model (BYOM) that lets you import, optimize, and deploy a pre-trained model in any of these formats: TensorFlow SavedModel, ONNX, or TensorFlow Lite. You can then deploy and run projects with that model on a wide range of boards and devices. I decided to try out this feature by importing an American Sign Language (ASL) model and deploying it to a Texas Instruments AM62A Development Kit. I also created a simple ASL training game in Python.

American Sign Language Model

Model Import
By the time I actually started the project, the ASL Hugging Face model I originally wanted to use was no longer available. So I decided to train a simple ASL model outside Edge Impulse, export it as a TensorFlow Lite file, import it into Edge Impulse, deploy it to the Texas Instruments AM62A board, and develop a simple Python training game.

Note: the signs used for training were not made by someone with experience in ASL, so detection rates could be far from perfect.

The first step was to take the pictures. I used a smartphone camera and took at least 3 pictures for every sign, based on a standard ASL chart.



The export was a .zip file download, which I then unzipped. Inside, I found 2 files: the model with the .tflite extension, and a label .txt file. The label file has 2 columns: order and label name. To prepare everything for Edge Impulse BYOM, I removed the order column and compiled everything into one comma-separated row.
Example:
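That conversion can be sketched in a few lines of Python (a sketch; the labels.txt filename and the "order label" line layout are assumptions based on the description above):

```python
# Convert the two-column label file ("<order> <label>" per line)
# into the single comma-separated row that Edge Impulse BYOM expects.
# The exact input layout is an assumption based on the description above.

def flatten_labels(lines):
    """Drop the order column and join the label names into one row."""
    labels = [line.split()[-1] for line in lines if line.strip()]
    return ",".join(labels)

# Usage sketch:
# with open("labels.txt") as f:
#     print(flatten_labels(f))
```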

BYOM Procedure
If you don’t have an Edge Impulse account, you can create one for free. I logged in to my existing Edge Impulse account and created a new project named BYOM, though any name could be used. I then clicked “Upload Your Model” and selected the TensorFlow Lite file.


Deployment
The deployment procedure varies from board to board. In this case, the deployment is made directly from the Texas Instruments AM62A command line, so there is no need to export a file from the Edge Impulse platform.

Texas Instruments AM62A Setup and Deployment
- Download this operating system image version: https://www.ti.com/tool/download/PROCESSOR-SDK-LINUX-AM62A/08.06.00.45
- Flash the image to a 16 GB or larger microSD card with Balena Etcher or any other similar software
- Connect the power supply, HDMI, USB camera, and Ethernet cable
- Check the board IP on the HDMI screen when the board boots up and the default application loads
- Log in to that IP using PuTTY or any other SSH client, with root as the user and no password
- Run npm config set user root && sudo npm install edge-impulse-linux -g --unsafe-perm
- Run pip3 install art (a library to display bigger letters)
- Run edge-impulse-linux-runner. The first time you run this, you will need to log in to your Edge Impulse account and select the BYOM project. Once it is running, launch a web browser and navigate to your board’s IP address on port 4912: for example, http://192.168.1.66:4912 in my case.
- Download the am62a_signlanguage.py file from the GitHub repository and upload the script to the AM62A board using SFTP. The credentials are the same as for SSH: your board’s IP address, root as the user, and no password.
- Run python3 am62a_signlanguage.py
Training Game
You will see a letter printed on the screen, and you then have to form that letter’s sign in front of the camera. The script shows the inference confidence percentage and counts down the seconds until detection.
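The core game logic can be sketched independently of the camera pipeline (a sketch; on the board the (label, confidence) pairs would come from the Edge Impulse runner’s classification output, and the confidence threshold and hold time used here are assumptions):

```python
import random
import string

def play_round(target, classifications, threshold=0.8, hold_frames=3):
    """Count frames until `target` is held above `threshold` for
    `hold_frames` consecutive frames; return that frame number, or
    None if the stream of classifications ends first."""
    streak = 0
    for frame, (label, confidence) in enumerate(classifications, start=1):
        if label == target and confidence >= threshold:
            streak += 1
            if streak >= hold_frames:
                return frame
        else:
            streak = 0  # a miss resets the countdown
    return None

# Pick a random letter to ask for; on the board, the 'art' package
# would print it in large ASCII before classification starts.
target = random.choice(string.ascii_uppercase)
```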

Demo Video
Conclusions
The ability to import a pre-trained model into the Edge Impulse platform is a very valuable feature. In a matter of minutes, using a simple and intuitive interface, the imported model can be deployed to one of the many available platforms, such as Arduino, BrainChip, Cube, Ethos, OpenMV, Linux, Mac, Sony’s Spresense, etc. It is important to note that Edge Impulse’s visualizations and tuning are not available for external pre-trained models, so you should check the model’s quality in advance.

Resources
Files
- Source Code: https://github.com/ronibandini/ASLTrainer
- Edge Impulse Public Project: https://studio.edgeimpulse.com/public/270046/latest
- American Sign Language model: https://huggingface.co/ronibandini/AmericanSignLanguage (this model has all the letters)