Texas Instruments SK-TDA4VM

The SK-TDA4VM is a Linux-enabled development kit from Texas Instruments aimed at smart cameras, robots, and ADAS applications that need multiple connectivity options and ML acceleration. The TDA4VM processor combines 8 TOPS of hardware-accelerated AI performance with low-power operation, making the kit suitable for a wide range of applications.

To take full advantage of the TDA4VM's AI hardware acceleration, Edge Impulse has integrated the TI Deep Learning (TIDL) library and TDA4VM-optimized EdgeAI models for low-to-no-code training and deployment from Edge Impulse Studio.

SK-TDA4VM

1. Installing dependencies

First, follow the TDA4VM Getting Started Guide to flash the Linux distribution onto the device's SD card.
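If you are flashing the SD card from a Linux host rather than a GUI tool, the extracted image can also be written with dd. This is a generic sketch; the filename is a placeholder for the image you downloaded per the Getting Started Guide, and /dev/sdX must be replaced with the actual SD card device (check with lsblk first, as dd overwrites the target):

sudo dd if=<extracted-image>.img of=/dev/sdX bs=4M status=progress conv=fsync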

To set this device up in Edge Impulse, run the following commands on the SK-TDA4VM:

npm config set user root && sudo npm install edge-impulse-linux -g --unsafe-perm
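To confirm the CLI installed correctly before continuing, you can list the globally installed package (a simple sanity check; the reported version will vary):

npm list -g edge-impulse-linux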

2. Connecting to Edge Impulse

With all software set up, connect your camera or microphone to the development kit (see 'Next steps' further down this page if you want to connect a different sensor), and run:

edge-impulse-linux

This will start a wizard that asks you to log in and choose an Edge Impulse project. If you want to switch projects, run the command with --clean.
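For example, to clear the stored login and project selection and run the wizard again:

edge-impulse-linux --clean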

3. Verifying that your device is connected

That's all! Your machine is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.

Device connected to Edge Impulse.

4. Next steps: building a machine learning model

With everything set up, you can now build your first machine learning model with these tutorials:

5. Deploying back to device

To run your impulse locally, run the following on your Linux platform:

edge-impulse-linux-runner

This will automatically compile your model with full hardware acceleration, download the model to the device, and then start classifying. Our Linux SDK has examples of how to integrate the model with your favourite programming language.
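As a sketch of what such an integration can look like, the example below uses the Edge Impulse Linux Python SDK (installed with pip3 install edge_impulse_linux) to classify a single image against a downloaded .eim model file; the runner can export this file for you, for instance with edge-impulse-linux-runner --download modelfile.eim. The model path, image path, and the use of OpenCV to load the image are illustrative assumptions; the SDK's own example scripts are the authoritative reference.

import cv2
from edge_impulse_linux.image import ImageImpulseRunner

MODEL_PATH = 'modelfile.eim'  # .eim file exported by the runner (example path)
IMAGE_PATH = 'test.jpg'       # any test image on the device (example path)

runner = ImageImpulseRunner(MODEL_PATH)
try:
    model_info = runner.init()  # loads the model and returns project/model metadata
    labels = model_info['model_parameters']['labels']

    img = cv2.imread(IMAGE_PATH)
    img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)  # the SDK expects RGB, OpenCV loads BGR

    # Resize/crop to the model's input size, then run inference
    features, cropped = runner.get_features_from_image(img)
    result = runner.classify(features)

    if 'classification' in result['result']:
        for label in labels:
            print('%s: %.2f' % (label, result['result']['classification'][label]))
    elif 'bounding_boxes' in result['result']:
        for bb in result['result']['bounding_boxes']:
            print('%s (%.2f) at x=%d y=%d w=%d h=%d' % (
                bb['label'], bb['value'], bb['x'], bb['y'], bb['width'], bb['height']))
finally:
    runner.stop()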

Image model?

If you have an image model, you can get a peek at what your device sees: make sure you are on the same network as the device, then find the 'Want to see a feed of the camera and live classification in your browser' message in the console. Open that URL in a browser and both the camera feed and the classification results are shown:

Live feed with classification results

Optimized Models for the TDA4VM

Texas Instruments provides a number of models that are optimized to run on the TDA4VM. Those with Edge Impulse support are linked below. Each GitHub repository includes instructions for installing the model into your Edge Impulse project. The original source of these optimized models is the Texas Instruments EdgeAI Model Zoo.
