Texas Instruments SK-AM68A

The SK-AM68A Starter Kit/Evaluation Module (EVM) is based on the AM68x vision SoC, which includes an image signal processor (ISP) supporting up to 480 MP/s, an 8 tera-operations-per-second (TOPS) AI accelerator, two 64-bit Arm® Cortex®-A72 CPUs, and support for H.264/H.265 video encode/decode. The SK-AM68A is an ideal choice for machine vision, traffic monitoring, retail automation, and factory automation.

To take full advantage of the AM68A's AI hardware acceleration, Edge Impulse has integrated the TI Deep Learning (TIDL) library and the AM68A-optimized Texas Instruments EdgeAI Model Zoo for low-to-no-code training and deployment from Edge Impulse Studio.

1. Installing dependencies

First, follow the AM68A Quick Start Guide to install the Linux distribution to the device's SD card.

Edge Impulse supports PSDK 8.06.

To set this device up in Edge Impulse, run the following commands on the SK-AM68A:

npm config set user root && sudo npm install edge-impulse-linux -g --unsafe-perm

2. Connecting to Edge Impulse

With all software set up, connect your camera or microphone to the board (see 'Next steps' further down this page if you want to connect a different sensor), and run:

edge-impulse-linux
This will start a wizard that asks you to log in and choose an Edge Impulse project. If you want to switch projects, run the command with --clean.

3. Verifying that your device is connected

That's all! Your machine is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.

4. Next steps: building a machine learning model

With everything set up you can now build your first machine learning model with these tutorials:

Looking to connect different sensors? Our Linux SDK lets you easily send data from any sensor and any programming language (with examples in Node.js, Python, Go and C++) into Edge Impulse.
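As a minimal sketch of what sending custom sensor data involves, the hypothetical helper below packs raw readings into the Edge Impulse data acquisition JSON structure that the ingestion service accepts (the sensor name, units, and unsigned-payload setup here are illustrative assumptions, not taken from this page):

```python
import json

def build_acquisition_payload(values, interval_ms, sensor_name="accX", units="m/s2"):
    """Pack raw single-axis sensor readings into the Edge Impulse data
    acquisition JSON structure. Multi-axis data would use lists of lists
    and one sensor entry per axis."""
    return {
        "protected": {
            "ver": "v1",
            "alg": "none",        # this sketch skips payload signing
        },
        "signature": "0" * 64,     # placeholder signature
        "payload": {
            "device_type": "SK-AM68A",
            "interval_ms": interval_ms,
            "sensors": [{"name": sensor_name, "units": units}],
            "values": values,
        },
    }

# Example: one second of 100 Hz single-axis data
payload = build_acquisition_payload([0.1 * i for i in range(100)], interval_ms=10)
print(json.dumps(payload["payload"]["sensors"]))
```

The Linux SDK and its per-language examples handle this packaging (and the upload itself) for you; the sketch only shows the shape of the data being sent.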

5. Deploying back to device

To run your impulse locally on your Linux platform, run:

edge-impulse-linux-runner
This will automatically compile your model with full hardware acceleration, download the model to your local machine, and then start classifying. Our Linux SDK has examples on how to integrate the model with your favourite programming language.
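When integrating the model into your own code, you typically post-process the runner's classification output. A minimal sketch, assuming the result dictionary shape used by the Linux Python SDK (`result['result']['classification']` mapping labels to scores); the `top_prediction` helper and the threshold are illustrative, not part of the SDK:

```python
def top_prediction(result, threshold=0.5):
    """Return the (label, score) pair with the highest confidence from a
    classification result, or None if nothing clears the threshold.

    `result` is assumed to look like the Linux SDK's classifier output:
    {"result": {"classification": {"label": score, ...}}}"""
    scores = result.get("result", {}).get("classification", {})
    if not scores:
        return None
    label, score = max(scores.items(), key=lambda kv: kv[1])
    return (label, score) if score >= threshold else None

# Example with a mocked result dictionary
mock = {"result": {"classification": {"noise": 0.08, "machine_on": 0.92}}}
print(top_prediction(mock))  # ('machine_on', 0.92)
```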

Image model?

If you have an image model, you can get a peek at what your device sees: make sure you are on the same network as the device, then find the 'Want to see a feed of the camera and live classification in your browser?' message in the console. Open the URL in a browser to see both the camera feed and the live classification:

Projects that can run on the AM68A!

Some of these projects were first developed for the TDA4VM, but will run on the AM68A as well!

Optimized Models for the AM68A

Texas Instruments provides models that are optimized to run on the AM68A. Those with Edge Impulse support are found in the links below; each GitHub repository has instructions for installation into your Edge Impulse project. The original source of these optimized models is the Texas Instruments EdgeAI Model Zoo.

  • Texas Instruments MobileNetV2+SSDLite - Please contact Edge Impulse Support

  • Texas Instruments RegNetX800MF+FPN+SSDLite - Please contact Edge Impulse Support

  • Texas Instruments YOLOV5 - Please contact Edge Impulse Support


The following guides will work for the SK-AM68A as well:
