The AKD1000-powered PCIe boards can be plugged into a developer’s existing Linux system to unlock capabilities for a wide array of edge AI applications, including Smart City, Smart Health, Smart Home, and Smart Transportation. Linux machines with the AKD1000 are supported by Edge Impulse, so you can sample raw data, build models, and deploy trained embedded machine learning models directly from Edge Impulse Studio to create the next generation of low-power, high-performance ML applications.
To learn more about BrainChip technology please visit BrainChip's website: https://brainchip.com/products/
To enable this device for Edge Impulse deployments you must install the following dependencies on your Linux target that has an Akida PCIe board attached.
Python 3.8: Python 3.8 is required for deployments via the [Edge Impulse CLI](/tools/edge-impulse-for-linux/README.md) or AKD1000 deployment blocks, because the generated binary file relies on specific paths produced by the combination of the Python 3.8 and Python Akida™ Library 2.3.3 installations. Alternatively, if you intend to write your own code with the Python Akida™ Library or the Edge Impulse SDK via the BrainChip MetaTF deployment block option, you may use Python 3.7 - 3.10.
Python Akida™ Library 2.3.3: A Python package for quick and easy model development, testing, simulation, and deployment for BrainChip devices
Akida™ PCIe drivers: This will build and install the driver on your system to communicate with the above AKD1000 reference PCIe board
Edge Impulse Linux: This will enable you to connect your development system directly to Edge Impulse Studio
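As a rough sketch, installing these dependencies typically looks like the following (the driver repository URL and install script name are assumptions here; consult BrainChip's driver README for the authoritative steps):

```bash
# Python Akida™ library (PyPI package name: akida)
pip3 install akida==2.3.3

# Akida™ PCIe driver (assumed repository and install script; see BrainChip's docs)
git clone https://github.com/Brainchip-Inc/akida_dw_edma
cd akida_dw_edma && sudo ./install.sh

# Edge Impulse Linux CLI (requires Node.js)
npm install -g edge-impulse-linux
```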
With all software set up, connect your camera or microphone to your system and run:
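```bash
edge-impulse-linux
```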
This will start a wizard which will ask you to log in and choose an Edge Impulse project. If you want to switch projects, run the command with `--clean`.
That's all! Your machine is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.
After adding data via Data acquisition and starting an Impulse design, you can add a BrainChip Akida™ Learning Block. The types of Learning Blocks visible depend on the type of data collected. Using BrainChip Akida™ Learning Blocks ensures that models generated for deployment will be compatible with BrainChip Akida™ devices.
In the Learning Block of the Impulse design, you can compare the Float, Quantized, and Akida™ versions of a model. If you added a Processing Block to your Impulse design, you will need to generate features before you can train your model. If the project uses a transfer learning block, you may be able to select a base model from BrainChip’s Model zoo to transfer learn from. More models will be available in the future, but if you have a specific request please let us know via the Edge Impulse forums.
To achieve full hardware acceleration, models must be converted from their original format to run on an AKD1000. This can be done by selecting the BrainChip MetaTF Block on the Deployment screen, which generates a .zip file with models that can be used in your application for the AKD1000. The block uses the CNN2SNN toolkit to convert quantized models to SNN models compatible with the AKD1000. You can then develop an application using the Akida™ Python package that calls the Akida™ formatted model found inside the .zip file.
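For illustration, a minimal sketch of such an application might look like this (the model file name and input shape are assumptions; adjust them to match your project):

```python
import akida
import numpy as np

# Load the Akida™ formatted model extracted from the MetaTF .zip (assumed name)
model = akida.Model("model.fbz")

# Map the model onto the first available AKD1000; without a device,
# the akida package runs the model in software simulation
devices = akida.devices()
if devices:
    model.map(devices[0])

# Akida models take uint8 inputs shaped (batch, height, width, channels)
frame = np.random.randint(0, 256, size=(1, 224, 224, 3), dtype=np.uint8)

outputs = model.forward(frame)  # raw output potentials per class
print(outputs.shape)
```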
Alternatively, you can use the AKD1000 Block to generate a pre-built binary that can be used by the Edge Impulse Linux CLI to run on your Linux installation with an AKD1000 Mini PCIe board present.
The output from this Block is an .eim file that, once saved onto the computer containing the AKD1000, can be run with the following command:
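```bash
# point --model-file at the .eim file you saved onto the device
edge-impulse-linux-runner --model-file ./modelfile.eim
```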
Alternatively, you can use the CLI to build, download, and run the model on your x86 or aarch64 device with the following command format:
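```bash
# builds, downloads, and runs the model for the currently selected project;
# add --clean to log in again or switch projects
edge-impulse-linux-runner
```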
The AKD1000 has a unique ability to conduct training on the edge device. This means that new classes can be added to a model, or can completely replace its existing classes. A model must be specifically configured and compiled with MetaTF to access this capability of the AKD1000. To enable the Edge Learning features in Edge Impulse Studio, please follow these steps:
1. Select a BrainChip Akida™ Learning Block in your Impulse design.
2. In the Learning Block settings of the Impulse design, enable Create Edge Learning model under the Akida Edge Learning options.
3. Set the Additional classes and Number of neurons for each class, and train the model. For more information about these parameters, please visit BrainChip's documentation. Note that Edge Learning compatible models require a specific setup for the feature extractor and classification head of the model. You can view how a model is configured by switching to Keras (expert) mode in the Neural Network settings and searching for the "Feature Extractor" and "Build edge learning compatible model" comments in the Keras code.
Once the model is trained you may download the Edge Learning compatible model from either the project's Dashboard or the BrainChip MetaTF Model deployment block.
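As a hedged sketch, on-device learning with the downloaded Edge Learning compatible model might look like the following with the Akida™ Python package (the file name, input shape, and class index are assumptions, and the API usage follows BrainChip's edge learning examples):

```python
import akida
import numpy as np

# Load the Edge Learning compatible model (assumed file name)
model = akida.Model("edge_learning_model.fbz")

# Map onto an attached AKD1000 if present; otherwise runs in simulation
devices = akida.devices()
if devices:
    model.map(devices[0])

# A few labeled samples for a brand-new class (shape and label are assumptions)
samples = np.random.randint(0, 256, size=(4, 224, 224, 3), dtype=np.uint8)
labels = np.full((4,), 3, dtype=np.int32)  # index of the new class

# fit() performs Akida edge learning, updating the model's final layer
model.fit(samples, labels)

# Subsequent inferences can now recognize the new class
print(model.forward(samples[:1]))
```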
A public project with Edge Learning options is available in the Public Projects section of this documentation. To learn more about BrainChip's Edge Learning features and to find examples of its usage please visit BrainChip's documentation for Edge Learning.
We have multiple projects that are available to clone immediately to quickly train and deploy models for the AKD1000.
If you have an image model, you can get a peek at what your device sees: make sure you are on the same network as your device, then find the 'Want to see a feed of the camera and live classification in your browser' message in the console. Open the URL in a browser, and both the camera feed and the classification results are shown.
This issue is mainly related to initialization of the Akida™ NSoC and model, and could be caused by a missing `akida` Python library. Please check if you have the Akida™ Python library installed:
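```bash
pip show akida
```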
Example output:
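```
Name: akida
Version: 2.3.3
...
Location: /home/user/.local/lib/python3.8/site-packages
```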
If you don't have the library (`WARNING: Package(s) not found: akida`), then install it:
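```bash
pip install akida==2.3.3
```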
If you have the library, then check if the EIM artifact is looking for the library in the correct place. First, download your EIM model using Edge Impulse Linux CLI tools:
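```bash
edge-impulse-linux-runner --download modelfile.eim
```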
Then run the EIM model with the `debug` option:
Now check if the `Location` directory from the `pip show akida` command is listed in your `sys.path` output. If not (this usually happens when you are using a Python virtual environment), then export `PYTHONPATH`:
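```bash
# use the Location path reported by `pip show akida` (example path shown)
export PYTHONPATH=$PYTHONPATH:/home/user/.local/lib/python3.8/site-packages
```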
And try to run the model with `edge-impulse-linux-runner` once again.
If the previous step didn't help, try to get additional debug data. With your EIM model downloaded, open one terminal window and do:
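```bash
# run the EIM binary directly; it serves its API over a Unix socket
# (the socket path here is an example)
./modelfile.eim /tmp/akida.sock
```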
Then in another terminal:
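```bash
# assumed invocation: attach the runner to the socket served by the EIM process
edge-impulse-linux-runner --model-file /tmp/akida.sock
```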
This should give you additional info in the first terminal about the possible root of your issue.
This error could mean that your camera is in use by another process. Check that no other application using the camera is open. This error can also occur when a previous attempt to run `edge-impulse-linux-runner` failed with an exception. In that case, check if you have a `gst-launch-1.0` process running. For example:
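```bash
ps aux | grep gst-launch-1.0
# example output (details will differ):
#   user   5615  2.0  1.5 ... gst-launch-1.0 ...
```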
In this case, the first number (here `5615`) is the process ID. Kill the process:
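```bash
# use the PID found in the previous step
kill -9 5615
```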
And try to run the model with `edge-impulse-linux-runner` once again.