BrainChip AKD1000
The AKD1000 Mini PCIe board can be plugged into a developer's existing Linux system to unlock capabilities for a wide array of edge AI applications, including Smart City, Smart Health, Smart Home, and Smart Transportation. Linux machines with the AKD1000 are supported by Edge Impulse, so you can sample raw data, build models, and deploy trained embedded machine learning models directly from Edge Impulse Studio to create the next generation of low-power, high-performance ML applications.
To learn more about BrainChip technology, please visit BrainChip's website: https://brainchip.com
To enable this device for Edge Impulse deployments you must install the dependencies listed at the end of this page (Python 3.8, the Akida™ Python package, the Akida™ PCIe driver, and the Edge Impulse Linux CLI) on your Linux target that has an Akida PCIe board attached.
With all software set up, connect your camera or microphone to your operating system and run:
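A typical invocation looks like this, assuming the Edge Impulse Linux CLI from the dependency list is installed:

```bash
edge-impulse-linux
```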
This will start a wizard which will ask you to log in, and choose an Edge Impulse project. If you want to switch projects, run the command with `--clean`.
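For example:

```bash
edge-impulse-linux --clean
```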
The output from the AKD1000 deployment block is an `.eim` file that, once saved, can be run with the following command:
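For example, with the model file name being a placeholder for wherever you saved the artifact:

```bash
edge-impulse-linux-runner --model-file ./modelfile.eim
```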
We have multiple projects available to clone immediately so you can quickly train and deploy models for the AKD1000.
If you have an image model then you can get a peek of what your device sees by being on the same network as your device, and finding the 'Want to see a feed of the camera and live classification in your browser' message in the console. Open the URL in a browser and both the camera feed and the classification are shown.
This error is mainly related to initialization of the Akida™ NSoC and model, and could be caused by a missing Akida™ Python library (the `akida` package). Check if you have the Akida™ Python library installed:
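Using pip:

```bash
pip show akida
```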
Example output:
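The values below are illustrative and will differ on your machine; the `Location` field is the one to note:

```
Name: akida
Version: 2.2.2
...
Location: /home/user/.local/lib/python3.8/site-packages
```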
If you don't have the library (`WARNING: Package(s) not found: akida`), then install it:
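The documentation above pairs the Akida™ library 2.2.2 with Python 3.8, so pinning that version is a reasonable default:

```bash
pip install akida==2.2.2
```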
If you have the library, then check if the EIM artifact is looking for the library in the correct place. First, download your EIM model using Edge Impulse Linux CLI tools:
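For example, with the output file name being up to you:

```bash
edge-impulse-linux-runner --download modelfile.eim
```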
Then run the EIM model with the debug option:
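A sketch of a direct invocation; the exact spelling of the debug flag accepted by the `.eim` binary is an assumption here:

```bash
chmod +x modelfile.eim     # make sure the artifact is executable
./modelfile.eim --debug    # assumed flag; should print Python startup details, including sys.path
```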
Now check if your `Location` directory from the `pip show akida` command is listed in your `sys.path` output. If not (this usually happens if you are using a Python virtual environment), then export `PYTHONPATH`:
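Use the `Location` path reported by `pip show akida`; the path below is illustrative:

```bash
export PYTHONPATH=/home/user/.local/lib/python3.8/site-packages:$PYTHONPATH
```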
And try to run the model with `edge-impulse-linux-runner` once again.
If the previous step didn't help, try to get additional debug data. With your EIM model downloaded, open one terminal window and do:
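One way to do this is to launch the downloaded `.eim` binary manually on a local socket; the socket path below is arbitrary, and this invocation is an assumption about the artifact's interface:

```bash
chmod +x modelfile.eim
./modelfile.eim /tmp/akida.sock
```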
Then in another terminal:
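Attach the runner to that socket; treat this two-terminal pattern as a sketch:

```bash
edge-impulse-linux-runner --model-file /tmp/akida.sock
```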
This should give you additional information in the first terminal about the possible root cause of your issue.
This error could mean that your camera is in use by another process. Check that you don't have any application open that is using the camera. This error can also occur when a previous attempt to run `edge-impulse-linux-runner` failed with an exception. In that case, check if you have a `gst-launch-1.0` process still running. For example:
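```bash
ps -e | grep gst-launch
# 5615 ?        00:00:11 gst-launch-1.0   <- illustrative output line
```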
In this case, the first number (here `5615`) is the process ID. Kill the process:
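```bash
kill -9 5615   # replace 5615 with the PID from your own ps output
```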
And try to run the model with `edge-impulse-linux-runner` once again.
Python 3.8: required for deployments via the Edge Impulse tooling because the binary file that is generated relies on specific paths generated for the combination of Python 3.8 and Python Akida™ Library 2.2.2 installations. Alternatively, if you intend to write your own code with the Akida™ Python package, you may use Python 3.7 - 3.10.
Akida™ Python package: a Python package for quick and easy model development, testing, simulation, and deployment for BrainChip devices.
Akida™ PCIe driver: this will build and install the driver on your system to communicate with the above AKD1000 reference PCIe board.
Edge Impulse Linux CLI: this will enable you to connect your development system directly to Edge Impulse Studio.
That's all! Your machine is now connected to Edge Impulse. To verify this, go to your Edge Impulse project and click Devices. The device will be listed here.
After adding data via Data Acquisition and starting an Impulse, you can add BrainChip Akida™ Learning Blocks. The types of Learning Blocks visible depend on the type of data collected. Using BrainChip Akida™ Learning Blocks will ensure that models generated for deployment are compatible with BrainChip Akida™ devices.
In the Learning Block of the Impulse Design, one can compare between the Float, Quantized, and Akida™ versions of a model. If you added a Processing Block to your Impulse, you will need to generate features before you can train your model. If the project uses a Transfer Learning Block, you may be able to select a base model to transfer learn from. More models will be available in the future, but if you have a specific request please let us know via the forum.
In order to achieve full hardware acceleration, models must be converted from their original format to run on an AKD1000. This can be done by selecting the BrainChip MetaTF Block from the Deployment screen. This will generate a .zip file with models that can be used in your application for the AKD1000. The block uses BrainChip's CNN2SNN toolkit to convert quantized models to SNN models compatible with the AKD1000. One can then develop an application using the Akida™ Python package that calls the Akida™-formatted model found inside the .zip file.
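As a rough sketch of what such an application can look like; the `model.fbz` file name and the input shape are illustrative assumptions, not the block's actual output names:

```python
import numpy as np
import akida

# Load the Akida-formatted model extracted from the deployment .zip
model = akida.Model("model.fbz")

# Map the model onto an attached AKD1000 device, if one is present;
# otherwise inference falls back to the software simulator
devices = akida.devices()
if devices:
    model.map(devices[0])

# Run inference on one dummy uint8 input frame (shape is an assumption;
# it must match the model's actual input dimensions)
x = np.zeros((1, 224, 224, 3), dtype=np.uint8)
out = model.forward(x)
print(out.shape)
```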
Alternatively, you can use the AKD1000 Block to generate an .eim file that can be used by the Edge Impulse Linux CLI to run on your Linux installation with an AKD1000 Mini PCIe board present.