The Quectel PI-SG565D is an intelligent single-board computer built around the high-performance 8-core 64-bit Qualcomm QCS6490 processor (with AI computing power of up to 12.15 TOPS) and the Qualcomm Adreno™ 643L GPU. It has 8 GB of LPDDR4X memory, is powered over a USB Type-C interface, and supports external eMMC and SSD storage. It also supports 2.4 GHz and 5 GHz Wi-Fi (IEEE 802.11a/b/g/n/ac), Bluetooth 5.0, and dual displays (DP and LCM, or DP and Micro HDMI). With strong performance and rich multimedia features, it meets the high data rate, multimedia, and compute requirements of industrial and consumer applications. The PI-SG565D integrates a wide range of interfaces, which significantly expands its use in the M2M field. It can be widely used in areas such as edge computing, robotics, industrial control, multimedia terminals, digital billboards, intelligent security systems, and industrial-grade PDAs, covering various sectors across the entire AIoT field. The PI-SG565D supports Yocto Linux and Debian operating systems, which can meet the requirements of algorithm prototype verification and inference application development.

Quectel PI-SG565D Single-Board Computer

Getting Started with your Quectel PI-SG565D on Edge Impulse

1. Following the Quick Start Guide for the Quectel PI-SG565D

Please follow the Quick Start Guide provided by Quectel to set up your PI-SG565D board. You will also need a USB webcam to work with images on Edge Impulse.

2. Installing the Edge Impulse Linux CLI

On the device, install the Edge Impulse CLI and other dependencies via:
$ wget https://cdn.edgeimpulse.com/firmware/linux/setup-edge-impulse-qc-linux.sh
$ sh setup-edge-impulse-qc-linux.sh

3. Connecting to Edge Impulse

With all dependencies set up, run:
$ edge-impulse-linux
This will start a wizard which asks you to log in and choose an Edge Impulse project. If you want to switch projects, or use a different camera (e.g. a USB camera), run the command with the --clean argument, as shown below.
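For example, to re-run the wizard and pick a different project or camera:
$ edge-impulse-linux --clean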

4. Verifying that your device is connected

That’s all! Your device is now connected to Edge Impulse. To verify this, go to your Edge Impulse project, and click Devices. The device will be listed here.

PI-SG565D connected to Edge Impulse

Next steps: building a machine learning model

With everything set up, you can now build your first machine learning model with our tutorials. Looking to connect different sensors? Our Linux SDK lets you easily send data from any sensor and any programming language (with examples in Node.js, Python, Go, and C++) into Edge Impulse.

Profiling your models

To profile your models for the Qualcomm QCS6490 processor that is on the PI-SG565D board, follow these steps:
  • Make sure to select the Qualcomm Dragonwing RB3 Gen 2 Development Kit as your target device. You can change the target at the top of the page, next to your user icon. This kit has the same processor as the PI-SG565D board.
  • Head to your Learning block page in Edge Impulse Studio.
  • Click on the Calculate performance button.
To provide on-device performance metrics, we use Qualcomm AI Hub in the background (see the image below), which runs the compiled model on a physical device to gather metrics such as the mapping of model layers to compute units, inference latency, and peak memory usage. See the Qualcomm® AI Hub documentation page for more information.

Qualcomm profiling using Qualcomm AI Hub

Deploying back to device

Using the Edge Impulse Linux CLI

To run your impulse locally on the PI-SG565D, open a terminal and run:
$ edge-impulse-linux-runner
This will automatically compile your model with full hardware acceleration, download the model to your board, and then start classifying (use --clean to switch projects). Alternatively, you can select the Linux (AARCH64 with Qualcomm QNN) option on the Deployment page.

Qualcomm deployment options

This will download an .eim model that you can run on your board with the following command:
$ edge-impulse-linux-runner --model-file downloaded-model.eim

Using the Edge Impulse Linux Inferencing SDKs

Our Linux SDK has examples on how to integrate the .eim model with your favourite programming language.
You can download either the quantized or the float32 version of the model, but the Qualcomm NN accelerator only supports quantized models. If you select the float32 version, the model will run on the CPU.
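As an illustration, the minimal Python sketch below loads a downloaded .eim model with the Linux Python SDK's ImpulseRunner and classifies a single set of features. The model path and feature values are placeholders, and the exact keys returned by the SDK may vary slightly between versions, so treat this as a starting point rather than a complete application:

```python
#!/usr/bin/env python3
# Minimal sketch: run a downloaded .eim model with the Edge Impulse Linux Python SDK.
# Assumes `pip3 install edge_impulse_linux` and that the .eim file path is passed
# as the first argument (the model path and feature values below are placeholders).
import sys
from edge_impulse_linux.runner import ImpulseRunner

model_path = sys.argv[1] if len(sys.argv) > 1 else 'modelfile.eim'
runner = ImpulseRunner(model_path)

try:
    # Load the model and print some basic information about it
    model_info = runner.init()
    print('Loaded model for project:', model_info['project']['name'])

    # Replace with real features (e.g. raw sensor values or image features)
    # matching the input size expected by your impulse
    features = [0.0] * model_info['model_parameters']['input_features_count']

    # Run a single classification and print the raw result
    result = runner.classify(features)
    print('Classification result:', result['result'])
finally:
    # Always stop the runner to release the model process
    runner.stop()
```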

Using the IM SDK GStreamer option

When selecting this option, you will obtain a .zip archive. Instructions are provided in the README.md file included in the archive. See the Qualcomm IM SDK GStreamer pipeline documentation for more information.
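For example, assuming the downloaded archive is named deployment.zip (the actual file name will differ), you could extract it and read the included instructions with:
$ unzip deployment.zip -d imsdk-deployment
$ cat imsdk-deployment/README.md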

Image model?

If you have an image model, you can get a peek at what your device sees: make sure you are on the same network as your device, then find the 'Want to see a feed of the camera and live classification in your browser' message in the console. Open the URL in a browser and both the camera feed and the classification are shown:

Live feed with classification results

Troubleshooting

If you start the CLI and see the following error:
Failed to initialize linux tool
Capture process failed with code 255
You’ll need to restart the camera server via:
$ systemctl restart cam-server