My setup with the Intel Wireless card installed and antennas connected.
RTSP streaming success on the SK-TDA4VM, with a Seeed Studio reComputer 1010 as the RTSP server
USB camera pointing towards the controller and RTSP server running on Seeed Studio reComputer.
Video > Take Snapshot
to take a screenshot of my controller. I used my smartphone to change the temperature in various zones, which caused those zone lights to illuminate red on the controller board. I took about a hundred photos with various lights illuminated (and off). Ideally you would have several hundred images, but this was a good start. Once I had those images collected, it was time to open Edge Impulse Studio and begin labelling the data. I elected to build an object detection model since I wanted to identify the number of zones illuminated (and hence on), so labelling consisted of creating bounding boxes around the illuminated zones (and the power LED).
Ingested images with labels and bounding boxes applied
Training with the TI fork of YOLOX for my model.
Results with the test data looked good.
params.yaml
file that can scale the input:
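As an illustration only, a scaling entry might look like the fragment below. The exact key names and nesting depend on the edgeai-gst-apps release, and the 320x320 size is just an example input resolution, not necessarily what this model uses:

```yaml
# Hypothetical params.yaml fragment: keys vary by edgeai-gst-apps version.
preprocess:
  resize: 320        # scale the shorter side toward the model input
  crop: 320          # center-crop to the network's expected resolution
  data_layout: NCHW  # channel ordering the model was exported with
```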
50 ms inference at ~20 fps over an RTSP stream, not bad!
rtsp_src_example.yaml
, as well as the model contents and configurations. I made some small changes to the post_process.py Python file as well.
Overall, this was a good exercise in learning new development concepts. From getting the RTSP server up and running, to creating custom blocks, to ensuring the model inputs aligned, I came across several obstacles that I didn't anticipate. However, the end result was a solid demonstration of the SK-TDA4VM board paired with an Edge Impulse object detection model.