Object tracking

Object tracking is a postprocessing layer that matches bounding boxes across inference runs, turning raw detections into stable "tracked" objects. This significantly reduces jitter and keeps object labels consistent across frames.


Enterprise preview

This feature is in preview mode and available for Enterprise customers only. Interested in also enabling object counting? Contact your Customer Success Manager (CSM).

Enabling object tracking

To enable Object Tracking:

  1. Go to Administrative zone > Object tracking in your Edge Impulse project.

  2. Enable the Object Tracking feature.


Once enabled, all deployments (Linux, Mobile, EON Compiler, C++ library, WebAssembly, etc.) that contain an Object Detection block will automatically include the object tracking postprocessing. By default, it uses a set of standard parameters (described in detail below) that you can adjust to fit your use case.

Configuring object tracking thresholds


Linux CLI / mobile client

When you have built a model that includes Object Tracking postprocessing, you can dynamically configure tracking thresholds via:

Linux CLI

Use edge-impulse-linux-runner --model-file <model.eim>.

  • The runner’s interactive console (and web UI via http://localhost:PORT) now includes configurable tracking thresholds.

Mobile client

If you’re running your impulse in the Edge Impulse mobile client, you can configure thresholds in the UI as well (preview).

Note

We are actively working on adding these thresholds to the Studio UI. In the meantime, the Linux CLI or mobile client is the easiest way to experiment with threshold settings.

Example threshold parameters

In the Linux runner UI, you will see fields such as:

  • Keep grace: The number of frames a trace is kept alive after its object disappears from view.

  • Max observations: The maximum number of past observations used when matching new detections to a trace.

  • IoU threshold: The minimum Intersection over Union between a detection and an existing trace for them to be considered the same object.

  • Use IoU: Whether to use IoU-based matching.
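To make the IoU threshold concrete, here is a minimal sketch (plain JavaScript, illustrative only, not part of the SDK) of how Intersection over Union is computed for two boxes in x/y/width/height form, and how the threshold decides whether a new detection matches an existing trace:

```javascript
// Intersection over Union for boxes in { x, y, width, height } form
// (top-left origin). Returns a value in [0, 1].
function iou(a, b) {
  const x1 = Math.max(a.x, b.x);
  const y1 = Math.max(a.y, b.y);
  const x2 = Math.min(a.x + a.width, b.x + b.width);
  const y2 = Math.min(a.y + a.height, b.y + b.height);
  const inter = Math.max(0, x2 - x1) * Math.max(0, y2 - y1);
  const union = a.width * a.height + b.width * b.height - inter;
  return union === 0 ? 0 : inter / union;
}

// With an IoU threshold of 0.5, a detection that overlaps an existing
// trace by more than 0.5 is treated as the same object.
const prev = { x: 10, y: 10, width: 40, height: 40 };
const curr = { x: 14, y: 12, width: 40, height: 40 };
console.log(iou(prev, curr) > 0.5); // true: boxes overlap enough to match
```

A higher threshold demands tighter overlap between frames (good for slow-moving objects), while a lower threshold tolerates fast motion at the risk of identity swaps.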

Pass thresholds via command line (Linux)

For Linux EIM deployments, you can now directly pass thresholds to the runner:

edge-impulse-linux-runner \
  --model-file ./model-with-object-tracking.eim \
  --object-tracking-keep-grace 5 \
  --object-tracking-max-observations 5 \
  --object-tracking-iou-threshold 0.5 \
  --object-tracking-use-iou true

Node.js SDK

In the Node.js SDK, there is a new function to set these thresholds at runtime:

// classifier is an instance of EdgeImpulseClassifier
classifier.setLearnBlockThreshold({
  keepGrace: 5,
  maxObservations: 5,
  iouThreshold: 0.5,
  useIoU: true
});

Code deployments (C++ / EON)

For C++ library or EON Compiler deployments, you can configure thresholds in model-parameters/model_variables.h (name may vary based on your project’s generated files). A typical configuration might look like:

const ei_object_tracking_config_t ei_posprocessing_config_9 = {
  1,       /* implementation_version */
  5,       /* keep_grace */
  5,       /* max_observations */
  0.5000f, /* iou_threshold */
  true     /* use_iou */
};

Adjust these parameters to suit your use case (e.g., shorten or extend “keep grace,” or change your IoU threshold to handle occlusion scenarios, etc.).
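As an illustration of how "keep grace" behaves, the sketch below (plain JavaScript, not the SDK's actual implementation) keeps a trace alive for up to keepGrace consecutive frames without a matching detection before closing it:

```javascript
// Illustrative keep-grace bookkeeping: a trace survives up to
// `keepGrace` consecutive frames with no matching detection.
function updateTrace(trace, matchedThisFrame, keepGrace) {
  if (matchedThisFrame) {
    // A fresh match resets the grace counter.
    return { ...trace, missedFrames: 0, open: true };
  }
  const missedFrames = trace.missedFrames + 1;
  // Close the trace once the grace period is exhausted.
  return { ...trace, missedFrames, open: missedFrames <= keepGrace };
}

let trace = { id: 1, missedFrames: 0, open: true };
// Object disappears for 6 consecutive frames with keepGrace = 5:
for (let frame = 0; frame < 6; frame++) {
  trace = updateTrace(trace, false, 5);
}
console.log(trace.open); // false: grace period of 5 frames exceeded
```

With keep_grace set to 5 as in the configuration above, an object briefly hidden behind an occluder for a few frames keeps its trace ID; only after five consecutive missed frames is the trace closed.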

Comparing object tracking vs. standard object detection

A simple way to see the difference between raw bounding boxes and tracked bounding boxes:


Terminal 1:

PORT=9200 edge-impulse-linux-runner --model-file ./model-with-object-tracking.eim

Terminal 2:

PORT=9201 edge-impulse-linux-runner --model-file ./model-without-object-tracking.eim

Open http://localhost:9200 and http://localhost:9201 in two separate browser windows and observe the difference in bounding box stability. You’ll see smoother, more persistent bounding boxes with object tracking enabled.

Accessing tracked objects in the inference output

C++ libraries

ei_impulse_result_t result;

// ... run inference ...

for (uint32_t i = 0; i < result.postprocessed_output.object_tracking_output.open_traces_count; i++) {
  ei_object_tracking_trace_t trace = result.postprocessed_output.object_tracking_output.open_traces[i];

  // Access tracked object data:
  // trace.id, trace.label, trace.x, trace.y, trace.width, trace.height
}

WebAssembly

let result = classifier.classify(/* frame or image data */);
console.log(result.object_tracking_results);

EIM files

When reading inference metadata from an EIM file, look under the object_tracking field to retrieve tracked objects.
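Assuming the tracked objects are exposed as an array of traces with the same fields as the C++ struct above (id, label, x, y, width, height — the exact shape may differ between deployment targets), a consumer could group open traces by label like this:

```javascript
// Group tracked objects by label. The trace shape here (id, label,
// x, y, width, height) mirrors the C++ struct above and is an
// assumption, not a guarantee, for every deployment target.
function groupTracesByLabel(traces) {
  const groups = {};
  for (const trace of traces) {
    (groups[trace.label] ??= []).push(trace.id);
  }
  return groups;
}

const traces = [
  { id: 1, label: 'person', x: 10, y: 20, width: 50, height: 80 },
  { id: 2, label: 'person', x: 120, y: 30, width: 45, height: 90 },
  { id: 3, label: 'car', x: 200, y: 40, width: 100, height: 60 },
];
console.log(groupTracesByLabel(traces)); // person → traces 1 and 2; car → trace 3
```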

Advanced usage

Looking for a more complex example? Check out the Model Cascade approach, which chains an object tracking model together with an LLM (e.g., GPT-4).

Troubleshooting

If you encounter any issues with object tracking, please reach out to your solutions engineer for assistance.
