
*Android Studio, object tracking and detection: live debugging as demonstrated in the QNN example repository*
# Quick Start
## Tutorials in this series (more coming soon!)

- **Static Buffer Inference**: Run inference with test data (15 min)
- **Camera Inference**: Object detection and anomaly detection (30 min)
- **Keyword Spotting**: Audio classification with microphone (30 min)
- **WearOS Motion**: IMU-based motion classification (45 min)
- **QNN Acceleration**: Qualcomm NPU hardware acceleration (1 hour)
## Prerequisites
- Edge Impulse Account: Sign up
- Trained Model: Complete a tutorial first
- Android Studio: Download (Ladybug 2024.2.2 or later)
- Tools: Android API 35, NDK 27.0.12077973, CMake 3.22.1
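If you manage SDK components from the command line rather than through Android Studio's SDK Manager UI, the listed tool versions can be installed with `sdkmanager`. This is a sketch assuming the Android command-line tools are already on your `PATH`; package names follow the standard `sdkmanager` syntax, and you can verify the exact identifiers with `sdkmanager --list`.

```shell
# Install the platform, NDK, and CMake versions listed above.
sdkmanager "platforms;android-35" \
           "ndk;27.0.12077973" \
           "cmake;3.22.1"
```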
## Tested Model Types
- Vision: FOMO, Object Detection, Visual Anomaly Detection
- Audio: Keyword Spotting (KWS)
- Motion: Accelerometer, Gyroscope classification
## Common Workflow

All tutorials follow this pattern:

- Export the model from Studio → Deployment → Android (C++ library)
- Download the TFLite libraries
- Copy the model files to `app/src/main/cpp/` (skip CMakeLists.txt)
- Update the test features in `native-lib.cpp`
- Build and run in Android Studio
## Platform Support
| Architecture | Status | Notes |
|---|---|---|
| arm64-v8a (64-bit) | Recommended | All modern devices |
| armeabi-v7a (32-bit) | Requires config | Older devices, see 32-bit steps in the android examples README |
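To restrict the build to the recommended ABI, you can set `abiFilters` in the module-level Gradle file. This is a minimal sketch using the standard Android Gradle plugin Kotlin DSL; it is not copied from the example apps, so check the repository's own Gradle files for the authoritative configuration.

```kotlin
// app/build.gradle.kts (module level) -- sketch only.
android {
    defaultConfig {
        ndk {
            // Build 64-bit by default; add "armeabi-v7a" for older
            // 32-bit devices per the android examples README.
            abiFilters += listOf("arm64-v8a")
        }
    }
}
```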
## Performance Optimization
- XNNPACK: CPU acceleration (included by default)
- QNN: Qualcomm NPU acceleration (see QNN tutorial)