Deploy Edge Impulse models on Android using the Android NDK and TensorFlow Lite. This series covers everything from basic inference to application-focused examples for data collection with camera, audio, and motion sensors.

[Screenshot: Android Studio live debugging of object tracking and detection, as demonstrated in the QNN example repository]

Quick Start

git clone https://github.com/edgeimpulse/example-android-inferencing.git
cd example-android-inferencing

Repositories

Tutorials in this series (more coming soon!)

Prerequisites

  • Edge Impulse Account: Sign up
  • Trained Model: Complete a tutorial first
  • Android Studio: Download (Ladybug 2024.2.2 or later)
  • Tools: Android API 35, NDK 27.0.12077973, CMake 3.22.1

Tested Model Types

  • Vision: FOMO, Object Detection, Visual Anomaly Detection
  • Audio: Keyword Spotting (KWS)
  • Motion: Accelerometer, Gyroscope classification

Common Workflow

All tutorials follow this pattern:
  1. Export model from Studio → Deployment → Android (C++ library)
  2. Download TFLite libraries:
    cd app/src/main/cpp/tflite
    sh download_tflite_libs.sh  # or .bat for Windows
    
  3. Copy model files to app/src/main/cpp/ (skip CMakeLists.txt)
  4. Update test features in native-lib.cpp
  5. Build and run in Android Studio
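Step 4 above means replacing the placeholder feature array in native-lib.cpp with raw features copied from Studio's Live classification tab. The sketch below illustrates the callback pattern the classifier uses to read those features, with a simplified stand-in for the SDK's `signal_t` (the real type and `run_classifier()` come from `ei_run_classifier.h` in the exported C++ library; the helper `drain_signal()` is only for illustration):

```cpp
#include <algorithm>
#include <cstddef>
#include <cstring>
#include <vector>

// Simplified stand-in for the Edge Impulse SDK's signal_t: the classifier
// pulls raw features through a callback rather than one contiguous buffer.
struct signal_t {
    size_t total_length;
    int (*get_data)(size_t offset, size_t length, float *out_ptr);
};

// Test features: in the real native-lib.cpp, paste the raw features from
// Studio's Live classification tab into this array.
static const float features[] = { 0.1f, 0.2f, 0.3f, 0.4f, 0.5f, 0.6f };

// Callback the classifier invokes to read a window of features.
int raw_feature_get_data(size_t offset, size_t length, float *out_ptr) {
    std::memcpy(out_ptr, features + offset, length * sizeof(float));
    return 0;
}

// Illustration only: drain the signal in fixed-size chunks, the way the
// classifier consumes it internally.
std::vector<float> drain_signal(const signal_t &signal, size_t chunk) {
    std::vector<float> buf(signal.total_length);
    for (size_t off = 0; off < signal.total_length; off += chunk) {
        size_t len = std::min(chunk, signal.total_length - off);
        signal.get_data(off, len, buf.data() + off);
    }
    return buf;
}
```

In the real project you wire the callback into a `signal_t`, set `total_length` to `EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE`, and pass it to `run_classifier()`; the feature count must match your impulse's input frame size or classification fails.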

Platform Support

Architecture          | Status          | Notes
arm64-v8a (64-bit)    | Recommended     | All modern devices
armeabi-v7a (32-bit)  | Requires config | Older devices; see the 32-bit steps in the Android examples README
Android Versions: Minimum API 24, Target API 35

Performance Optimization

  • XNNPACK: CPU acceleration (included by default)
  • QNN: Qualcomm NPU acceleration (see QNN tutorial)
