What You’ll Build
A basic Android app that:
- Loads a C++ model via Android NDK
- Runs inference on static test features
- Displays classification results
Difficulty: Beginner
Prerequisites
- Trained Edge Impulse model
- Android Studio with NDK and CMake installed
- Basic familiarity with Android development
Step 1: Clone the Repository
Step 2: Download TensorFlow Lite Libraries
Step 3: Export Your Model
- In Edge Impulse Studio, go to Deployment
- Select Android (C++ library)
- Click Build and download the .zip
Step 4: Integrate the Model
- Extract the downloaded .zip file
- Copy all files except CMakeLists.txt to app/src/main/cpp/ (see the expected layout below)
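After copying, app/src/main/cpp/ should look roughly like this (the three SDK folders come from the downloaded .zip; CMakeLists.txt and native-lib.cpp are the ones already in the project):

```
app/src/main/cpp/
├── CMakeLists.txt        (keep the existing one)
├── native-lib.cpp
├── edge-impulse-sdk/
├── model-parameters/
└── tflite-model/
```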
Step 5: Add Test Features
- In Studio, go to Model testing
- Click on a test sample
- Copy the raw features
- Paste them into native-lib.cpp, as sketched below
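For illustration, the raw features can live in a static array in native-lib.cpp; the numbers below are placeholders for the values copied from Studio:

```cpp
// native-lib.cpp
// Placeholder values only -- replace with the raw features copied from
// the Model testing page in Edge Impulse Studio.
static const float features[] = {
    -0.1953f, 9.8077f, 1.1260f /* ... rest of the copied features ... */
};
```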
Step 6: Build and Run
- Open the project in Android Studio
- Build → Make Project
- Run on a device or emulator
Understanding the Code
Native Inference (C++)
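A rough sketch of the native side (helper names like classify are illustrative, the SDK calls are the standard Edge Impulse C++ API): the features array from Step 5 is wrapped in a signal and passed to run_classifier.

```cpp
#include <cstring>
#include <string>
#include "edge-impulse-sdk/classifier/ei_run_classifier.h"

// 'features' is the static array filled in Step 5.
// The SDK pulls data through this callback instead of taking the array directly.
static int get_feature_data(size_t offset, size_t length, float *out_ptr) {
    memcpy(out_ptr, features + offset, length * sizeof(float));
    return 0;
}

static std::string classify() {
    signal_t signal;
    signal.total_length = sizeof(features) / sizeof(features[0]);
    signal.get_data = &get_feature_data;

    ei_impulse_result_t result = { 0 };
    EI_IMPULSE_ERROR err = run_classifier(&signal, &result, false /* debug */);
    if (err != EI_IMPULSE_OK) {
        return "run_classifier failed: " + std::to_string(err);
    }

    // Collect label/score pairs for display in the app.
    std::string out;
    for (size_t ix = 0; ix < EI_CLASSIFIER_LABEL_COUNT; ix++) {
        out += std::string(result.classification[ix].label) + ": "
             + std::to_string(result.classification[ix].value) + "\n";
    }
    return out;
}
```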
Java/Kotlin Bridge
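The bridge itself is a JNI function exported from native-lib.cpp. A minimal sketch, assuming a hypothetical package com.example.testcpp: the Kotlin side calls System.loadLibrary("test_cpp") and declares a matching external fun runInference(): String, which resolves to this symbol.

```cpp
#include <jni.h>
#include <string>

std::string classify();  // the native inference sketch above

// JNI entry point called from Kotlin/Java. The function name encodes the
// package (com.example.testcpp) and class (MainActivity); adjust it to match
// your own package, otherwise the lookup fails at runtime.
extern "C" JNIEXPORT jstring JNICALL
Java_com_example_testcpp_MainActivity_runInference(JNIEnv *env, jobject /* this */) {
    std::string result = classify();
    return env->NewStringUTF(result.c_str());
}
```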
Troubleshooting
Build fails with 'undefined reference to run_classifier'
Cause: Model files not copied correctly
Solution:
- Ensure all folders (edge-impulse-sdk, model-parameters, tflite-model) are in app/src/main/cpp/
- Don’t replace the existing CMakeLists.txt
App crashes on launch
Cause: Native library not loaded
Solution:
- Verify System.loadLibrary("test_cpp") matches your library name
- Check the Build output for compilation errors
Wrong classification results
Cause: Test features don’t match model input
Solution:
- Copy features from Studio’s Model Testing page
- Ensure feature count matches model input size
- Check feature order (x, y, z for accelerometer, etc.)
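One way to verify the count is a compile-time check against the input frame size exported by the SDK (a sketch, assuming the static features array from Step 5):

```cpp
#include "edge-impulse-sdk/classifier/ei_run_classifier.h"

// Fails the build if the pasted feature count doesn't match what the model expects.
static_assert(sizeof(features) / sizeof(features[0]) == EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE,
              "Feature count does not match the model's expected input size");
```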