Arduino - fire and no fire
Arduino TinyML Kit
Project flow
fire and safe_environment. That being said, it sounded like an easy task.
First, I created an Arduino sketch for the Arduino Nano 33 BLE Sense. The code records the room temperature using the onboard HTS221 temperature sensor and prints it via UART. It then captures an image using the OV7675 camera module. While working with the OV767X library, I realized that capturing an image takes a very long time, so I instead modified the nano_33ble_sense_camera.ino camera example from Edge Impulse’s Arduino library deployment. Edge Impulse’s camera code for the OV7675 includes a custom driver that fetches image data from the camera much faster. After an image has been captured, it is encoded to base64. For this, I utilized Edge Impulse’s open-source Arduino Nano 33 BLE Sense firmware, reusing parts of its take_snapshot function to take a snapshot, encode it as base64, and print it to UART. With this code, the Arduino Nano 33 BLE Sense continuously samples a snapshot and a temperature value and prints them via UART (Serial). Note that sending Arduino Strings over Serial is generally discouraged because String objects can fragment the heap, but for this project I worked with Strings anyway. The image width and height can be controlled with the WIDTH and HEIGHT variables respectively; the default image size is 240x240 pixels. Note that increasing the image dimensions increases both the time the Arduino board takes to capture an image and the time the Python script needs to decode the base64 data and save it as a .JPG image.
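The sample the sketch emits over UART can be illustrated with a host-side Python mock (this is not the Arduino code itself; the exact line format, with the temperature followed by the base64 payload, is an assumption for illustration):

```python
import base64

def frame_sample(temperature_c: float, image_bytes: bytes) -> str:
    """Mimic the kind of UART line the data-collection sketch prints:
    a temperature reading followed by the base64-encoded image bytes.
    The comma delimiter and field order are assumptions for illustration."""
    b64 = base64.b64encode(image_bytes).decode("ascii")
    return f"{temperature_c:.2f},{b64}"

# A tiny fake "image" stands in for the OV7675 frame buffer.
line = frame_sample(24.51, b"\x10\x20\x30")
```

Base64 inflates the payload by about a third, which is part of why larger image dimensions slow down the whole capture-and-transfer loop.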
Arduino Nano 33 BLE Sense data collection logs
number_of_samples_to_collect.
To use the Arduino code and Python scripts to create a dataset, we first upload the Arduino code to an Arduino Nano 33 BLE Sense. Once the code is uploaded, we identify the COM port of the board and update the SERIAL_PORT variable accordingly in the Python script. Install the Python libraries on your computer with pip install -r requirements.txt and finally run the Python script with python read_and_save_serial_data.py. The script automatically processes the serial data, saves photos as .JPG files, and stores the temperature values in a .CSV file. The images are numbered sequentially, and each file name is written to the .CSV file in the same row as the temperature recorded at the moment that photo was taken.
Since fire is dangerous and difficult to control, I used an oven and a candle to collect data. The oven generates temperatures higher than ambient room temperature, which can be detected by the temperature sensor, while the candle gives a flame that can be detected optically by the camera; the two sensors therefore complement each other. I secured the Arduino TinyML Kit on a tripod stand and pointed it at the oven. For the safe environment (the safe_environment class), the oven was switched off and the candle was not lit. In total, I collected 60 images and 60 temperature values ranging between 23 and 27 degrees Celsius. The images below show how the Arduino board was placed next to the oven, an image that was captured, and the .CSV file with the temperature values and the class label.
Arduino board sampling safe environment data
Safe image and temperature values
dataset_class variable to "fire", which makes the Python script save the images and .CSV file to a folder named fire. Since the HTS221 is only guaranteed to operate over a temperature range of -40 to +120 degrees Celsius, I did not put the Arduino board inside the oven, to prevent overheating and damaging it. Placed next to the oven, on the oven door, the board recorded temperatures of 60 to 70 degrees Celsius.
Arduino board sampling environment data
Fire image and temperature values
fire class. I replaced half of the images and temperature values with the ones obtained for the safe_environment class, so that the model could better learn the relationship between the two inputs: a flame may be visible while the recorded temperature is as low as 20 degrees (the temperature sensor may simply not be within a good range of the fire), and conversely no flame may be visible while the temperature is as high as 70 degrees (the camera misses the flame, but the heat can still be felt). In either case the environment should still be classified as fire.
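The swap described above can be sketched in Python (a hedged illustration; the actual files and data structures in the project differ, and the sample tuples and function name here are assumptions):

```python
import random

def mix_modalities(fire_samples, safe_samples, fraction=0.5, seed=42):
    """For a fraction of the fire-class samples, replace one modality
    (image or temperature) with a value drawn from the safe_environment
    class, so the model learns that a single 'fire-like' modality is
    enough to indicate fire. Samples are (image_id, temperature) tuples;
    this structure is assumed for illustration."""
    rng = random.Random(seed)
    n_swap = int(len(fire_samples) * fraction)
    mixed = []
    for i, (img, temp) in enumerate(fire_samples):
        if i < n_swap:
            safe_img, safe_temp = rng.choice(safe_samples)
            if i % 2 == 0:
                mixed.append((safe_img, temp))   # no visible flame, high temperature
            else:
                mixed.append((img, safe_temp))   # visible flame, room temperature
        else:
            mixed.append((img, temp))            # both modalities indicate fire
    return mixed
```

All the resulting samples keep the fire label; only the pairing of modalities changes.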
git clone command) to load a custom dataset folder on your Google Drive.
Google Colab load dataset
fire and safe_environment folders to a pandas data frame, defines the image parameters, and loads the images.
Google Colab define variables
Arduino IDE flash overflow
Google Colab model architecture
Tensor slicing
ei.API_KEY variable in the notebook.
Edge Impulse API keys
safe_environment class is in a safe_environment_test folder, and the test data for the fire class is in a fire_test folder.
After training, the model achieved a validation accuracy of 100%. However, this does not imply that the model is perfect! The features the model classifies are simple, only 50 epochs were used, and the dataset had just 120 images and 120 temperature values for training. To improve the model, we could add more data, update the model architecture, and increase the number of training cycles. For this demonstration, however, I considered this acceptable.
Model testing
Model performance
Edge Impulse Dashboard
fire, safe_environment. Click “Save model” to finish the configuration.
Edge Impulse upload model
input_features.txt, which can be seen in the notebook files. We can copy the contents of the text file and paste them into the Edge Impulse project to test our model on the platform. In the screenshot below, the model classified the features as belonging to the fire class, which was the correct classification.
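Generating such a comma-separated feature string from a sample could be done along these lines (a hedged sketch; the function name, exact scaling, and formatting are assumptions, though the layout of pixels first and the temperature last matches the single input tensor described here):

```python
def features_to_text(image_pixels, temperature, precision=4):
    """Join the flattened image values and the temperature reading into
    one comma-separated feature string that can be pasted into Edge
    Impulse's model-testing page. Pixel scaling and number formatting
    are assumptions for illustration."""
    values = list(image_pixels) + [temperature]
    return ", ".join(f"{v:.{precision}f}" for v in values)
```

The same flat layout is what the on-device classifier receives, so the pasted features exercise the exact input path used at inference time.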
Edge Impulse model testing
Edge Impulse model optimizations
EI_CAMERA_RAW_FRAME_BUFFER_COLS, EI_CAMERA_RAW_FRAME_BUFFER_ROWS, WIDTH, and HEIGHT have the same image dimensions as those used in model training. Finally, we can upload the inference sketch to the Arduino Nano 33 BLE Sense. Once the code is uploaded, the board records the room temperature, captures an image, and then classifies whether the environment is safe or has a fire. The inference sketch operates much like the data collection code; the main difference is that the data are not printed to Serial. The inference sketch is also built from the nano_33ble_sense_camera.ino example code. I updated it to also read the temperature value, and in the ei_camera_cutout_get_data function the temperature value is appended to the buffer that is afterwards passed to the classifier, which in this case is our multi-input model.
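This buffer layout, image pixels followed by one appended temperature value, means the model must slice its single input tensor back into its two parts before the image and temperature branches. A NumPy sketch of that slicing (shapes, channel count, and function name are assumptions for illustration):

```python
import numpy as np

def split_buffer(buffer, width, height, channels=1):
    """Slice a flat inference buffer of image pixels followed by one
    temperature value back into the two model inputs, mirroring the
    tensor-slicing step used for the multi-input model."""
    n_pixels = width * height * channels
    assert buffer.shape[0] == n_pixels + 1, "expected pixels + 1 temperature value"
    image = buffer[:n_pixels].reshape(height, width, channels)
    temperature = buffer[n_pixels:]  # shape (1,): the appended reading
    return image, temperature
```

Because the slicing lives inside the model graph, the Arduino-side code only has to fill one contiguous buffer, exactly as ei_camera_cutout_get_data does.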
Arduino inference logs
fire class and the onboard RGB LED turned red.
Inference demo on oven
Inference safe environment
fire class and the Arduino’s onboard RGB LED turned red.
Inference fire environment