

Food Irradiation Dose Detection - DFRobot Beetle ESP32C3

Building a food irradiation detection device using a DFRobot ESP32, Geiger Counter, and visible light sensor.




Created By: Kutluhan Aktar

Public Project Link:

Description

Even though food irradiation improves food hygiene, reduces spoilage, and extends shelf life, it should be strictly regulated to avoid health risks and losses in nutritional value. However, small businesses in the food industry lack a budget-friendly and simple way to detect food irradiation doses after treating food with ionizing energy, especially for animal (livestock) feed. Therefore, I decided to build an AI-driven IoT device that predicts food irradiation doses based on weight, color (visible light), and emitted ionizing radiation.

Ionizing radiation is a nonthermal process utilized to achieve the preservation of food. At a maximum commercial irradiation dose of 10 kGy, irradiation does not impart heat to the food, and the nutritional quality of the food is generally unaffected. The irradiation process can reduce the microbial contamination of food, resulting in improved microbial safety as well as an extended shelf-life[^1]. Irradiation also benefits the consumer by reducing the risk of severe health issues caused by foodborne illnesses. Food irradiation has three categories: low-dose (radurization), medium-dose (radicidation), and high-dose (radappertization). Low-dose irradiation (under 1 kGy) inhibits the sprouting of produce (onion, potato, and garlic), retards the ripening and fungal deterioration of fruits and vegetables (strawberry, tomato, etc.), and disinfests cereals and vegetables of insects. Medium-dose irradiation (between 1 and 10 kGy) controls the presence of pathogenic organisms, especially in fruit juices; retards the deterioration of fish and fresh meat; and reduces Salmonella in poultry products, similar to pasteurization. High-dose irradiation (over 10 kGy) is mainly used for the sterilization of health and personal hygiene products[^2].

Since foods treated with ionizing radiation should be adequately labeled under the general labeling requirements, consumers can make a free, informed choice between irradiated and non-irradiated food. Unfortunately, however, some countries do not apply strict regulations for irradiated foods, especially for animal feed. Detecting proper irradiation doses can therefore be arduous for small businesses in the food industry, since governments do not incentivize strictly regulated food irradiation processes. Because irradiation can cause alterations that modify the chemical composition and nutritive value of food, depending on factors such as irradiation dose, food composition, packaging, and processing conditions such as temperature and atmospheric oxygen saturation[^2], unsupervised food irradiation poses health risks.

After scrutinizing recent research papers on food irradiation, I decided to utilize ionizing radiation, weight, and visible light (color) measurements that indicate the applied irradiation dose so as to create a budget-friendly and accessible device to predict food irradiation dose levels, in the hope of assisting small businesses in checking compliance with existing regulations on food irradiation.

Although ionizing radiation, weight, and visible light (color) measurements provide insight into detecting food irradiation doses, it is not possible to interpret food irradiation doses precisely from such limited data without applying complex algorithms, since food irradiation dose levels fluctuate depending on processing techniques, food characteristics, and equipment. Therefore, I decided to build and train an artificial neural network model, utilizing theoretically assigned food irradiation dose classes, to predict food irradiation dose levels based on ionizing radiation, weight, and visible light (color) measurements. Since I could not apply ionizing radiation directly to foods by emitting Gamma rays, X-rays, or electron beams, I exposed foods to sun rays as a natural source of radiation for estimated periods.

Since Beetle ESP32-C3 is an ultra-small development board intended for IoT applications that can easily collect data and, once trained, run my neural network model to predict food irradiation doses, I decided to employ it in this project. To obtain the required measurements to train my model, I utilized a Geiger counter module (Gravity), an I2C weight sensor (Gravity), and an AS7341 11-channel visible light sensor (Gravity). Since Beetle ESP32-C3 is equipped with an expansion board providing the GDI display interface, I connected an SSD1309 OLED transparent screen (Fermion) to display the collected data.

After collecting data successfully, I developed a PHP web application that obtains the transmitted data from Beetle ESP32-C3 via HTTP GET requests, logs the received measurements in a given MySQL database table, and lets the user create appropriately formatted samples for Edge Impulse.
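
For illustration, a request handled by get_data.php would look roughly like the following (hypothetical measurement values; the server IP, application path, and parameter names match the code shown in the later steps):

http://192.168.1.20/food_irradiation_data_logger/get_data.php?weight=158.77&F1=25&F2=76&F3=57&F4=80&F5=189&F6=285&F7=465&F8=363&CPM=80&nSv=531.25&uSv=0.53&class=0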

After completing my data set and creating samples, I built my artificial neural network (ANN) model with Edge Impulse to make predictions on food irradiation dose levels (classes) based on ionizing radiation, weight, and visible light (color) measurements. Since Edge Impulse is compatible with nearly all microcontrollers and development boards, I did not encounter any issues while uploading and running my model on Beetle ESP32-C3. As labels, I employed the theoretically assigned food irradiation dose classes for each data record while collecting and logging data:

  • Regulated

  • Unsafe

  • Hazardous

After training and testing my neural network model, I deployed and uploaded it to Beetle ESP32-C3. Therefore, the device is capable of detecting food irradiation dose levels (classes) precisely by running the model independently, without any additional procedures.

Lastly, to make the device as robust and compact as possible while experimenting with a motley collection of foods, I designed a Hulk-inspired structure with a movable visible light sensor handle (3D printable).

So, this is my project in a nutshell 😃

In the following steps, you can find more detailed information on coding, logging data via a web application, building a neural network model with Edge Impulse, and running it on Beetle ESP32-C3.

Step 1: Designing and printing a Hulk-inspired structure

Since this project is for detecting irradiation doses of foods treated with ionizing radiation, I got inspired by the most prominent fictional Gamma radiation expert, Bruce Banner (aka The Incredible Hulk), to design a unique structure so as to create a robust and compact device that operates flawlessly while collecting data from foods. To collect data with the visible light sensor at different angles, I added a movable handle to the structure, including a slot and a hook for hanging the sensor.

I designed the structure and its movable handle in Autodesk Fusion 360. You can download their STL files below.

For the Hulk replica affixed to the top of the structure, I utilized this model from Thingiverse:

Then, I sliced all 3D models (STL files) in Ultimaker Cura.

Since I wanted to create a solid structure for this device with a movable handle and complement the Hulk theme gloriously, I utilized these PLA filaments:

  • eMarble Natural

  • Peak Green

Finally, I printed all parts (models) with my Creality CR-200B 3D Printer. It is my first fully-enclosed FDM 3D printer, and I must say that I got excellent prints effortlessly with the CR-200B :)

If you are a maker planning to print your 3D models to create more complex projects, I highly recommend the CR-200B. Since the CR-200B is fully-enclosed, you can print high-resolution 3D models with PLA and ABS filaments. Also, it has a smart filament runout sensor and the resume printing option for power failures.

According to my experience, there are only two downsides of the CR-200B: a relatively small build size (200 x 200 x 200 mm) and manual leveling. That said, thanks to the large leveling nuts and assisted leveling, I was able to level the bed and start printing my first model in less than 30 minutes.

Step 1.1: Assembling the structure and making connections & adjustments

// Connections
// Beetle ESP32-C3 : 
//                                Gravity: Geiger Counter Module
// D5   --------------------------- D
// VCC  --------------------------- +
// GND  --------------------------- -
//                                Gravity: I2C 1Kg Weight Sensor Kit - HX711
// VCC  --------------------------- VCC
// GND  --------------------------- GND
// D9   --------------------------- SCL
// D8   --------------------------- SDA
//                                Fermion: 1.51” SSD1309 OLED Transparent Display
// D4   --------------------------- SCLK
// D6   --------------------------- MOSI
// D7   --------------------------- CS
// D2   --------------------------- RES
// D1   --------------------------- DC
//                                AS7341 11-Channel Spectral Color Sensor
// VCC  --------------------------- +
// GND  --------------------------- -
// D9   --------------------------- C
// D8   --------------------------- D
//                                Control Button (A)
// D0   --------------------------- +
//                                Control Button (B)
// D20  --------------------------- +
//                                Control Button (C)
// D21  --------------------------- +

Since the Geiger counter library counts pulses through an external interrupt, the Geiger counter module can only be connected to a pin that supports external interrupts. Conveniently, Beetle ESP32-C3 allows the user to define any GPIO pin as an external interrupt.
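
The DFRobot_Geiger library registers this interrupt internally; the snippet below is only a minimal, hypothetical sketch (assumed ISR and pin names) illustrating how any GPIO on the ESP32-C3 can be attached as an external interrupt with the Arduino core:

// Hypothetical example: count falling edges on GPIO 5 (the Geiger counter's signal pin).
volatile uint32_t pulse_count = 0;

void IRAM_ATTR on_pulse(){ pulse_count++; }

void setup(){
  pinMode(5, INPUT_PULLUP);
  // On the ESP32-C3 Arduino core, any GPIO can be used as an external interrupt source.
  attachInterrupt(digitalPinToInterrupt(5), on_pulse, FALLING);
}

void loop(){}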

To assign labels while transmitting the collected data and run my neural network model effortlessly, I added three control buttons (6x6), as shown in the schematic below.

After completing sensor connections and adjustments on breadboards successfully, I made the breadboard connection points rigid by utilizing a hot glue gun.

After printing all parts (models), I fastened all components except the visible light sensor to their corresponding slots on the structure via the hot glue gun.

Then, I attached the visible light sensor to the movable handle and hung it via its slot in the structure.

Finally, I affixed the Hulk replica to the top of the structure via the hot glue gun.

Step 2: Developing a web application in PHP to collate data on food irradiation doses

To be able to log and process data packets transmitted by Beetle ESP32-C3, I decided to develop a web application in PHP named food_irradiation_data_logger.

As shown below, the web application consists of two folders and five files:

  • /assets

    • class.php

    • icon.png

    • index.css

  • /data

  • get_data.php

  • index.php

I also employed the web application to scale (normalize) and preprocess my data set so as to create appropriately formatted samples for Edge Impulse.

If the data type is not time series, Edge Impulse requires each sample to be uploaded as a CSV file with a header row indicating the data fields. Since Edge Impulse can infer the uploaded sample's label from its file name, the application reads the given data set in the MySQL database and generates a CSV file (sample) for each data record, named after the assigned food irradiation dose class. Also, the application utilizes the unique row number under the id data field as the sample number to identify each generated CSV file (an example of a generated sample is shown after the list below):

  • Regulated.training.sample_101.csv

  • Unsafe.training.sample_542.csv

  • Hazardous.training.sample_152.csv
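
As an illustration, a generated sample contains only a header row and one row of scaled values. Using the example data record shown later in the class.php section (its pairing with the Regulated class and this sample number is hypothetical), Regulated.training.sample_101.csv would contain:

weight,f1,f2,f3,f4,f5,f6,f7,f8,cpm,nsv,usv
15.877,0.25,0.76,0.57,0.8,1.89,2.85,4.65,3.63,0.8,5.31,0.53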

You can download and inspect the web application in the ZIP file format below.

📁 class.php

In the class.php file, in order to run all functions successfully, I created two classes named _main and sample: the latter inherits from the former.

	public function __init__($conn, $table){
		$this->conn = $conn;
		$this->table = $table;
	}
	public function insert_new_data($d1, $d2, $d3, $d4, $d5, $d6, $d7, $d8, $d9, $d10, $d11, $d12, $c){
		$sql = "INSERT INTO `$this->table`(`weight`, `f1`, `f2`, `f3`, `f4`, `f5`, `f6`, `f7`, `f8`, `cpm`, `nsv`, `usv`, `class`) VALUES ('$d1', '$d2', '$d3', '$d4', '$d5', '$d6', '$d7', '$d8', '$d9', '$d10', '$d11', '$d12', '$c')";
		if(mysqli_query($this->conn, $sql)){ return true; }else { return false; }
	}
	public function database_create_table(){
		// Create a new database table.
		$sql_create = "CREATE TABLE `$this->table`(		
							id int AUTO_INCREMENT PRIMARY KEY NOT NULL,
							weight varchar(255) NOT NULL,
							f1 varchar(255) NOT NULL,
							f2 varchar(255) NOT NULL,
							f3 varchar(255) NOT NULL,
							f4 varchar(255) NOT NULL,
							f5 varchar(255) NOT NULL,
							f6 varchar(255) NOT NULL,
							f7 varchar(255) NOT NULL,
							f8 varchar(255) NOT NULL,
							cpm varchar(255) NOT NULL,
							nsv varchar(255) NOT NULL,
							usv varchar(255) NOT NULL,
							`class` varchar(255) NOT NULL
					   );";
		if(mysqli_query($this->conn, $sql_create)) echo("<br><br>Database Table Created Successfully!");
	}
	public $class_names = ["Regulated", "Unsafe", "Hazardous"];
	
	// Count the registered data records (samples) in the given database table. 
	public function count_samples(){
		$count = [
			"total" => mysqli_num_rows(mysqli_query($this->conn, "SELECT * FROM `$this->table`")),
			"regulated" => mysqli_num_rows(mysqli_query($this->conn, "SELECT * FROM `$this->table` WHERE class='0'")),
			"unsafe" => mysqli_num_rows(mysqli_query($this->conn, "SELECT * FROM `$this->table` WHERE class='1'")),
			"hazardous" => mysqli_num_rows(mysqli_query($this->conn, "SELECT * FROM `$this->table` WHERE class='2'")),
		];
		return $count;
	}

[15.877, 0.25, 0.76, 0.57, 0.8, 1.89, 2.85, 4.65, 3.63, 0.8, 5.31, 0.53]

	public function create_sample_files($type){
		// Obtain the registered data records (samples) from the given database table.
		$sql = "SELECT * FROM `$this->table`";
		$result = mysqli_query($this->conn, $sql);
		$check = mysqli_num_rows($result);
		if($check > 0){
			while($row = mysqli_fetch_assoc($result)){
				// Scale (normalize) data items to define appropriately formatted inputs (samples).
				$scaled = [
					"weight" => $row["weight"] / 10,
					"f1" => $row["f1"] / 100,
					"f2" => $row["f2"] / 100,
					"f3" => $row["f3"] / 100,
					"f4" => $row["f4"] / 100,
					"f5" => $row["f5"] / 100,
					"f6" => $row["f6"] / 100,
					"f7" => $row["f7"] / 100,
					"f8" => $row["f8"] / 100,
					"cpm" => $row["cpm"] / 100,
					"nsv" => $row["nsv"] / 100,
					"usv" => $row["usv"]
				];
				// Add the header as the first row.
				$processed_data = [
					['weight','f1','f2','f3','f4','f5','f6','f7','f8','cpm','nsv','usv'],
					[$scaled["weight"],$scaled["f1"],$scaled["f2"],$scaled["f3"],$scaled["f4"],$scaled["f5"],$scaled["f6"],$scaled["f7"],$scaled["f8"],$scaled["cpm"],$scaled["nsv"],$scaled["usv"]]
				];
				$filename = "data/".$this->class_names[$row["class"]].".".$type.".sample_".$row["id"].".csv";
				$f = fopen($filename, "w");
				foreach($processed_data as $r){
					fputcsv($f, $r);
				}
				fclose($f);
			}
		}
	}
	public function download_samples($zipname){
		if(count(scandir("data")) > 2){
			$zip = new ZipArchive;
			$zip->open($zipname, ZipArchive::CREATE);
			foreach(glob("data/*.csv") as $sample){
				$zip->addFile($sample);
			}
			$zip->close();

			header('Content-Type: application/zip');
			header("Content-Disposition: attachment; filename='$zipname'");
			header('Content-Length: ' . filesize($zipname));
			header("Location: $zipname");
		}else{
			header("Location: .");
			exit();
		}
	}
$server = array(
	"name" => "localhost",
	"username" => "root",
	"password" => "bot",
	"database" => "foodirradiation",
	"table" => "entries"

);

$conn = mysqli_connect($server["name"], $server["username"], $server["password"], $server["database"]);

📁 get_data.php

include_once "assets/class.php";

// Define the new 'food' object:
$food = new _main();
$food->__init__($conn, $server["table"]); 
if(isset($_GET["weight"]) && isset($_GET["F1"]) && isset($_GET["F2"]) && isset($_GET["F3"]) && isset($_GET["F4"]) && isset($_GET["F5"]) && isset($_GET["F6"]) && isset($_GET["F7"]) && isset($_GET["F8"]) && isset($_GET["CPM"]) && isset($_GET["nSv"]) && isset($_GET["uSv"]) && isset($_GET["class"])){
	if($food->insert_new_data($_GET["weight"], $_GET["F1"], $_GET["F2"], $_GET["F3"], $_GET["F4"], $_GET["F5"], $_GET["F6"], $_GET["F7"], $_GET["F8"], $_GET["CPM"], $_GET["nSv"], $_GET["uSv"], $_GET["class"])){
		echo("Data received and saved successfully!");
	}else{
		echo("Database error!");
	}
}else{
	echo("Waiting Data...");
}
if(isset($_GET["create_table"]) && $_GET["create_table"] == "OK") $food->database_create_table();

📁 index.php

	include_once "assets/class.php";
	
	// Define the new 'sample' object: 
	$sample = new sample();
	$sample->__init__($conn, $server["table"]);
$count = $sample->count_samples();
    if(isset($_POST["data"]) && $_POST["data"] != ""){
		$sample->create_sample_files($_POST["data"]);
	}
    if(isset($_GET["download"])){
		$sample->download_samples("data.zip");
	}	

Step 3: Setting up a LAMP web server on Raspberry Pi

Since I decided to host my web application on a Raspberry Pi 3, I needed to set up a LAMP web server.

sudo apt-get install apache2 -y

hostname -I

sudo apt-get update

sudo apt-get install php -y

sudo apt install php7.3-zip

  • upload_max_filesize

  • max_file_uploads
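
These are standard php.ini directives; example values (assumptions, adjust to the size of your data set and the generated ZIP file) could look like:

upload_max_filesize = 64M
max_file_uploads = 100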

sudo service apache2 restart

Step 3.1: Creating a MySQL database in MariaDB

Since I needed to log measurements transmitted by Beetle ESP32-C3 so as to create appropriately formatted samples for Edge Impulse, I also set up a MariaDB server on Raspberry Pi 3.

sudo apt-get install mariadb-server php-mysql -y

sudo mysql_secure_installation

sudo mysql -uroot -p

create database foodirradiation;

GRANT ALL PRIVILEGES ON foodirradiation.* TO 'root'@'localhost' IDENTIFIED BY 'bot';

FLUSH PRIVILEGES;

Step 3.2: Setting and running the web application on Raspberry Pi

As discussed above, I set up a LAMP web server on my Raspberry Pi 3 to run the web application, but you can run it on any server that supports PHP.

sudo mv /home/pi/Downloads/food_irradiation_data_logger /var/www/html/

sudo chmod -R 777 /var/www/html/food_irradiation_data_logger

💻 On the get_data.php file:

localhost/food_irradiation_data_logger/get_data.php

💻 On the index.php file:

Step 4: Setting up Beetle ESP32-C3 on the Arduino IDE

Before proceeding with the following steps, I needed to set up Beetle ESP32-C3 on the Arduino IDE and install the required libraries for this project.

If your computer cannot recognize Beetle ESP32-C3 when plugged in via a USB cable, connect Pin 9 to GND (pull-down) and try again.

https://raw.githubusercontent.com/espressif/arduino-esp32/gh-pages/package_esp32_index.json

Step 4.1: Displaying images on the SSD1309 transparent OLED screen

To display images (monochrome) on the SSD1309 transparent OLED screen successfully, I needed to convert PNG or JPG files into the XBM (X Bitmap Graphic) file format.

    u8g2.firstPage();  
    do{
      //u8g2.setBitmapMode(true /* transparent*/);
      u8g2.drawXBMP( /* x=*/36 , /* y=*/0 , /* width=*/50 , /* height=*/50 , data_colllect_bits);
    }while(u8g2.nextPage());
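
For reference, a converted XBM file is simply a C array (plus width and height defines) that can be passed to u8g2.drawXBMP. Below is a minimal, hypothetical 8x8 icon; the project's actual icons, such as data_colllect_bits, are larger (e.g. 50x50):

// Hypothetical 8x8 checkerboard icon in XBM format, stored in flash for U8g2.
#define example_icon_width  8
#define example_icon_height 8
static const unsigned char example_icon_bits[] U8X8_PROGMEM = {
  0xAA, 0x55, 0xAA, 0x55, 0xAA, 0x55, 0xAA, 0x55
};

// Drawing it at the top-left corner of the screen:
// u8g2.drawXBMP(0, 0, example_icon_width, example_icon_height, example_icon_bits);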

Step 5: Collecting and storing food irradiation data w/ Beetle ESP32-C3

After setting up Beetle ESP32-C3 and installing the required libraries, I programmed Beetle ESP32-C3 to collect ionizing radiation, weight, and visible light (color) measurements in order to store them on the MySQL database and create appropriately formatted samples for Edge Impulse.

  • CPM (Counts per Minute)

  • nSv/h (nanoSieverts per hour)

  • μSv/h (microSieverts per hour)

  • Weight (g)

  • F1 (405 - 425 nm)

  • F2 (435 - 455 nm)

  • F3 (470 - 490 nm)

  • F4 (505 - 525 nm)

  • F5 (545 - 565 nm)

  • F6 (580 - 600 nm)

  • F7 (620 - 640 nm)

  • F8 (670 - 690 nm)

Since I needed to theoretically assign food irradiation dose levels (classes) as labels for each data record while collecting data from foods to create a valid data set, I utilized the control buttons attached to Beetle ESP32-C3 to choose among the irradiation dose classes. After an irradiation dose class is selected, Beetle ESP32-C3 appends the selected class to the collected data and then transmits that data packet to the web application.

  • Control Button (A) ➡ Regulated

  • Control Button (B) ➡ Unsafe

  • Control Button (C) ➡ Hazardous

You can download the IoT_food_irradiation_data_collect.ino file to try and inspect the code for collecting ionizing radiation, weight, and visible light (color) measurements and for transferring information to a given web application.

#include <WiFi.h>
#include <DFRobot_Geiger.h>
#include <DFRobot_HX711_I2C.h>
#include <U8g2lib.h>
#include <SPI.h>
#include "DFRobot_AS7341.h"
char ssid[] = "<_SSID_>";        // your network SSID (name)
char pass[] = "<_PASSWORD_>";    // your network password (use for WPA, or use as key for WEP)
int keyIndex = 0;                // your network key Index number (needed only for WEP)

// Define the server (Raspberry Pi).
char server[] = "192.168.1.20";
// Define the web application path.
String application = "/food_irradiation_data_logger/get_data.php";

// Initialize the WiFi client library.
WiFiClient client; /* WiFiSSLClient client; */
DFRobot_Geiger geiger(5);

// Define the HX711 weight sensor.
DFRobot_HX711_I2C MyScale;

// Define the AS7341 object.
DFRobot_AS7341 as7341;
// Define AS7341 data objects:
DFRobot_AS7341::sModeOneData_t data1;
DFRobot_AS7341::sModeTwoData_t data2;
#define OLED_DC  1
#define OLED_CS  7
#define OLED_RST 2

U8G2_SSD1309_128X64_NONAME2_1_4W_HW_SPI u8g2(/* rotation=*/U8G2_R0, /* cs=*/ OLED_CS, /* dc=*/ OLED_DC,/* reset=*/OLED_RST);
  u8g2.begin();
  u8g2.setFontPosTop();
  //u8g2.setDrawColor(0);
void err_msg(){
  // Show the error message on the SSD1309 transparent display.
  u8g2.firstPage();  
  do{
    //u8g2.setBitmapMode(true /* transparent*/);
    u8g2.drawXBMP( /* x=*/44 , /* y=*/0 , /* width=*/40 , /* height=*/40 , error_bits);
    u8g2.setFont(u8g2_font_4x6_tr);
    u8g2.drawStr(0, 47, "Check the serial monitor to see");
    u8g2.drawStr(40, 55, "the error!");
  }while(u8g2.nextPage());
}
  while (!MyScale.begin()) {
    Serial.println("HX711 initialization is failed!");
    err_msg();
    delay(1000);
  }
  Serial.println("HX711 initialization is successful!");
  MyScale.setCalWeight(100);
  // Set the calibration threshold (g).
  MyScale.setThreshold(30);
  // Display the current calibration value. 
  Serial.print("\nCalibration Value: "); Serial.println(MyScale.getCalibration());
  MyScale.setCalibration(MyScale.getCalibration());
  delay(1000);
  while (as7341.begin() != 0) {
    Serial.println("AS7341 initialization is failed!");
    err_msg();
    delay(1000);
  }
  Serial.println("AS7341 initialization is successful!");

  // Enable the built-in LED on the AS7341 sensor.
  as7341.enableLed(true);
  WiFi.begin(ssid, pass);
  // Attempt to connect to the WiFi network:
  while(WiFi.status() != WL_CONNECTED){
    // Wait for the connection:
    delay(500);
    Serial.print(".");
  }
  // If connected to the network successfully:
  Serial.println("Connected to the WiFi network successfully!");
  u8g2.firstPage();  
  do{
    u8g2.setFont(u8g2_font_open_iconic_all_8x_t);
    u8g2.drawGlyph(/* x=*/32, /* y=*/0, /* encoding=*/247);  
  }while(u8g2.nextPage());
  delay(2000);
void get_Weight(){
  weight = MyScale.readWeight();
  if(weight < 0.5) weight = 0;
  Serial.print("\nWeight: "); Serial.print(weight); Serial.println(" g");
  delay(1000);
}
  • eF1F4ClearNIR

  • eF5F8ClearNIR

void get_Visual_Light(){
  // Start spectrum measurement:
  // Channel mapping mode: 1.eF1F4ClearNIR
  as7341.startMeasure(as7341.eF1F4ClearNIR);
  // Read the value of sensor data channel 0~5, under eF1F4ClearNIR
  data1 = as7341.readSpectralDataOne();
  // Channel mapping mode: 2.eF5F8ClearNIR
  as7341.startMeasure(as7341.eF5F8ClearNIR);
  // Read the value of sensor data channel 0~5, under eF5F8ClearNIR
  data2 = as7341.readSpectralDataTwo();
  // Print data:
  Serial.print("\nF1(405-425nm): "); Serial.println(data1.ADF1);
  Serial.print("F2(435-455nm): "); Serial.println(data1.ADF2);
  Serial.print("F3(470-490nm): "); Serial.println(data1.ADF3);
  Serial.print("F4(505-525nm): "); Serial.println(data1.ADF4);
  Serial.print("F5(545-565nm): "); Serial.println(data2.ADF5);
  Serial.print("F6(580-600nm): "); Serial.println(data2.ADF6);
  Serial.print("F7(620-640nm): "); Serial.println(data2.ADF7);
  Serial.print("F8(670-690nm): "); Serial.println(data2.ADF8);
  // CLEAR and NIR:
  Serial.print("Clear_1: "); Serial.println(data1.ADCLEAR);
  Serial.print("NIR_1: "); Serial.println(data1.ADNIR);
  Serial.print("Clear_2: "); Serial.println(data2.ADCLEAR);
  Serial.print("NIR_2: "); Serial.println(data2.ADNIR);
  delay(1000);
}
void activate_Geiger_counter(){
  // Initialize the Geiger counter module and enable the external interrupt.
  geiger.start();
  delay(3000);
  // If necessary, pause the count and turn off the external interrupt trigger.
  geiger.pause();
  
  // Evaluate the current CPM (Counts per Minute) by dropping the edge pulse within 3 seconds: the error is ±3CPM.
  Serial.print("\nCPM: "); Serial.println(geiger.getCPM());
  // Get the current nSv/h (nanoSieverts per hour).
  Serial.print("nSv/h: "); Serial.println(geiger.getnSvh());
  // Get the current μSv/h (microSieverts per hour).
  Serial.print("μSv/h: "); Serial.println(geiger.getuSvh());
}
void drawNumber(int x, int y, int __){
    char buf[7];
    u8g2.drawStr(x, y, itoa(__, buf, 10));
}
void home_screen(int y, int x, int s){
  u8g2.firstPage();  
  do{
    u8g2.setFont(u8g2_font_open_iconic_all_2x_t);
    u8g2.drawGlyph(/* x=*/0, /* y=*/y-3, /* encoding=*/142);
    u8g2.drawGlyph(/* x=*/0, /* y=*/y+s-3, /* encoding=*/259);
    u8g2.drawGlyph(/* x=*/0, /* y=*/y+(2*s)-3, /* encoding=*/280);
    u8g2.setFont(u8g2_font_freedoomr10_mu);
    u8g2.drawStr(25, y, "WEIGHT:"); drawNumber(x, y, weight);
    u8g2.drawStr(25, y+s, "F1:"); drawNumber(x, y+s, data1.ADF1);
    u8g2.drawStr(25, y+(2*s), "CPM:"); drawNumber(x, y+(2*s), geiger.getCPM());
  }while(u8g2.nextPage());
}
void make_a_get_request(String _class){
  // Connect to the web application named food_irradiation_data_logger. Change '80' with '443' if you are using SSL connection.
  if (client.connect(server, 80)){
    // If successful:
    Serial.println("\nConnected to the web application successfully!");
    // Create the query string:
    String query = application+"?weight="+String(weight)+"&F1="+data1.ADF1+"&F2="+data1.ADF2+"&F3="+data1.ADF3+"&F4="+data1.ADF4+"&F5="+data2.ADF5+"&F6="+data2.ADF6+"&F7="+data2.ADF7+"&F8="+data2.ADF8;
    query += "&CPM="+String(geiger.getCPM())+"&nSv="+String(geiger.getnSvh())+"&uSv="+String(geiger.getuSvh());
    query += "&class="+_class;
    // Make an HTTP Get request:
    client.println("GET " + query + " HTTP/1.1");
    client.println("Host: 192.168.1.20");
    client.println("Connection: close");
    client.println();
  }else{
    Serial.println("\nConnection failed to the web application!");
    err_msg();
  }
  delay(2000); // Wait 2 seconds after connecting...
  // If there are incoming bytes available, get the response from the web application.
  String response = "";
  while (client.available()) { char c = client.read(); response += c; }
  if(response != "" && response.indexOf("Data received and saved successfully!") > 0){
    Serial.println("Data registered successfully!");
    u8g2.firstPage();  
    do{
      //u8g2.setBitmapMode(true /* transparent*/);
      u8g2.drawXBMP( /* x=*/36 , /* y=*/0 , /* width=*/50 , /* height=*/50 , data_colllect_bits);
      u8g2.setFont(u8g2_font_4x6_tr);
      u8g2.drawStr(6, 55, "Data registered successfully!");
    }while(u8g2.nextPage());
  }
}
  if(!digitalRead(button_A)) make_a_get_request("0");
  if(!digitalRead(button_B)) make_a_get_request("1");
  if(!digitalRead(button_C)) make_a_get_request("2");

Step 5.1: Logging the collected data into the MySQL database

After uploading and running the code for collecting data and transmitting data packets to the web application on Beetle ESP32-C3:

  • WEIGHT (g)

  • F1 (405 - 425 nm)

  • CPM (Counts per Minute)

  • Control Button (A) ➡ Regulated [0]

  • Control Button (B) ➡ Unsafe [1]

  • Control Button (C) ➡ Hazardous [2]

As far as my experiments go, the device operates impeccably while collecting measurements and transmitting data packets to a given web application :)

Step 5.2: Creating samples from data records with the web application

After logging ionizing radiation, weight, and visible light (color) measurements in the MySQL database from a motley collection of foods, exposed to sun rays as a natural source of radiation for estimated periods, I obtained a valid data set for my model.

📌Foods:

  • Pasta

  • Corn kernel

  • Herb

  • Apple

  • Wheat

  • Animal (livestock) feed

As explained in Step 2, I generated a CSV file (sample) for each data record in the MySQL database by utilizing the web application.

📌 Training samples:

📌 Testing samples:

Step 6: Building a neural network model with Edge Impulse

When I completed collating my food irradiation dose data set and assigning labels, I started to work on my artificial neural network (ANN) model to make predictions on food irradiation dose levels (classes) based on ionizing radiation, weight, and visible light (color) measurements.

Since Edge Impulse supports almost every microcontroller and development board through its model deployment options, I decided to utilize Edge Impulse to build my artificial neural network model. Also, Edge Impulse makes scaling embedded ML applications easier and faster for edge devices such as Beetle ESP32-C3.

Although Edge Impulse supports uploading samples as CSV files, the data type must be time series to upload all data records in a single file. Therefore, I needed to follow the steps below to format my data set so as to train my model accurately:

  • Data Scaling (Normalizing)

  • Data Preprocessing

As explained in the previous steps, I utilized the web application to scale (normalize) and preprocess data records to create CSV files (samples) for Edge Impulse.

Since the assigned classes are stored under the class data field in the MySQL database, I preprocessed my data set effortlessly to obtain labels for each data record while generating samples:

  • 0 — Regulated

  • 1 — Unsafe

  • 2 — Hazardous

Conveniently, Edge Impulse can automatically build predictive models optimized in size and accuracy and deploy the trained model as an Arduino library. Therefore, after scaling (normalizing) and preprocessing my data set to create samples, I was able to build an accurate neural network model to forecast food irradiation dose levels and run it on Beetle ESP32-C3 effortlessly.

Step 6.1: Uploading samples to Edge Impulse

After generating training and testing samples successfully, I uploaded them to my project on Edge Impulse.

Step 6.2: Training the model on food irradiation dose levels

After uploading my training and testing samples successfully, I designed an impulse and trained it on food irradiation dose levels (classes).

In Edge Impulse, an impulse is the machine learning pipeline that takes raw data, applies signal processing, and uses a learning block to classify new data. I created my impulse by employing the Raw Data processing block and the Classification learning block.

The Raw Data block generates windows from data samples without applying any specific signal processing.

The Classification learning block represents a Keras neural network model. It also lets the user change the model settings, architecture, and layers.

According to my experiments with my neural network model, I modified classification model settings, architecture, and layers to build a neural network model with high accuracy and validity:

📌 Neural network settings:

  • Number of training cycles ➡ 50

  • Learning rate ➡ 0.0006

  • Validation set size ➡ 10

📌 Extra layers:

  • Dense layer (64 neurons)

  • Dense layer (32 neurons)

After generating features and training my model with training samples, Edge Impulse evaluated the precision score (accuracy) as 100%.

The precision score is approximately 100% due to the small volume and limited variety of my training samples. In technical terms, the model overfits the training data set. Therefore, I am still collecting data to improve my training data set.

Step 6.3: Evaluating the model accuracy and deploying the model

After building and training my neural network model, I tested its accuracy and validity by utilizing testing samples.

The evaluated accuracy of the model is 96.30%.

After validating my neural network model, I deployed it as a fully optimized and customizable Arduino library.

Step 7: Setting up the Edge Impulse model on Beetle ESP32-C3

After building, training, and deploying my model as an Arduino library on Edge Impulse, I needed to upload and run the Arduino library on Beetle ESP32-C3 directly so as to create an easy-to-use and capable device operating with minimal latency and power consumption.

Since Edge Impulse optimizes and formats signal processing, configuration, and learning blocks into a single package while deploying models as Arduino libraries, I was able to import my model effortlessly to run inferences.

#include <IoT_AI-driven_Food_Irradiation_Classifier_inferencing.h>

After importing my model successfully to the Arduino IDE, I employed the control button (B) attached to Beetle ESP32-C3 to run inferences so as to predict food irradiation dose levels:

  • Press ➡ Run Inference

You can download the IoT_food_irradiation_run_model.ino file to try and inspect the code for running Edge Impulse neural network models on Beetle ESP32-C3.

You can inspect the corresponding functions and settings in Step 5.

#include <DFRobot_Geiger.h>
#include <DFRobot_HX711_I2C.h>
#include <U8g2lib.h>
#include <SPI.h>
#include "DFRobot_AS7341.h"

// Include the Edge Impulse model converted to an Arduino library:
#include <IoT_AI-driven_Food_Irradiation_Classifier_inferencing.h>
#define FREQUENCY_HZ        EI_CLASSIFIER_FREQUENCY
#define INTERVAL_MS         (1000 / (FREQUENCY_HZ + 1))

// Define the features array to classify one frame of data.
float features[EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE];
size_t feature_ix = 0;
  • Regulated

  • Unsafe

  • Hazardous

float threshold = 0.60;

// Define the food irradiation dose (class) names:
String classes[] = {"Hazardous", "Regulated", "Unsafe"};
static const unsigned char *class_icons[] U8X8_PROGMEM = {hazardous_bits, regulated_bits, unsafe_bits};
void run_inference_to_make_predictions(int multiply){
  // Scale (normalize) data items depending on the given model:
  float scaled_weight = weight / 10;
  float scaled_F1 = data1.ADF1 / 100;
  float scaled_F2 = data1.ADF2 / 100;
  float scaled_F3 = data1.ADF3 / 100;
  float scaled_F4 = data1.ADF4 / 100;
  float scaled_F5 = data2.ADF5 / 100;
  float scaled_F6 = data2.ADF6 / 100;
  float scaled_F7 = data2.ADF7 / 100;
  float scaled_F8 = data2.ADF8 / 100;
  float scaled_CPM = geiger.getCPM() / 100;
  float scaled_nSv = geiger.getnSvh() / 100;
  float scaled_uSv = geiger.getuSvh();
  
  // Copy the scaled data items to the features buffer.
  // If required, multiply the scaled data items while copying them to the features buffer.
  for(int i=0; i<multiply; i++){  
    features[feature_ix++] = scaled_weight;
    features[feature_ix++] = scaled_F1;
    features[feature_ix++] = scaled_F2;
    features[feature_ix++] = scaled_F3;
    features[feature_ix++] = scaled_F4;
    features[feature_ix++] = scaled_F5;
    features[feature_ix++] = scaled_F6;
    features[feature_ix++] = scaled_F7;
    features[feature_ix++] = scaled_F8;
    features[feature_ix++] = scaled_CPM;
    features[feature_ix++] = scaled_nSv;
    features[feature_ix++] = scaled_uSv;
  }

  // Display the progress of copying data to the features buffer.
  Serial.print("\nFeatures Buffer Progress: "); Serial.print(feature_ix); Serial.print(" / "); Serial.println(EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE);
  
  // Run inference:
  if(feature_ix == EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE){    
    ei_impulse_result_t result;
    // Create a signal object from the features buffer (frame).
    signal_t signal;
    numpy::signal_from_buffer(features, EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE, &signal);
    // Run the classifier:
    EI_IMPULSE_ERROR res = run_classifier(&signal, &result, false);
    ei_printf("\nrun_classifier returned: %d\n", res);
    if(res != 0) return;

    // Print the inference timings on the serial monitor.
    ei_printf("Predictions (DSP: %d ms., Classification: %d ms., Anomaly: %d ms.): \n", 
        result.timing.dsp, result.timing.classification, result.timing.anomaly);

    // Obtain the prediction results for each label (class).
    for(size_t ix = 0; ix < EI_CLASSIFIER_LABEL_COUNT; ix++){
      // Print the prediction results on the serial monitor.
      ei_printf("%s:\t%.5f\n", result.classification[ix].label, result.classification[ix].value);
      // Get the predicted label (class).
      if(result.classification[ix].value >= threshold) predicted_class = ix;
    }
    Serial.print("\nPredicted Class: "); Serial.println(predicted_class);

    // Detect anomalies, if any:
    #if EI_CLASSIFIER_HAS_ANOMALY == 1
      ei_printf("Anomaly : \t%.3f\n", result.anomaly);
    #endif

    // Clear the features buffer (frame):
    feature_ix = 0;
  }
}
  if(!digitalRead(button_B)){
    model_activation = true;
    u8g2.firstPage();  
    do{
      u8g2.setFont(u8g2_font_open_iconic_all_8x_t);
      u8g2.drawGlyph(/* x=*/32, /* y=*/0, /* encoding=*/233);  
    }while(u8g2.nextPage());
  }
  while(model_activation){
    get_Weight();
    get_Visual_Light();
    activate_Geiger_counter();

    // Run inference:
    run_inference_to_make_predictions(1);

    // If the Edge Impulse model predicted a label (class) successfully:
    if(predicted_class != -1){
      // Display the predicted class:
      String c = "Class: " + classes[predicted_class];
      int str_x = c.length() * 4;
      u8g2.firstPage();  
      do{
        //u8g2.setBitmapMode(true /* transparent*/);
        u8g2.drawXBMP( /* x=*/(u8g2.getDisplayWidth()-50)/2 , /* y=*/0 , /* width=*/50 , /* height=*/50 , class_icons[predicted_class]);
        u8g2.setFont(u8g2_font_4x6_tr);
        u8g2.drawStr((u8g2.getDisplayWidth()-str_x)/2, 55, c.c_str());
      }while(u8g2.nextPage());      
      
      // Clear the predicted class (label).
      predicted_class = -1;

      // Stop the running inference and return to the home screen.
      model_activation = false;
    }
  }

Step 8: Running the model on Beetle ESP32-C3 to make predictions on food irradiation doses

When the features array (buffer) is full of data items, my Edge Impulse neural network model predicts probabilities for the labels (food irradiation dose classes) for the given features buffer as an array of 3 numbers. They represent the model's "confidence" that the given features buffer corresponds to each of the three food irradiation dose levels (classes) [0 - 2] based on ionizing radiation, weight, and visible light (color) measurements, as shown in Step 6 (an example of the resulting serial output follows the list below):

  • 0 — Regulated

  • 1 — Unsafe

  • 2 — Hazardous
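
For illustration, a single inference then produces serial output roughly like this (hypothetical values; the format follows the ei_printf and Serial.print calls in the code shown in Step 7):

run_classifier returned: 0
Predictions (DSP: 0 ms., Classification: 1 ms., Anomaly: 0 ms.):
Hazardous:	0.02110
Regulated:	0.95471
Unsafe:	0.02419

Predicted Class: 1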

After executing the IoT_food_irradiation_run_model.ino file on Beetle ESP32-C3:

  • WEIGHT (g)

  • F1 (405 - 425 nm)

  • CPM (Counts per Minute)

  • Regulated

  • Unsafe

  • Hazardous

As far as my experiments go, the device predicts food irradiation dose levels (classes) accurately by employing the collected measurements :)

Videos and Conclusion

After completing all steps above and experimenting, I have employed the device to predict and detect food irradiation dose levels of various foods and food packaging so as to check whether they conform to health and safety standards regarding food irradiation.

Further Discussions

By applying neural network models trained on ionizing radiation, weight, and visible light (color) measurements to detect food irradiation dose levels, we can[^3]:

References

[^1] Vanee Komolprasert. "CHAPTER 6: PACKAGING FOR FOODS TREATED BY IONIZING RADIATION." Packaging for Nonthermal Processing of Food. Blackwell Publishing, First edition, 2007. 87 - 88.

[^2] Ana Paula Dionísio, Renata Takassugui Gomes, and Marília Oetterer. Ionizing Radiation Effects on Food Vitamins – A Review. Braz. Arch. Biol. Technol. v.52 n.5: pp. 1267-1278, Sept/Oct 2009

[^3] Kim M. Morehouse and Vanee Komolprasert. Overview of Irradiation of Food and Packaging. ACS Symposium Series 875, Irradiation of Food and Packaging, 2004, Chapter 1, Pages 1-11. https://www.fda.gov/food/irradiation-food-packaging/overview-irradiation-food-and-packaging.

Huge thanks to for sponsoring these products:

Beetle ESP32-C3 |

Gravity: Geiger Counter Module |

Gravity: I2C 1Kg Weight Sensor Kit |

Gravity: AS7341 11-Channel Visible Light Sensor |

Fermion: 1.51” OLED Transparent Display |

If you want to purchase products from DFRobot, you can use .

Also, huge thanks to for sending me a .

Before the first use, remove unnecessary cable ties and apply grease to the rails.

Test the nozzle and hot bed temperatures.

Go to Settings ➡ Leveling and adjust four predefined points by utilizing the leveling nuts.

Finally, attach the spool holder and feed the extruder with the filament.

Since the CR-200B is not officially supported by Cura, select the Ender-3 profile and change the build size to 200 x 200 x 200 mm. Also, to compensate for the nozzle placement, set the Nozzle offset X and Y values to -10 mm on the Extruder 1 tab.

First of all, I soldered male pin headers to Beetle ESP32-C3 and its expansion board.

Then, to collect ionizing radiation, weight, and color (visible light) measurements, I connected a Geiger counter module (Gravity), an I2C HX711 weight sensor (Gravity), and an AS7341 11-channel visible light sensor (Gravity) to Beetle ESP32-C3. Since the expansion board provides the GDI display interface for DFRobot screens, I was able to connect the SSD1309 transparent OLED screen to Beetle ESP32-C3 via the expansion board.

After assembling the weight sensor kit, to calibrate the weight sensor and get accurate measurements, press the cal button on the adapter board. Then, wait for the indicator LED to turn on and place a 100 g (default value) object on the scale within 5 seconds. When the adapter board completes calibration, the indicator LED blinks three times.

Since Beetle ESP32-C3 cannot power all of the attached peripherals, including the weight sensor, simultaneously due to its working current limit, I connected a USB buck-boost converter board to my Xiaomi power bank to provide a stable 3.3 V supply for the sensors.

Define the _main class and its functions:

In the init function, define the required variables for the MySQL database.

In the insert_new_data function, append the given measurements and food irradiation dose class to the given database table.

In the database_create_table function, create the required database table.

Define the sample class, extending the _main class, and its functions:

Define the food irradiation dose class (label) names.

In the count_samples function, count the registered data records (samples) in the given database table.

In the create_sample_files function:

Obtain the registered data records from the given database table.

Scale (normalize) data items to define appropriately formatted inputs in the range of 0-1.

Define the header indicating data elements.

Create an array with the scaled data items.

For each data record, create a CSV file (sample) named with the assigned irradiation dose class and identified with the unique row number under the id data field.

Each sample includes twelve data items [shape=(12,)]:

In the download_samples function, download all generated CSV files (samples) in the ZIP file format.

Define the required MySQL database connection settings for Raspberry Pi.

Include the class.php file.

Define the food object of the _main class with its required parameters.

Obtain the transferred information from Beetle ESP32-C3.

Then, insert the received measurements into the given database table.

If requested, create the required database table (entries).

Include the class.php file.

Define the sample object of the sample class with its required parameters.

Obtain the total number of data records (samples) for each class (label) in the given database table.

If the user requests via the HTML form, create a CSV file (sample) for each data record in the given database table, depending on the selected data type: training or testing.

If the Download button is clicked, download all generated CSV files (samples) in the ZIP file format.

First of all, open a terminal window by selecting Accessories ➡ Terminal from the menu.

Then, install the apache2 package by typing the following command into the terminal and pressing Enter:

After installing the apache2 package successfully, open Chromium Web Browser and navigate to localhost so as to test the web server.

Then, enter the command below into the terminal to obtain the Raspberry Pi's IP address:

To install the latest package versions successfully, update the Pi. Then, install the PHP package by entering the commands below into the terminal:

To be able to create files in the ZIP file format with the web application, install the php-zip package:

Since the web application creates a large ZIP file with the generated CSV files (samples), open the php.ini file in order to modify these configurations:

Then, restart the apache server to activate the installed packages on the web server:

First of all, install the MariaDB (MySQL) server and PHP-MySQL packages by entering the following command into the terminal:

To create a new user, run the MySQL secure installation command in the terminal window:

When requested, type the current password for the root user (enter for none). Then, press Enter.

Type in Y and press Enter to set the root password.

Type in bot at the New password: prompt, and press Enter.

Type in Y to remove anonymous users.

Type in Y to disallow root login remotely.

Type in Y to remove the test database and its access permissions.

Type in Y to reload privilege tables.

After successfully setting up the MariaDB server, the terminal prints: All done! Thanks for using MariaDB!

Finally, to create a new database in the MariaDB server, run the MySQL interface in the terminal:

Then, enter the recently changed root password - bot.

When the terminal shows the MariaDB [(none)]> prompt, create the new database (foodirradiation) by utilizing these commands below:

Press Ctrl + D to exit the MariaDB [(none)]> prompt.

First of all, download and extract the food_irradiation_data_logger.zip folder.

Then, move the application folder (food_irradiation_data_logger) to the Apache server (/var/www/html) by using the terminal since the Apache server is a protected location.

Since the Apache server is a protected location, it throws an error while attempting to modify the files and folders in it. Therefore, before utilizing the web application to create CSV files (samples) and download them in the ZIP file format, change the web application's folder permission by using the terminal:

If the web application has not received measurements from Beetle ESP32-C3 via an HTTP GET request, it prints: Waiting Data...

Otherwise, the web application prints: Data received and saved successfully!

If the create_table parameter is set as OK, the web application creates the requested database table (entries) and prints: Database Table Created Successfully!

The application interface shows created sample names and data record numbers for each class in the MySQL database.

If the user clicks the Create Samples submit button on the HTML form, the web application generates CSV files (samples) for Edge Impulse, depending on the selected data type (training or testing).

If the user clicks the Download button, the application downloads all generated CSV files (samples) in the ZIP file format (data.zip).

To add the ESP32-C3 board package to the Arduino IDE, navigate to File ➡ Preferences and paste the URL below (the standard ESP32 package index) under Additional Boards Manager URLs: https://raw.githubusercontent.com/espressif/arduino-esp32/gh-pages/package_esp32_index.json

Then, to install the required core, navigate to Tools ➡ Board ➡ Boards Manager and search for esp32.

After installing the core, navigate to Tools ➡ Board ➡ ESP32 Arduino and select ESP32C3 Dev Module.

To print data on the serial monitor, enable USB CDC On Boot after selecting Beetle ESP32-C3 (ESP32C3 Dev Module) as the board.

Finally, download the required libraries for the Geiger counter module, the I2C HX711 weight sensor, the AS7341 visible light sensor, and the SSD1309 OLED transparent screen:

DFRobot_Geiger | DFRobot_HX711_I2C | DFRobot_AS7341 | U8g2_Arduino

First of all, download GIMP.

Then, open a black-and-white image and go to Image ➡ Scale Image... to resize it.

Go to Image ➡ Mode and select Grayscale.

Finally, export the image as an XBM file.

After exporting the image, add the generated data array to the code and print it on the screen.
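As a rough illustration of this step, the sketch below draws an exported XBM data array on the SSD1309 screen with the U8g2 library. The display constructor variant, pin numbers, bitmap name, and icon contents are placeholder assumptions rather than the project's exact values.

```cpp
// Minimal sketch: render a GIMP-exported XBM icon on the SSD1309 OLED via U8g2.
// The constructor variant, pin numbers, and bitmap contents below are placeholders.
#include <Arduino.h>
#include <SPI.h>
#include <U8g2lib.h>

// 128x64 SSD1309 over 4-wire hardware SPI (cs, dc, reset pins are assumptions).
U8G2_SSD1309_128X64_NONAME0_F_4W_HW_SPI u8g2(U8G2_R0, /* cs=*/ 7, /* dc=*/ 2, /* reset=*/ 1);

// Replace with the data array generated by GIMP when exporting the image as XBM.
#define icon_width  8
#define icon_height 8
static const unsigned char icon_bits[] U8X8_PROGMEM = {
  0x3C, 0x42, 0xA5, 0x81, 0xA5, 0x99, 0x42, 0x3C   // tiny 8x8 stand-in bitmap
};

void setup() {
  u8g2.begin();
}

void loop() {
  u8g2.clearBuffer();                                        // clear the frame buffer
  u8g2.drawXBMP(0, 0, icon_width, icon_height, icon_bits);   // draw the monochrome icon
  u8g2.setFont(u8g2_font_6x10_tf);
  u8g2.drawStr(16, 10, "Dose Detector");                     // optional text label
  u8g2.sendBuffer();                                         // push the buffer to the screen
  delay(1000);
}
```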

Include the required libraries.

Define the Wi-Fi network settings and use the WiFiClient class to create TCP connections.

Define the Geiger counter module.

Define the I2C HX711 weight sensor.

Define the AS7341 visible light sensor settings and objects.

Define the 1.51” SSD1309 OLED transparent display settings.

Define monochrome graphics. Initialize the SSD1309 OLED transparent display.

In the err_msg function, display the error message on the SSD1309 OLED transparent screen.

Check the connection status between the weight (HX711) sensor and Beetle ESP32-C3.

Set the calibration weight (g) and threshold (g) to calibrate the weight sensor automatically.

Display the current calibration value on the serial monitor.

Check the connection status between the AS7341 visible light sensor and Beetle ESP32-C3. Then, enable the built-in LED on the AS7341 sensor.

Initialize the Wi-Fi module.

Attempt to connect to the given Wi-Fi network.
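A minimal sketch of this connection routine, using the standard ESP32 WiFi.h API, might look like the following; the SSID and password are placeholders rather than the project's actual credentials.

```cpp
// Minimal Wi-Fi connection sketch for Beetle ESP32-C3 (standard WiFi.h API).
#include <WiFi.h>

const char *ssid     = "<your_ssid>";       // placeholder network credentials
const char *password = "<your_password>";

void setup() {
  Serial.begin(115200);
  WiFi.mode(WIFI_STA);                      // station mode
  WiFi.begin(ssid, password);
  Serial.print("Connecting to Wi-Fi");
  while (WiFi.status() != WL_CONNECTED) {   // wait until the module joins the network
    delay(500);
    Serial.print(".");
  }
  Serial.print("\nConnected! IP address: ");
  Serial.println(WiFi.localIP());
}

void loop() {}
```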

In the get_Weight function, obtain the weight (g) measurement generated by the I2C HX711 weight sensor.
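The calibration and reading steps above follow DFRobot's published HX711 examples; a minimal sketch along those lines is shown below. The method names (begin, setCalWeight, setThreshold, getCalibration, setCalibration, readWeight) and the values are assumptions based on those examples, not the project's exact code.

```cpp
// Minimal weight-reading sketch for the Gravity I2C 1Kg weight sensor kit (HX711).
// Assumption: DFRobot_HX711_I2C exposes the calls below, in line with DFRobot's examples.
#include <DFRobot_HX711_I2C.h>

DFRobot_HX711_I2C scale;          // default I2C address

void setup() {
  Serial.begin(115200);
  while (!scale.begin()) {        // check the connection to the adapter board
    Serial.println("HX711 initialization failed, retrying...");
    delay(1000);
  }
  scale.setCalWeight(100);        // calibration weight in grams (the kit's default)
  scale.setThreshold(30);         // trigger threshold in grams for automatic calibration
  Serial.print("Current calibration value: ");
  Serial.println(scale.getCalibration());
  scale.setCalibration(scale.getCalibration());   // apply the stored calibration value
}

void loop() {
  // get_Weight equivalent: read the current weight in grams.
  Serial.print("Weight: ");
  Serial.print(scale.readWeight(), 1);
  Serial.println(" g");
  delay(1000);
}
```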

In the get_Visual_Light function, start spectrum measurement with the AS7341 sensor and read the value of sensor data channel 0~5 under these channel mapping modes:

In the activate_Geiger_counter function:

Initialize the Geiger counter module and enable the external interrupt.

Every three seconds, pause the count to turn off the external interrupt trigger.

Estimate the current CPM (counts per minute) from the edge pulses counted during the three-second window; the error is ±3 CPM.

Obtain the current nSv/h (nanoSieverts per hour).

Obtain the current μSv/h (microSieverts per hour).
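Putting the Geiger counter steps together, a minimal reading loop might look like the sketch below. It assumes the DFRobot_Geiger library exposes start(), pause(), getCPM(), getnSvh(), and getuSvh() as implied above; the signal pin is a placeholder.

```cpp
// Minimal reading sketch for the Gravity Geiger counter module.
// Assumption: DFRobot_Geiger provides the calls below, as implied by the steps above.
#include <DFRobot_Geiger.h>

#define GEIGER_PIN 0                // digital pin wired to the module's signal output (placeholder)
DFRobot_Geiger geiger(GEIGER_PIN);

void setup() {
  Serial.begin(115200);
  geiger.start();                   // start counting: enables the external interrupt
}

void loop() {
  delay(3000);                      // accumulate pulses for three seconds
  geiger.pause();                   // pause the count to turn off the interrupt trigger
  Serial.print("CPM: ");   Serial.println(geiger.getCPM());    // counts per minute (±3 CPM)
  Serial.print("nSv/h: "); Serial.println(geiger.getnSvh());   // estimated dose rate
  Serial.print("uSv/h: "); Serial.println(geiger.getuSvh());
  geiger.start();                   // resume counting for the next window
}
```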

In the drawNumber function, convert numbers to char arrays with the itoa function so as to display them on the SSD1309 OLED transparent screen.

In the home_screen function, display the collected data on the SSD1309 OLED transparent screen.

In the make_a_get_request function:

Connect to the web application named food_irradiation_data_logger.

Create the query string with the collected data.

Make an HTTP GET request with the data parameters to the web application.

Wait until the client is available, then fetch the response from the web application.

If there is a response from the server and the web application appends the transferred data packet to the MySQL database successfully, print Data registered successfully! on the serial monitor and the SSD1309 screen.

According to the pressed control button (A, B, or C), transmit the data packet to the given web application, including the selected food irradiation dose class.
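To make the request flow concrete, here is a minimal sketch of make_a_get_request built on the standard WiFiClient API. The server address, application path, query parameter names, and the dummy readings in setup() are placeholders, not the project's actual values.

```cpp
// Minimal sketch: send the collected measurements and the selected class to the
// web application as HTTP GET query parameters (placeholder names and values).
#include <WiFi.h>

WiFiClient client;

const char *server   = "192.168.1.20";                        // Raspberry Pi IP (placeholder)
const char *app_path = "/food_irradiation_data_logger/";      // application path (placeholder)

void make_a_get_request(float cpm, float nsvh, float usvh, float weight, String irradiation_class) {
  if (!client.connect(server, 80)) {                          // open a TCP connection to Apache
    Serial.println("Connection failed!");
    return;
  }
  // Build the query string with the collected data (parameter names are illustrative).
  String query = String(app_path) + "?cpm=" + String(cpm)
               + "&nsvh=" + String(nsvh)
               + "&usvh=" + String(usvh)
               + "&weight=" + String(weight)
               + "&class=" + irradiation_class;
  // Send a plain HTTP/1.1 GET request.
  client.print(String("GET ") + query + " HTTP/1.1\r\n"
             + "Host: " + server + "\r\n"
             + "Connection: close\r\n\r\n");
  // Wait briefly for the response, then print it for debugging.
  unsigned long timeout = millis();
  while (!client.available()) {
    if (millis() - timeout > 5000) {
      Serial.println("Response timeout!");
      client.stop();
      return;
    }
    delay(10);
  }
  while (client.available()) Serial.write(client.read());
  client.stop();
}

void setup() {
  Serial.begin(115200);
  WiFi.begin("<your_ssid>", "<your_password>");               // placeholder credentials
  while (WiFi.status() != WL_CONNECTED) delay(500);
  make_a_get_request(25.0, 120.0, 0.12, 350.0, "class_A");    // dummy readings, placeholder label
}

void loop() {}
```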

☢ The device waits for the Wi-Fi module to connect to the given Wi-Fi network.

☢ Then, the device displays a summary of the collected data on the SSD1309 OLED transparent screen.

☢ The device allows the user to collect visible light (color) data at different angles with the movable handle.

☢ If one of the control buttons (A, B, or C) is pressed, the device transmits the recently collected data by adding the selected food irradiation dose class to the given web application.

☢ Then, if the web application appends the transferred data packet to the MySQL database successfully, the device shows this message on the SSD1309 OLED transparent screen: Data registered successfully!

☢ If Beetle ESP32-C3 throws an error while operating, the device shows the error message on the SSD1309 OLED transparent screen and prints the error details on the serial monitor.

☢ Also, the device prints notifications and sensor measurements on the serial monitor for debugging.

☢ The web application shows the total number of data records for classes (labels) in the database.

☢ If the user clicks the Create Samples button, the web application scales data items and generates a CSV file (sample) for each data record, depending on the selected data type (training or testing).

☢ If the user clicks the Download button, the web application downloads all generated CSV files (samples) in the ZIP file format.

You can inspect my neural network model on Edge Impulse as a public project: https://studio.edgeimpulse.com/public/109647/latest

First of all, sign up for Edge Impulse and create a new project.

Navigate to the Data acquisition page and click the Upload existing data button.

Then, choose the data category (training or testing) and select Infer from filename under Label to deduce labels from file names automatically.

Finally, select files and click the Begin upload button.

Go to the Create impulse page. Then, select the Raw Data block and the Classification learning block. Finally, click Save Impulse.

Before generating features for the model, go to the Raw data page and click Save parameters.

After saving parameters, click Generate features to apply the Raw Data block to training samples.

Finally, navigate to the NN Classifier page and click Start training.

To validate the trained model, go to the Model testing page and click Classify all.

To deploy the validated model as an Arduino library, navigate to the Deployment page and select Arduino library.

Then, choose the Quantized (int8) optimization option to get the best performance possible while running the deployed model.

Finally, click Build to download the model as an Arduino library.

After downloading the model as an Arduino library in the ZIP file format, go to Sketch ➡ Include Library ➡ Add .ZIP Library...

Then, include the IoT_AI-driven_Food_Irradiation_Classifier_inferencing.h file to import the Edge Impulse neural network model.

Include the required libraries.

Define the required parameters to run an inference with the Edge Impulse model. Define the features array (buffer) to classify one frame of data.

Define the threshold value (0.60) for the model outputs (predictions).

Define the food irradiation dose class names:

Define monochrome graphics.

Create an array including icons for each food irradiation dose class.

In the run_inference_to_make_predictions function:

Scale (normalize) the collected data depending on the given model and copy the scaled data items to the features array (buffer).

If required, multiply the scaled data items while copying them to the features array (buffer).

Display the progress of copying data to the features buffer on the serial monitor.

If the features buffer is full, create a signal object from the features buffer (frame).

Then, run the classifier.

Print the inference timings on the serial monitor.

Read the prediction (detection) result for each food irradiation dose class (label).

Print the prediction results on the serial monitor.

Obtain the detection result greater than the given threshold (0.60). It represents the most accurate label (food irradiation dose class) predicted by the model.

Print the detected anomalies on the serial monitor, if any.

Finally, clear the features buffer (frame).
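As a rough sketch of this inference routine, the snippet below wraps the features buffer in a signal and runs the classifier with the standard API of the generated Arduino library (numpy::signal_from_buffer, run_classifier). The buffer contents and the threshold handling are simplified placeholders, not the project's exact code.

```cpp
// Minimal inference sketch for the deployed Edge Impulse Arduino library.
// The header name comes from the downloaded library; everything else is illustrative.
#include <IoT_AI-driven_Food_Irradiation_Classifier_inferencing.h>

static float features[EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE];   // one frame of scaled (0-1) data
static const float threshold = 0.60;                          // minimum accepted confidence

// Run the classifier on the features buffer and return the index of the class
// whose score exceeds the threshold, or -1 if no class qualifies.
int run_inference_to_make_predictions() {
  signal_t signal;
  if (numpy::signal_from_buffer(features, EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE, &signal) != 0) {
    return -1;                                                // failed to wrap the buffer
  }
  ei_impulse_result_t result = { 0 };
  if (run_classifier(&signal, &result, false) != EI_IMPULSE_OK) {
    return -1;                                                // classifier error
  }
  // Print inference timings and per-class predictions for debugging.
  Serial.printf("DSP: %d ms, Classification: %d ms\n",
                result.timing.dsp, result.timing.classification);
  int predicted = -1;
  for (size_t ix = 0; ix < EI_CLASSIFIER_LABEL_COUNT; ix++) {
    Serial.printf("%s: %.5f\n", result.classification[ix].label, result.classification[ix].value);
    if (result.classification[ix].value >= threshold) {
      predicted = ix;                                         // most accurate label above threshold
    }
  }
  return predicted;
}

void setup() {
  Serial.begin(115200);
}

void loop() {
  // On the actual device this runs when control button (B) is pressed and the buffer
  // holds freshly scaled measurements; here the buffer is simply zero-filled.
  int label_index = run_inference_to_make_predictions();
  Serial.printf("Detected class index: %d\n", label_index);
  delay(5000);
}
```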

If the control button (B) is pressed, start running inference with the Edge Impulse model to predict the food irradiation dose level.

Wait until the Edge Impulse model predicts a food irradiation dose level (label) successfully.

Then, display the prediction (detection) result (class) on the SSD1309 OLED transparent screen with its assigned monochrome icon.

Clear the predicted label (class).

Finally, stop the running inference and return to the home screen.

☢ The device displays a summary of the collected data on the SSD1309 OLED transparent screen.

☢ If the control button (B) is pressed, the device runs an inference with the Edge Impulse model by filling the features buffer with the recently collected ionizing radiation, weight, and visible light (color) measurements.

☢ When the device starts filling the features buffer with data items, it shows:

☢ Then, the device displays the detection result, which represents the most accurate label (food irradiation dose class) predicted by the model.

☢ Each food irradiation dose level (class) has a unique monochrome icon to be shown on the SSD1309 OLED transparent screen when being predicted (detected) by the model:

☢ Also, the device prints notifications and sensor measurements on the serial monitor for debugging.

