
Methane Monitoring in Mines - Silabs xG24 Dev Kit

Detecting the presence of methane using a SiLabs xG24 Dev Kit and a gas sensor.


Created By: Zalmotek

Public Project Link: https://studio.edgeimpulse.com/studio/158034

GitHub Repository: https://github.com/Zalmotek/edge-impulse-methane-monitoring-with-silabsxg24

Introduction

Methane is a colorless, odorless gas that is the main component of natural gas. It is also a common by-product of coal mining. When methane is present in high concentrations, it can be explosive. For this reason, methane monitoring is essential for the safety of workers in mines and other workplaces where methane may be present.

There are many different ways to monitor methane levels. Some methods, such as fixed gas monitors, are designed to provide constant readings from a set location. Others, such as personal portable gas monitors, are designed to be carried by individual workers so that they can take immediate action if methane levels rise to dangerous levels.

Most countries have regulations in place that require the monitoring of methane levels in mines and other workplaces. These regulations vary from country to country, but they all have the same goal: to keep workers safe from the dangers of methane gas exposure.

The Challenge

There are many different methane monitoring systems on the market, but choosing the right one for your workplace can be a challenge. There are a few things you should keep in mind when choosing a methane monitor:

  1. The type of work environment: Methane monitors come in a variety of shapes and sizes, each designed for a specific type of work environment. Choose a monitor that matches where it will be deployed: personal portable gas monitors are designed to be worn by individual workers, while fixed gas monitors are designed to be placed in a specific location.

  2. The size of the workplace: The size of the workplace will determine how many methane monitors you will need. For example, a small mine might only require a few fixed gas monitors, while a large mine might require dozens.

  3. The methane concentration: The level of methane present in the workplace will determine how often the methane monitor needs to be used. For example, in a workplace with a high concentration of methane, the monitor may need to be used more frequently than in a workplace with a low concentration of methane.

Choosing the right methane monitor for your workplace can be a challenge, but it is an important part of keeping your workers safe from the dangers of methane gas.

Our Solution

The Silabs EFR32xG24 Dev Kit is the perfect solution for methane monitoring in mines and other workplaces. It features a Machine Learning (ML) hardware accelerator that can be used to develop custom gas detection algorithms. The Silabs EFR32xG24 comes equipped with a pressure sensor, ambient light sensor, and hall-effect sensor, all of which can be used to further customize the monitoring system.

Hardware Requirements

  • Silicon Labs EFR32xG24 Dev Kit

  • MQ-4 gas sensor

  • Micro USB cable

  • 3D printed enclosure

  • Prototyping wires

Software Requirements

  • Edge Impulse account

  • Edge Impulse CLI

  • Simplicity Studio IDE

Hardware Setup

To use the MQ-4 sensor with the Silabs EFR32xG24 Dev Kit, you will need to connect the sensor to the board as follows:

  • Connect the VCC pin on the sensor to the 3.3V pin on the board.

  • Connect the GND pin on the sensor to the GND pin on the board.

  • Connect the A0 pin on the sensor to the A0 pin on the board.

With the hardware set up, you are ready to begin developing your methane monitoring system.
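With the sensor wired to the analog pin, it is worth sanity-checking the raw readings before moving on. A minimal sketch of converting a raw ADC count to a voltage, assuming a 12-bit single-ended conversion against a 3.3 V reference (both values are assumptions; check them against your actual IADC configuration):

```python
def adc_to_voltage(raw, vref=3.3, resolution_bits=12):
    """Convert a raw ADC reading to a voltage.

    Assumes a single-ended conversion against `vref`; adjust both
    parameters to match your IADC configuration.
    """
    full_scale = (1 << resolution_bits) - 1  # 4095 for a 12-bit ADC
    return raw * vref / full_scale

# A mid-scale reading maps to roughly half the reference voltage
print(round(adc_to_voltage(2048), 2))  # 1.65
```

Readings pinned at 0 or at full scale usually indicate a wiring problem rather than an actual gas concentration.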

Software Setup

Creating the Build Environment

In the next screen, you should see your device name:

Afterwards, select Auto in the Package Installation Options menu and click Next.

This code sample reads analog values from the Methane sensor connected to Pin 16 of the dev board, which corresponds to PC05 (Pin 5, Port C). If you want to change the pin, you’ll have to update the following lines of code:

#define IADC_INPUT_0_PORT_PIN     iadcPosInputPortCPin5

#define IADC_INPUT_0_BUS          CDBUSALLOC // ABUSALLOC
#define IADC_INPUT_0_BUSALLOC     GPIO_CDBUSALLOC_CDODD0_ADC0 // GPIO_ABUSALLOC_AODD0_ADC0

Next up, right click on the project name in the Project Explorer menu, click on Run as and select Silicon Labs ARM program. The Device Selection menu will pop up and you’ll have to select your board.

In case this warning pops up, click Yes:

To check out the values printed by the dev board, you can use the Arduino IDE serial monitor or Picocom.

Creating an Edge Impulse Project

To get started, you will need to create an Edge Impulse project. Edge Impulse is a Machine Learning platform that makes it easy to develop custom algorithms for a variety of applications, including methane monitoring.

To create an Edge Impulse project, simply log in or sign up for an account at https://www.edgeimpulse.com/. Once you have an account, click the "Create new project" button on the dashboard.

You will be asked to give your project a name and select a category. For this project, we will be using the "Custom classification" template. Enter a name and description, then click the "Create project" button.

With your Edge Impulse project created, you are ready to begin developing your methane detection algorithm.

Connecting the Device

To populate the data pool, connect the board to the computer using a micro-USB cable, launch a terminal and run:

edge-impulse-data-forwarder

After entering your account credentials (username and password), you will be asked to choose a project to assign the device to.

Edge Impulse data forwarder v1.15.1
? What is your user name or e-mail address (edgeimpulse.com)? zalmotek
? What is your password? [hidden]
Endpoints:
    Websocket: wss://remote-mgmt.edgeimpulse.com
    API:       https://studio.edgeimpulse.com/v1
    Ingestion: https://ingestion.edgeimpulse.com

[SER] Connecting to /dev/ttyACM0
[SER] Serial is connected (00:04:40:25:80:81)
[WS ] Connecting to wss://remote-mgmt.edgeimpulse.com
[WS ] Connected to wss://remote-mgmt.edgeimpulse.com

? To which project do you want to connect this device? (Use arrow keys)

You will then be prompted to name the sensor axis that the edge-impulse-data-forwarder picked up.

? To which project do you want to connect this device? Zalmotek / Methane Monitoring - Silabs EFR32xG24 Dev Kit
[SER] Detecting data frequency...
[SER] Detected data frequency: 1925Hz
? 1 sensor axes detected (example values: [17]). What do you want to call them? Separate the names with ',': Methane

If everything went well, the development board will appear on your project's Devices tab with a green dot next to it, signifying that it is online and prepared for data collection.

Collecting Training Data

The first step in developing a Machine Learning algorithm is to collect training data. This data will be used to train the algorithm so that it can learn to recognize the patterns that indicate the presence of methane gas.

To collect training data, you will need to use the methane sensor to take readings in a variety of conditions, both with and without methane gas present. For each reading, you will need to take note of the concentration of methane present, as well as the ambient temperature and humidity.

It is important to collect a variety of data points, as this will give the algorithm a better chance of learning to recognize the patterns that indicate the presence of methane gas. Try to take readings in different places, at different times of day, and in different weather conditions. If possible, it is also helpful to take readings with different people so that the algorithm can learn to recognize the patterns that are specific to each situation.

Now, you will need to configure the sensor settings. For this project, we will be using the following settings:

  • Sensor: MQ-4

  • Board sampling rate: 1925 Hz

  • Data recording length: 10 seconds

With these settings configured, click the "Start sampling" button to begin collecting data. The data will be automatically stored in the Edge Impulse cloud and can be used to train your Machine Learning algorithm.

Machine learning algorithms need to be trained on data that is representative of the real-world data they will encounter when deployed on the edge. For this reason, it is important to split the data into training and testing sets: the first is used to train the neural network, and the second is used to evaluate the performance of the trained model.
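The split can be sketched as follows. Edge Impulse Studio performs this split for you automatically; the 80/20 ratio and the fixed shuffle seed below are illustrative choices, not values mandated by the platform:

```python
import random

def train_test_split(samples, test_fraction=0.2, seed=42):
    """Shuffle a list of labeled samples and split it into train/test sets."""
    rng = random.Random(seed)
    shuffled = samples[:]
    rng.shuffle(shuffled)
    split = int(len(shuffled) * (1 - test_fraction))
    return shuffled[:split], shuffled[split:]

# Hypothetical labeled recordings, just to demonstrate the split
samples = [("reading_%d" % i, "methane" if i % 2 else "background") for i in range(100)]
train, test = train_test_split(samples)
print(len(train), len(test))  # 80 20
```

Shuffling before splitting matters: recordings taken back-to-back tend to be correlated, and an unshuffled split would leak that structure into the evaluation.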

Designing the Impulse

With the training data collected, you are ready to begin designing your methane detection algorithm. In Edge Impulse, algorithms are designed using a drag-and-drop interface called the "Impulse designer".

To access the impulse designer, click the "Design impulse" button on your project's dashboard. You will be presented with a list of available blocks that can be used to build your algorithm. For this project, we will be using the following blocks:

Configuring the Spectral Analysis Block

After you click “Save impulse,” you will notice that each block may be configured by clicking on its name under the “Impulse Design” submenu. The Spectral Analysis block is one of the simplest processing blocks since it only has a few adjustable parameters. On the upper part of the screen, you can see a time-domain representation of the sample that was selected.
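Conceptually, the Spectral Analysis block turns each time-domain window into frequency-domain features. The block's actual feature set is configurable in Studio; the sketch below (a naive DFT plus RMS and the dominant-frequency bin) is only a simplified stand-in for the idea:

```python
import cmath, math

def dft_magnitudes(signal):
    """Naive DFT magnitude spectrum (first half of the bins)."""
    n = len(signal)
    mags = []
    for k in range(n // 2):
        s = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        mags.append(abs(s) / n)
    return mags

def spectral_features(signal):
    """RMS plus the index of the dominant frequency bin; a simplified
    stand-in for the features the processing block computes."""
    rms = math.sqrt(sum(x * x for x in signal) / len(signal))
    mags = dft_magnitudes(signal)
    peak_bin = max(range(1, len(mags)), key=lambda k: mags[k])  # skip the DC bin
    return rms, peak_bin

# A sine with 5 cycles over a 64-sample window should peak in bin 5
sig = [math.sin(2 * math.pi * 5 * t / 64) for t in range(64)]
rms, peak = spectral_features(sig)
print(peak)  # 5
```

Production pipelines use an FFT rather than this O(n²) DFT, but the output is the same: a compact set of numbers per window that the classifier learns from.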

Configure the Classification (Keras) Block

So, how does a neural network know what predictions to make? The answer lies in its many layers, where each layer is connected to another through neurons. At the beginning of the training process, connection weights between neurons are randomly determined.

A neural network is designed to predict a set of results from a given set of data, which we call training data. This works by first presenting the network with the training data, and then checking its output against the correct answer. Based on how accurate the prediction was, the connection weights between neurons are adjusted. We repeat this process multiple times until predictions for new inputs become more and more accurate.
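The loop described above (present data, measure the error, adjust the weights) can be illustrated with a single-weight model trained by gradient descent. This is a toy sketch to show the mechanics, not the network architecture Edge Impulse generates:

```python
# Toy example: learn y = 2x with a single weight via gradient descent.
data = [(x, 2.0 * x) for x in range(1, 6)]
w = 0.0             # randomly initialized in a real network; 0 here for determinism
learning_rate = 0.01

for epoch in range(200):        # each full pass over the data is one epoch
    for x, y in data:
        pred = w * x            # forward pass: make a prediction
        error = pred - y        # compare against the correct answer
        w -= learning_rate * error * x  # adjust the weight against the error

print(round(w, 3))  # 2.0
```

The weight converges toward 2.0 because every update nudges it in the direction that reduces the prediction error, which is exactly what back-propagation does across the many weights of a real network.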

When configuring this block, there are multiple parameters that can be modified:

The number of training cycles is the total number of epochs the network is trained for. An epoch, or training cycle, is one complete pass of the training algorithm through all of the learning data, applying back-propagation and adjusting the model's parameters as it goes (Figure 1).

The learning rate controls how much the model's internal parameters are updated during each step of the training process, or in other words, how quickly the neural network will learn. If the network overfits too rapidly, you can lower the learning rate.

Auto-balance dataset mixes in more copies of data from classes that are uncommon. This function might help make the model more robust against overfitting if you have little data for some classes.
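The effect of auto-balancing can be sketched as simple oversampling: rare classes are duplicated until every label is equally represented. How Studio implements it internally is not documented here, so treat this as a conceptual illustration only:

```python
import random
from collections import Counter

def auto_balance(samples, seed=0):
    """Oversample minority classes so every label appears equally often."""
    rng = random.Random(seed)
    by_label = {}
    for features, label in samples:
        by_label.setdefault(label, []).append((features, label))
    target = max(len(items) for items in by_label.values())
    balanced = []
    for label, items in by_label.items():
        balanced.extend(items)
        # mix in extra copies of under-represented classes
        balanced.extend(rng.choices(items, k=target - len(items)))
    return balanced

# Hypothetical imbalanced dataset: 90 background windows, 10 methane windows
samples = [([i], "background") for i in range(90)] + [([i], "methane") for i in range(10)]
counts = Counter(label for _, label in auto_balance(samples))
print(counts["background"], counts["methane"])  # 90 90
```

Without balancing, a model trained on this data could score 90% accuracy by always predicting "background", which is useless for an alarm system.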

Conclusion

Even though methane leaks in mining operations are a well-known problem with a variety of market-available solutions, conventional threshold-based detection systems have some drawbacks, including the need for remote data processing and the challenges of ensuring connectivity in mining-specific environments.

Edge AI technologies allow users to circumvent those weak points by embedding the whole signal acquisition, data processing and decision making on a single board that can run independently from any wireless network and which can raise an alarm if dangerous trends in the methane levels in the facility are detected.

To shield the electronic system from the harsh environment specific to the mining industry, we have designed and 3D printed a case that exposes only the transducer of the methane sensor and offers the possibility of mounting the ensemble on a hard surface. The files for the enclosure can be found here.

The EFR32xG24 Dev Kit has an on-board USB J-Link Debugger, so you'll need to install the J-Link Software and Documentation pack as well as the Simplicity Studio IDE to be able to program it. Connect the board to your computer using the micro-USB cable and run the Simplicity Studio installer. In the Installation Manager, choose Install by connecting device(s).

Now that you have the build environment set up, download the data collection project from here and open it with Simplicity Studio via File -> Import project.
