Workplace Safety: Fire Detection through Machine Vision

Annepu Vivek, November 4, 2025

Why Real-Time Fire Detection Matters in High-Risk Industries

In high-risk industries like oil and gas, real-time fire detection is critical for operational safety and compliance. The AcuPrism Vision model is an AI-based Machine Vision (AI-MV) solution designed to detect fire hazards instantly using deep learning. It combines visual fire detection using deep learning with broader machine vision-based fire and smoke detection techniques. 

To achieve low-latency detection, the solution uses edge computing powered by NVIDIA Jetson devices, ensuring inference occurs close to the data source rather than relying on cloud processing. 

This blog explores the end-to-end development and operational pipeline of AcuPrism Vision, highlighting: 

  • Model training and versioning in the cloud (Databricks, MLflow) 
  • Edge deployment on NVIDIA Jetson using Azure IoT Edge 
  • Integration with Azure Blob Storage, Kafka, and Power Platform services for scalable, automated fire detection 

Edge Capabilities

The real-time nature of fire detection makes edge computing essential. In this solution: 

  • The YOLOv8 fire detection model, trained in Databricks, is containerized and deployed on NVIDIA Jetson devices.  
  • Azure IoT Edge manages deployment, monitoring, and updates of the containerized model on edge devices. 
  • A Kafka-based producer–consumer pipeline runs on the Jetson VM: 
    • Producer: Captures live video frames from an RTSP camera feed, encodes frames in Base64, and streams them to Kafka topics. 
    • Consumer: Consumes frames, runs YOLOv8 inference locally on the Jetson GPU, and sends detection results to Azure services for alerting and logging. 

    This approach ensures sub-second latency, reliable detection, and seamless integration with the cloud for analytics and monitoring, enabling real-time fire detection through Machine Vision. A minimal producer sketch follows. 
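
The sketch below illustrates the producer side under stated assumptions: the RTSP URL, Kafka broker address, topic name, and camera ID are placeholders, and it uses OpenCV plus the kafka-python client as generic stand-ins rather than the team's exact module.

```python
# Illustrative producer sketch (hypothetical endpoints and topic names).
# Captures frames from an RTSP feed, Base64-encodes them as JPEG,
# and publishes them to a Kafka topic for the edge consumer.
import base64
import json
import time

import cv2
from kafka import KafkaProducer

RTSP_URL = "rtsp://camera-01.plant.local/stream"   # assumption: example camera URL
KAFKA_BROKER = "localhost:9092"                    # assumption: broker running on the Jetson VM
TOPIC = "fire-frames"                              # assumption: example topic name

producer = KafkaProducer(
    bootstrap_servers=KAFKA_BROKER,
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

cap = cv2.VideoCapture(RTSP_URL)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # Encode the frame as JPEG, then Base64 so it can travel as JSON text.
    _, jpeg = cv2.imencode(".jpg", frame)
    payload = {
        "camera_id": "cam-01",                      # assumption: example camera ID
        "timestamp": time.time(),
        "frame_b64": base64.b64encode(jpeg.tobytes()).decode("ascii"),
    }
    producer.send(TOPIC, payload)

cap.release()
producer.flush()
```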

System Overview: Cloud Training + Edge Inference for Real-Time Detection

The AcuPrism Vision model is an advanced AI-based Machine Vision (AI-MV) solution designed to enhance safety in industrial environments by detecting fire incidents in real time. The solution uses a hybrid architecture, combining cloud-based model development with edge-based inference, to ensure ultra-low latency and scalability. It demonstrates fire detection through Machine Vision across diverse operating conditions. 

Key Highlights

  • Cloud Training (Databricks Environment):  
    The fire detection model is developed, trained, and versioned in Databricks using YOLOv8 for accurate object detection of fire and smoke patterns.  
  • Edge Deployment (NVIDIA Jetson + Azure IoT Edge):  
    The trained model is containerized and deployed on NVIDIA Jetson devices, enabling real-time inference at the edge with Azure IoT Edge for orchestration and updates.  
  • AI-Powered Object Detection:  
    The system continuously monitors critical industrial areas, detecting fire or smoke early to reduce environmental risks, ensure compliance, and improve workplace safety.  
  • Seamless Device Integration:  
    Works with CCTV cameras, dedicated IP cameras, drones, and robotic inspection devices for automated detection and rapid response.  
  • Data Streaming with Kafka:  
    A Kafka-based producer–consumer pipeline handles real-time video stream processing on the edge (a consumer sketch follows this list):  
    • Producer: Captures live video frames from RTSP cameras and streams them via Kafka.  
    • Consumer: Runs inference locally on Jetson and sends detection results to Azure Blob Storage, triggering alerts.
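
Here is a minimal consumer sketch under similar assumptions: the broker, topic, and weights file name (best.pt) are placeholders, and the Ultralytics YOLOv8 API with kafka-python stands in for the actual edge module.

```python
# Illustrative consumer sketch (hypothetical broker/topic/weight names).
# Consumes Base64 frames from Kafka, runs YOLOv8 inference on the Jetson GPU,
# and collects fire/smoke detections for downstream alerting.
import base64
import json

import cv2
import numpy as np
from kafka import KafkaConsumer
from ultralytics import YOLO

model = YOLO("best.pt")  # assumption: trained fire/smoke weights exported from Databricks

consumer = KafkaConsumer(
    "fire-frames",                               # assumption: same topic as the producer
    bootstrap_servers="localhost:9092",          # assumption: local broker on the Jetson VM
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for message in consumer:
    payload = message.value
    # Decode the Base64 JPEG back into an OpenCV image.
    jpeg = base64.b64decode(payload["frame_b64"])
    frame = cv2.imdecode(np.frombuffer(jpeg, dtype=np.uint8), cv2.IMREAD_COLOR)

    # Run inference; device=0 targets the Jetson GPU.
    results = model(frame, device=0, verbose=False)
    for box in results[0].boxes:
        detection = {
            "camera_id": payload["camera_id"],
            "timestamp": payload["timestamp"],
            "label": model.names[int(box.cls)],
            "confidence": float(box.conf),
            "bbox_xyxy": [float(x) for x in box.xyxy[0]],
        }
        # In the full pipeline this record is forwarded to Azure for alerting and logging.
        print(detection)
```

Running inference with device=0 keeps the work on the Jetson GPU, which is what makes sub-second detection at the edge feasible.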
       

End-to-End Workflow for Fire Detection through Machine Vision

The AcuPrism Vision architecture is designed as a hybrid system, leveraging Databricks for cloud-based model development and training, and NVIDIA Jetson edge devices for real-time inference. This approach ensures scalability, low latency, and continuous monitoring. 

AcuPrism Architecture: Cloud–Edge Design for Industrial Fire Detection

The architecture includes two layers: 

  • Cloud (Databricks): For model training, hyperparameter tuning, and versioning. 
  • Edge (NVIDIA Jetson + Azure IoT Edge): For real-time fire detection near the data source, ensuring sub-second response times. 

Implementation Stages: Data, Training, Packaging, Edge Deployment

  1. Data Processing and Feature Engineering (Cloud) 
    Raw video frames are captured from CCTV cameras, RTSP streams, or drones. Data is sent to Databricks for preprocessing, augmentation, and feature engineering to prepare for model training. 
  2. Model Training and Testing (Cloud) 
    The YOLOv8 model is trained on annotated fire and smoke datasets in Databricks. This includes hyperparameter tuning, data augmentation, and cross-validation for accuracy.  
  3. Model Registration and Deployment 
    The trained model is registered in MLflow and stored in Azure Blob Storage for version control. A containerized version of the model is created for edge deployment. 
  4. Edge Deployment (Operational Layer) 
    The containerized YOLOv8 model is deployed to NVIDIA Jetson devices using Azure IoT Edge. Real-time inference happens on the edge, ensuring ultra-low latency fire detection. 
    A Kafka-based producer–consumer pipeline runs on the Jetson device: 
    • Producer: Captures RTSP video frames, encodes in Base64, and sends via Kafka. 
    • Consumer: Consumes frames, runs YOLO inference on the Jetson GPU, and publishes detection results. 
  5. Azure Function App 
    Validates Kafka credentials provided by Power Apps and handles secure message flow between Jetson and Azure cloud for logging and automation. 
  6. Power Apps 
    Allows authorized users to manage and configure Kafka credentials and initiate workflows for edge-device operations. 
  7. Azure Blob Storage 
    Stores detection results, metadata, and logs for historical analysis, including timestamps, detection confidence, and model version details (an upload sketch follows this list). 
  8. Power Automate 
    Triggers workflows when new detection data is added to Blob Storage, updates SharePoint lists, sends alerts to Teams and Outlook, and creates incident reports automatically. 
  9. Power BI 
    Provides near real-time dashboards showing detection insights, trends, and analytics. These dashboards are not for immediate alerting but are essential for monitoring and decision-making. 
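
To make step 7 concrete, below is a minimal sketch of uploading a detection record and its image evidence with the azure-storage-blob SDK; the container name, connection-string environment variable, blob naming convention, and model version tag are assumptions, not the production configuration.

```python
# Illustrative upload sketch (hypothetical container and naming conventions).
# Persists a detection record plus JPEG evidence to Azure Blob Storage,
# where its arrival triggers the Power Automate flow described in step 8.
import json
import os
from datetime import datetime, timezone

from azure.storage.blob import BlobServiceClient

CONTAINER = "fire-detections"  # assumption: example container name

service = BlobServiceClient.from_connection_string(os.environ["AZURE_STORAGE_CONNECTION_STRING"])
container = service.get_container_client(CONTAINER)

def upload_detection(detection: dict, jpeg_bytes: bytes) -> None:
    """Write one detection (metadata JSON + image evidence) to Blob Storage."""
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%S")
    base = f"{detection['camera_id']}/{stamp}"

    record = {
        **detection,
        "model_version": "yolov8-fire-v1",   # assumption: version tag taken from MLflow
        "uploaded_at": datetime.now(timezone.utc).isoformat(),
    }
    container.upload_blob(f"{base}.json", json.dumps(record), overwrite=True)
    container.upload_blob(f"{base}.jpg", jpeg_bytes, overwrite=True)
```

Because Power Automate is triggered on new blobs, landing both the JSON record and the JPEG evidence in the same container is enough to kick off the alerting workflow described in step 8.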

Cloud–Edge Architecture for Real-Time Fire & Smoke Detection

In this section, we take a closer look at both the development and operational data flows of the AcuPrism Vision system. These flows describe how data moves through the various stages, from initial data collection and model training during development to real-time video processing and fire detection during operation. By examining these flows, we can see how the components interact, how data is transformed at each step, and how the system maintains a seamless, automated workflow that delivers accurate and timely fire incident detection in industrial environments. 

Architecture Diagram & Data Flow (Grouped Object Workflow)

Pipeline Details: From Ingestion to Edge Inference

The end-to-end pipeline for the AI-based Machine Vision (AI-MV) fire detection solution is designed for hybrid execution, where model development and training happen in the cloud (Databricks), and real-time inference happens on the edge (NVIDIA Jetson with Azure IoT Edge). 

Pipeline Setup and Job Execution

The pipeline automates the entire process from data ingestion and model training to edge deployment and real-time detection. It consists of the following stages: 

  1. Data Loader (Cloud – Databricks) 
    Captures raw video frames and images from industrial environments. 
    Performs data cleaning, augmentation, and feature engineering to ensure model quality. 
  2. Model Training (Cloud – Databricks) 
    Trains the YOLOv8 model on annotated fire and smoke datasets. 
    Uses advanced techniques such as: 
    • Hyperparameter tuning (learning rates, anchor sizes, augmentation) 
    • Cross-validation (k-fold for robust evaluation) 
    • Automation: Training and validation pipelines automated using Databricks Jobs 
  3. Model Registration & Storage 
    Registers the model with MLflow for version control. 
    Stores model checkpoint files (e.g., model.pt) in Azure Blob Storage for deployment (a training-and-tracking sketch follows this list). 
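
The sketch below condenses stages 2 and 3 under stated assumptions: the experiment path, dataset YAML, hyperparameter values, and checkpoint path are illustrative placeholders, and the Ultralytics and MLflow APIs stand in for the team's actual Databricks Jobs.

```python
# Illustrative Databricks training sketch (hypothetical dataset YAML and hyperparameters).
# Trains YOLOv8 on the annotated fire/smoke dataset and records the run in MLflow.
import mlflow
from ultralytics import YOLO

mlflow.set_experiment("/Shared/fire-detection")   # assumption: example experiment path

params = {"epochs": 100, "imgsz": 640, "batch": 16, "lr0": 0.01}  # assumption: example values

with mlflow.start_run(run_name="yolov8-fire"):
    mlflow.log_params(params)

    model = YOLO("yolov8n.pt")                               # start from pretrained weights
    model.train(data="fire_smoke.yaml", **params)            # assumption: dataset config name

    # Log the best checkpoint so it can be registered and pushed to Azure Blob Storage
    # for containerization and edge rollout (default Ultralytics output path assumed).
    mlflow.log_artifact("runs/detect/train/weights/best.pt")
```

In practice, the logged best.pt checkpoint is then registered in the MLflow Model Registry and copied to Azure Blob Storage so the edge container build can pick it up.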

Edge Deployment (Operational Layer)

Unlike traditional cloud-only solutions, operational inference happens on NVIDIA Jetson devices using Azure IoT Edge for container management. The trained YOLOv8 model is containerized and deployed on Jetson for real-time detection. A Kafka-based producer–consumer pipeline runs on the Jetson device: 

  • Producer: Captures live frames from RTSP streams and publishes them to Kafka. 
  • Consumer: Consumes frames locally, performs inference on the Jetson GPU, and sends detection results (fire alerts, bounding boxes, confidence scores) to Azure services.


This setup ensures ultra-low latency, making it ideal for safety-critical environments that rely on fire and smoke detection through Machine Vision and other machine vision-based fire detection techniques. One possible pattern for handing results off through Azure IoT Edge is sketched below. 
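
One way the consumer module could publish its results through the IoT Edge runtime is sketched here; the output route name, message fields, and use of the azure-iot-device module client are assumptions rather than the documented implementation.

```python
# Illustrative IoT Edge module sketch (hypothetical output route and message fields).
# Sends a detection result upstream through the IoT Edge runtime so Azure-side
# routing can forward it for alerting and logging.
import json

from azure.iot.device import IoTHubModuleClient, Message

client = IoTHubModuleClient.create_from_edge_environment()
client.connect()

def publish_detection(detection: dict) -> None:
    """Forward one detection through the edge hub's 'detections' output route."""
    msg = Message(json.dumps(detection))
    msg.content_type = "application/json"
    msg.content_encoding = "utf-8"
    client.send_message_to_output(msg, "detections")  # assumption: example route name
```

Routing through the edge hub also lets the runtime buffer messages locally during connectivity drops, consistent with the intermittent-link behaviour noted in the FAQs.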

Scalable Cloud Integration

While real-time detection runs at the edge, the cloud provides: 

  • Centralized Model Management: Updates and versioning via Databricks and MLflow. 
  • AKS for Scaling APIs (Optional): If needed, Databricks + AKS can serve inference for batch processing or as a backup when edge devices are offline.
     

Real-Time Event Processing & Alerting

Fire Detection Data Flow with Power Automate 
The flow is triggered when detection results are uploaded to Azure Blob Storage and automatically: 

  • Generates a secure SAS link for detection evidence (an equivalent sketch follows this list) 
  • Creates a log entry in SharePoint 
  • Sends real-time alerts via Microsoft Teams and Outlook 
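
Power Automate builds the SAS link with its built-in Blob Storage connector; purely for reference, an equivalent Python sketch using generate_blob_sas is shown below, with the storage account, container, blob name, key handling, and 24-hour expiry all chosen as illustrative assumptions.

```python
# Illustrative sketch of equivalent SAS-link generation in Python
# (in the actual flow, Power Automate's Blob Storage connector does this).
from datetime import datetime, timedelta, timezone

from azure.storage.blob import BlobSasPermissions, generate_blob_sas

ACCOUNT = "acuprismstorage"          # assumption: example storage account
CONTAINER = "fire-detections"        # assumption: example container
BLOB = "cam-01/20250101T120000.jpg"  # assumption: example evidence blob

sas = generate_blob_sas(
    account_name=ACCOUNT,
    container_name=CONTAINER,
    blob_name=BLOB,
    account_key="<storage-account-key>",              # assumption: key injected securely
    permission=BlobSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=24),
)
evidence_url = f"https://{ACCOUNT}.blob.core.windows.net/{CONTAINER}/{BLOB}?{sas}"
```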

Power BI Dashboards 
Displays near real-time analytics for fire incidents from Blob Storage. These dashboards support operational reporting and decision-making rather than instant alerting. 


Microsoft Teams & Outlook Alerts 
Instant notifications with image evidence ensure rapid response without continuous monitoring of dashboards. 

Business Impact and Results

Detecting safety violations

Conclusion

This solution combines: 

  • Databricks (Cloud) for training, versioning, and automation 
  • Azure IoT Edge + NVIDIA Jetson (Edge) for real-time inference 
  • Kafka for high-speed video streaming 
  • Power Automate, Teams, Outlook, and Power BI for alerts and insights 

By leveraging a hybrid edge-cloud architecture, it delivers low-latency detection, automated workflows, and scalable monitoring, ensuring maximum workplace safety through real-time fire detection through Machine Vision. 

Fire Detection through Machine Vision - FAQs

Does the solution work when connectivity to the cloud is slow or intermittent? 
Yes. Inference runs on the edge (NVIDIA Jetson), so only detection metadata and evidence are sent to the cloud. RTSP feeds stay local, Kafka handles streaming on-device, and Azure IoT Edge manages the model container. This preserves real-time fire and smoke detection in industries even when links are slow or intermittent. 

Can we keep using our existing CCTV and IP cameras? 
Yes. RTSP frames go to Jetson, where a containerized YOLOv8 model trained in Databricks and versioned with MLflow performs fire detection from image and video in real time. The Kafka producer streams frames; the consumer runs GPU inference and pushes detections to Azure for alerting and logging.

How is the model rolled out and updated across multiple plants? 
Train and tune in Databricks, register in MLflow, store artifacts in Azure Blob Storage, containerize, and roll out via Azure IoT Edge to each plant's Jetson device. IoT Edge also monitors and updates models, scaling machine vision-based fire detection techniques consistently across factories.

What happens when a fire or smoke detection occurs? 
When detections land in Azure Blob Storage, Power Automate creates a secure SAS evidence link, logs the incident in SharePoint, and sends instant alerts via Microsoft Teams and Outlook. Power BI dashboards provide near real-time trends for incident review and decisions.

Is edge inference fast enough for safety-critical zones? 
Yes. Edge inference on Jetson delivers sub-second detection and alerts right at turbines, switchyards, or control rooms, with no round trip to the cloud needed. This makes it ideal for safety-critical zones that require immediate action.

What factors affect detection accuracy, and how are they mitigated? 
Lighting, occlusions, weather, and camera angles. Mitigate with good placement and illumination, robust data augmentation, hyperparameter tuning and cross-validation in Databricks, continuous MLflow versioning, and edge inference. These practices keep AI-powered fire and smoke detection in industries reliable day-to-day.