This project focuses on monitoring social distancing through a cutting-edge image processing and computer vision model. Leveraging YOLOv4-based Deep Neural Network technology, the model achieves exceptional accuracy in detecting human interactions in surveillance camera footage.
-
High Detection Accuracy: The model achieves a remarkable 92% detection rate for humans, providing precise monitoring capabilities.
-
Dynamic Risk Assessment: An implemented dynamic risk assessment algorithm identifies infection zones with a precision of 90%, contributing significantly to public health efforts.
-
Social Distance Quantification: A frequency analysis module quantifies social distance violations on city lanes with an impressive 98% accuracy rate.
-
Real-time Monitoring: Positive user feedback highlights an 80% improvement in real-time monitoring, enhancing the model's effectiveness in detecting potential health risks promptly.
-
Enhanced Model Interoperability: The model's interoperability is enhanced through detailed visualizations of infection zones, aiding city planners and health officials in better decision-making.
-
Scalability Demonstration: The model demonstrates scalability by processing camera feeds in real-time, impacting the safety and well-being of over 1000 residents in the city.