Aim:
The main objective of the project is to assist blind people, who cannot see objects in their surroundings, by informing them about those objects. Furthermore, there is a need to develop a tracking system through which family members of visually impaired persons (VIPs) can monitor their movement.
Abstract:
Visually impaired persons (VIPs) comprise a significant portion of the population and are present in every part of the world. In recent times, technology has proved its presence in every domain, and innovative devices assist humans in their daily lives. In this work, a smart and intelligent system is designed for VIPs to assist their mobility and ensure their safety. The proposed system provides real-time navigation using an automated voice. Although VIPs cannot see objects in their surroundings, they can sense and visualize the environment they move through. Moreover, a web-based application is developed to ensure their safety. The user of this application can turn on an on-demand function for sharing his/her location with the family without compromising privacy. Through this application, the family members of VIPs are able to track their movement (get location and snapshots) while remaining at home. Hence, the device allows VIPs to visualize the environment and ensures their security. Such a comprehensive device was a missing link in the existing literature. The application uses the MobileNet architecture for object detection and recognition because its low computational complexity allows it to run on low-power end devices. To assess the efficacy of the proposed system, six pilot studies have been performed, which reflected satisfactory results.
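As a rough illustration of running MobileNet-based detection on a low-power device, the minimal Python sketch below passes one image through a MobileNet-SSD model via OpenCV's DNN module. The model file names, class list, test image, and confidence threshold are assumptions for illustration only, not part of the original design.

import cv2

# Illustrative file names: any MobileNet-SSD model exported for OpenCV's DNN module works here.
PROTOTXT = "MobileNetSSD_deploy.prototxt"
WEIGHTS = "MobileNetSSD_deploy.caffemodel"
CLASSES = ["background", "aeroplane", "bicycle", "bird", "boat", "bottle", "bus",
           "car", "cat", "chair", "cow", "diningtable", "dog", "horse", "motorbike",
           "person", "pottedplant", "sheep", "sofa", "train", "tvmonitor"]

net = cv2.dnn.readNetFromCaffe(PROTOTXT, WEIGHTS)

def detect_objects(frame, conf_threshold=0.5):
    # Resize to the 300x300 input expected by MobileNet-SSD and normalise the pixels.
    blob = cv2.dnn.blobFromImage(cv2.resize(frame, (300, 300)), 0.007843, (300, 300), 127.5)
    net.setInput(blob)
    detections = net.forward()          # shape: (1, 1, N, 7)
    found = []
    for i in range(detections.shape[2]):
        confidence = float(detections[0, 0, i, 2])
        if confidence >= conf_threshold:
            class_id = int(detections[0, 0, i, 1])
            found.append((CLASSES[class_id], confidence))
    return found

frame = cv2.imread("street.jpg")        # illustrative test image
print(detect_objects(frame))

The returned labels can then be converted to speech, which is what makes such a lightweight model suitable for on-device, real-time audio guidance.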
Existing System:
The existing system provides object detection, a GSM-based emergency call, and guardian monitoring features. It requires separate microcontrollers for different operations, such as an ESP32 module for object detection and additional hardware for the other functions.
Proposed System:
The proposed system can classify objects using a pre-trained YOLO model, perform traffic light sign recognition and human detection with the YOLO algorithm, and provide voice feedback. An Android mobile application offers emergency notification, location sharing, and image sharing. A GPS module is used for location tracking, with the location stored in a Google Sheet, while an ESP32-CAM captures images that are stored in Firebase. All functions are based on audio guidance delivered through the user's own mobile phone, which reduces misheard or misunderstood information compared with an external audio device.
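A minimal sketch of the detection-plus-voice-feedback loop described above is shown below. It assumes the Ultralytics YOLO package and the offline pyttsx3 text-to-speech engine; the weights file, confidence threshold, and spoken phrasing are illustrative, and the GPS/Google Sheet logging and ESP32-CAM/Firebase upload paths are not shown.

import cv2
import pyttsx3                      # offline text-to-speech for the audio guidance
from ultralytics import YOLO        # assumption: the pre-trained YOLO model is run via the ultralytics package

model = YOLO("yolov8n.pt")          # illustrative weights; any pre-trained YOLO model can be substituted
engine = pyttsx3.init()
camera = cv2.VideoCapture(0)        # camera feeding the detector

while True:
    ok, frame = camera.read()
    if not ok:
        break
    result = model(frame, verbose=False)[0]
    # Names of confidently detected objects (person, traffic light, car, ...).
    labels = {model.names[int(box.cls)] for box in result.boxes if float(box.conf) > 0.5}
    if labels:
        engine.say("Ahead: " + ", ".join(sorted(labels)))
        engine.runAndWait()         # finish speaking before processing the next frame

camera.release()

In the full system, each detection cycle would also be paired with the GPS fix written to the Google Sheet and the ESP32-CAM snapshot uploaded to Firebase, so that family members can review both location and images remotely.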