AIM:
The aim of this project is to design and develop a wheelchair that a patient can steer through eye movements, captured by a camera and used to drive the wheelchair.
Introduction:
In recent years, advances in computer vision and embedded systems have paved the way for innovative assistive technologies, significantly improving the quality of life for individuals with severe motor disabilities. This project introduces an eye-controlled wheelchair system that leverages the Mediapipe and OpenCV libraries together with the processing power of a Raspberry Pi running Raspbian OS. Using a camera to capture real-time video of the user's face, the system detects and analyzes eye movements to determine directional commands (left, right, and forward), enabling hands-free control of the wheelchair. DC motors provide responsive movement, an LCD screen gives real-time status updates, and an ultrasonic sensor adds a critical layer of safety by detecting obstacles and preventing collisions. The core of the system lies in its ability to segment and analyze the eye region, using the distribution of black (pupil and iris) pixels to infer gaze direction. This approach not only enhances mobility for users but also emphasizes accessibility and ease of use, showcasing the potential of combining computer vision with practical, real-world assistive devices. The project represents a significant step towards greater independence for individuals with disabilities and demonstrates the impact of modern technology on everyday life.
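As a rough illustration of the gaze-inference step, the following Python sketch splits a grayscale eye region into three vertical strips and compares the number of dark pixels in each; the strip containing most of the pupil indicates the looking direction. The function name, threshold value, and placeholder eye crop are assumptions made for illustration, not the project's actual code; in the full system the eye region would be located from Mediapipe Face Mesh landmarks.

    # Illustrative sketch: infer gaze direction from a cropped grayscale eye image
    # by thresholding it and comparing dark-pixel counts in three vertical strips.
    # Names and the threshold value are assumed, not taken from the project code.
    import cv2
    import numpy as np

    def estimate_gaze(eye_gray, dark_thresh=60):
        """Return 'LEFT', 'FORWARD', or 'RIGHT' for a grayscale eye region."""
        # Binarize: dark pupil/iris pixels become white (255) after inversion
        _, binary = cv2.threshold(eye_gray, dark_thresh, 255, cv2.THRESH_BINARY_INV)

        h, w = binary.shape
        third = w // 3
        left_part = binary[:, :third]
        mid_part = binary[:, third:2 * third]
        right_part = binary[:, 2 * third:]

        # Count the formerly dark pixels in each strip
        counts = [cv2.countNonZero(left_part),
                  cv2.countNonZero(mid_part),
                  cv2.countNonZero(right_part)]

        # The strip holding most of the iris indicates where the user is looking
        return ("LEFT", "FORWARD", "RIGHT")[int(np.argmax(counts))]

    if __name__ == "__main__":
        # Example with one webcam frame; the eye crop below is a placeholder,
        # whereas the real system would derive it from Mediapipe landmarks.
        cap = cv2.VideoCapture(0)
        ok, frame = cap.read()
        cap.release()
        if ok:
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            eye_roi = gray[100:140, 200:280]
            print(estimate_gaze(eye_roi))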
Existing system:
Existing wheelchair control systems rely on manual input devices such as joysticks and sip-and-puff mechanisms, which pose challenges for users with severe motor impairments. Alternatives such as head-motion tracking and voice control offer more accessibility but are hindered by environmental noise and the need for clear articulation. Eye-tracking systems based on infrared sensors require expensive hardware and can be sensitive to lighting and calibration. These systems also often lack integrated safety features such as obstacle detection, making them less practical. This project aims to provide a cost-effective, user-friendly solution that combines eye-movement tracking with built-in obstacle detection for enhanced usability and safety.
Proposed System:
The proposed system revolutionizes wheelchair control by integrating eye tracking with real-time processing on a Raspberry Pi running Raspbian OS. Using the Mediapipe and OpenCV libraries, the system captures and analyzes eye movements through a camera and translates them into directional commands (left, right, and forward), enabling intuitive hands-free operation of the wheelchair. DC motors execute these commands, providing precise movement control. To enhance safety, an ultrasonic sensor continuously monitors for obstacles and automatically halts the wheelchair to prevent collisions. An LCD screen displays real-time updates on movement direction and obstacle detection, ensuring clear communication with the user. Eye position is determined by segmenting the eye image into three parts and analyzing the distribution of black pixels to assess gaze direction, allowing accurate tracking under varied lighting conditions with minimal calibration. This approach addresses the limitations of existing systems, which often rely on manual input devices or expensive, specialized hardware. By offering a cost-effective, user-friendly, and integrated solution, the proposed system significantly improves mobility and independence for individuals with severe motor disabilities, demonstrating the potential of modern computer vision technology in assistive devices.
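The obstacle-detection safety layer can be sketched in Python as below, assuming an HC-SR04-style ultrasonic sensor and a motor driver wired to the Raspberry Pi's GPIO pins; the pin numbers, the 30 cm safety threshold, and the stop_motors() helper are hypothetical choices for illustration, not the project's actual wiring.

    # Illustrative safety-stop loop for the Raspberry Pi: measure distance with an
    # HC-SR04 ultrasonic sensor and halt the motors when an obstacle is too close.
    # Pin numbers and the safety threshold are assumptions, not the real wiring.
    import time
    import RPi.GPIO as GPIO

    TRIG, ECHO = 23, 24            # ultrasonic sensor pins (BCM numbering, assumed)
    MOTOR_PINS = (17, 18, 27, 22)  # motor driver inputs (assumed)
    SAFE_DISTANCE_CM = 30

    GPIO.setmode(GPIO.BCM)
    GPIO.setup(TRIG, GPIO.OUT)
    GPIO.setup(ECHO, GPIO.IN)
    for pin in MOTOR_PINS:
        GPIO.setup(pin, GPIO.OUT)

    def measure_distance_cm():
        """Trigger the sensor and convert the echo pulse width to centimetres."""
        GPIO.output(TRIG, True)
        time.sleep(0.00001)        # 10-microsecond trigger pulse
        GPIO.output(TRIG, False)

        start = stop = time.time()
        while GPIO.input(ECHO) == 0:
            start = time.time()
        while GPIO.input(ECHO) == 1:
            stop = time.time()

        # Speed of sound is roughly 34300 cm/s; halve for the round trip
        return (stop - start) * 34300 / 2

    def stop_motors():
        """Drive all motor inputs low so the wheelchair halts."""
        for pin in MOTOR_PINS:
            GPIO.output(pin, False)

    try:
        while True:
            if measure_distance_cm() < SAFE_DISTANCE_CM:
                stop_motors()      # obstacle too close: override eye commands
            time.sleep(0.1)
    finally:
        GPIO.cleanup()

In the complete system this check would run alongside the gaze-command loop, with the obstacle stop taking priority over any eye-derived movement command, and the current state shown on the LCD.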