Gesture Controlled Virtual Mouse using Hand Gesture and Eye Movement
Abstract
Human–computer interaction is evolving beyond traditional input devices such as the mouse and keyboard. Gesture Controlled Virtual Mouse using Hand Gesture and Eye Movement is a project designed to provide a more natural, touch-free, and accessible way to interact with computers. The system uses a camera to detect hand gestures and eye movements, which are processed with computer vision and AI techniques to perform mouse operations such as cursor movement, clicking, scrolling, and dragging. This approach is especially beneficial for people with physical disabilities and in situations where touch-based interaction is inconvenient or unhygienic. By combining hand gesture recognition with eye tracking, the system improves pointing accuracy and user control. Overall, this virtual mouse demonstrates how intelligent technologies can create intuitive, efficient, and inclusive human–computer interfaces.
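The core of such a pipeline is translating a detected landmark position in the camera frame into a cursor position on screen, and interpreting a gesture as a click. A minimal sketch of those two steps is shown below; it assumes normalized landmark coordinates in [0, 1] (as a hand-landmark model like MediaPipe produces), and all function names, the smoothing factor, and the pinch threshold are illustrative, not taken from the paper.

```python
def to_screen(norm_x, norm_y, screen_w, screen_h):
    """Scale a normalized landmark coordinate (0..1) to pixel coordinates."""
    x = min(max(norm_x, 0.0), 1.0) * (screen_w - 1)
    y = min(max(norm_y, 0.0), 1.0) * (screen_h - 1)
    return int(x), int(y)

def smooth(prev, new, alpha=0.3):
    """Exponential moving average over cursor positions to reduce jitter;
    a lower alpha gives a steadier but laggier cursor."""
    return (prev[0] + alpha * (new[0] - prev[0]),
            prev[1] + alpha * (new[1] - prev[1]))

def is_pinch(thumb_tip, index_tip, threshold=0.05):
    """Interpret a small thumb-index distance (in normalized units) as a click
    gesture. The threshold is an assumed tuning value."""
    dx = thumb_tip[0] - index_tip[0]
    dy = thumb_tip[1] - index_tip[1]
    return (dx * dx + dy * dy) ** 0.5 < threshold

# Example: fingertip at the center of the frame on a 1920x1080 display.
print(to_screen(0.5, 0.5, 1920, 1080))        # -> (959, 539)
print(is_pinch((0.50, 0.50), (0.52, 0.51)))   # -> True
```

In a complete system, the smoothed pixel coordinate would be handed to an OS-level automation library such as PyAutoGUI (e.g. `pyautogui.moveTo(x, y)`), and a detected pinch would trigger `pyautogui.click()`.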