
Smart PPT Presentation: Using Hand Gestures

Dr. Devidas Thosar, Sanika Arkile, Triveeni Suryawansh, Shweta Lilhare, Renuka Vakhare

Abstract


Traditional computer systems depend heavily on physical input devices such as a mouse and keyboard for interaction. While these devices are effective, they have several limitations, including hardware dependency, wear and tear, hygiene concerns in shared environments, and limited accessibility for physically challenged users. This project presents a Hand Gesture Controlled System that enables users to interact with computers using hand movements captured through a webcam. The system uses Computer Vision and Machine Learning algorithms to detect, track, and interpret hand gestures in real time. By recognizing specific gestures, the system can perform actions such as cursor movement, clicking, scrolling, and volume control without any physical contact. The proposed system provides a touchless, cost-effective, and user-friendly alternative to traditional input devices. It is especially useful in smart environments, public systems, medical facilities, and other settings where maintaining hygiene is essential. This technology enhances accessibility and represents a step forward in Human-Computer Interaction (HCI).
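The gesture-interpretation step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a MediaPipe-style hand model that yields 21 (x, y) landmarks normalized to [0, 1] (wrist at index 0, fingertips at 4, 8, 12, 16, 20), and the gesture-to-action mapping is hypothetical.

```python
# Illustrative sketch of mapping hand landmarks to system actions.
# Assumes MediaPipe-style landmarks: 21 (x, y) points normalized to
# [0, 1], image coordinates with y growing downward. The action
# mapping below is an assumption for illustration only.

FINGER_TIPS = [8, 12, 16, 20]   # index, middle, ring, pinky fingertips
FINGER_PIPS = [6, 10, 14, 18]   # the PIP joint below each fingertip

def count_extended_fingers(landmarks):
    """Count non-thumb fingers whose tip lies above its PIP joint.

    Because image y grows downward, an extended (upright) finger has a
    smaller y at the tip than at the joint beneath it.
    """
    return sum(
        1
        for tip, pip in zip(FINGER_TIPS, FINGER_PIPS)
        if landmarks[tip][1] < landmarks[pip][1]
    )

def gesture_to_action(landmarks):
    """Map a finger count to a hypothetical system action."""
    actions = {0: "click", 1: "move_cursor", 2: "scroll", 4: "volume"}
    return actions.get(count_extended_fingers(landmarks), "idle")
```

In a full pipeline, a webcam frame would be passed to a hand-tracking model (e.g. MediaPipe Hands) to produce the landmarks, and the resulting action string would drive an automation layer that moves the cursor or adjusts the volume.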

 




