
Gesture Steered Computer Using Python, Arduino and Ultrasonic Sensors

G. Theivanathan, G. Subramanian, V.P. Kavitha, V. Magesh

Abstract


Gesture recognition is an increasingly relevant technology, given the recent growth of virtual reality (VR) and augmented reality (AR). It is a key element of human-computer interaction (HCI), enabling two-way communication in virtual spaces. Ever since humans began interacting with computers, there has been a drive to improve that interaction. Computers have become an intrinsic part of daily life, so their use should be as trouble-free as possible. Initially, the keyboard and mouse served as the rudimentary means of interacting with these machines; today, efforts are being made to make human-computer interaction as natural as possible. Touch-screen technology met part of this need, but it is expected to be superseded by gesture recognition. The aim of this gesture-steered computer is therefore to control any computer using ultrasonic sensors, with an Arduino mediating between the ultrasonic sensor and the computer. The concept is straightforward: an ultrasonic sensor mounted above the monitor senses the distance between the hand and the monitor, and, depending on the estimated gap, the required function is performed on the computer.
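The pipeline the abstract describes (ultrasonic echo time → distance → computer action) can be sketched in Python. The distance bands, action names, and serial-port details below are illustrative assumptions, not the authors' actual configuration; a real build would read echo times from the Arduino over pySerial and trigger keystrokes rather than return strings.

```python
# Sketch of the distance-to-action mapping described in the abstract.
# Thresholds and action names are hypothetical examples.

def pulse_to_cm(duration_us):
    """Convert an ultrasonic (e.g. HC-SR04) echo pulse width in
    microseconds to centimetres. Sound travels ~0.0343 cm/us;
    divide by 2 because the pulse covers the round trip."""
    return duration_us * 0.0343 / 2

def distance_to_action(cm):
    """Map the hand-to-sensor gap to a computer action (illustrative bands)."""
    if cm < 10:
        return "play/pause"
    elif cm < 20:
        return "volume down"
    elif cm < 30:
        return "volume up"
    return "idle"  # hand out of range: do nothing

if __name__ == "__main__":
    # A real build would read lines such as b"900\n" (echo time in us)
    # from the Arduino, e.g. with serial.Serial("/dev/ttyUSB0", 9600).
    for echo_us in (450, 900, 1500, 3000):
        cm = pulse_to_cm(echo_us)
        print(f"{echo_us} us -> {cm:.1f} cm -> {distance_to_action(cm)}")
```

On the desktop side, the returned action string would typically be translated into a media-key press with a library such as PyAutoGUI.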




References


Karray, F., Alemzadeh, M., Abou Saleh, J., & Arab, M. N. (2008). Human-computer interaction: Overview on state of the art. International Journal on Smart Sensing and Intelligent Systems, 1(1), 137.

LaViola Jr, J. J. (1999). A survey of hand posture and gesture recognition techniques and technology.

Zelle, J. M. (2004). Python programming: An introduction to computer science. Franklin, Beedle & Associates, Inc.

Kalgaonkar, K., & Raj, B. (2009, April). One-handed gesture recognition using ultrasonic Doppler sonar. In 2009 IEEE International Conference on Acoustics, Speech and Signal Processing (pp. 1889-1892). IEEE.

Tarzia, S. P., Dick, R. P., Dinda, P. A., & Memik, G. (2009, September). Sonar-based measurement of user presence and attention. In Proceedings of the 11th International Conference on Ubiquitous Computing (pp. 89-92).

Watanabe, H., Terada, T., & Tsukamoto, M. (2013, September). Ultrasound-based movement sensing, gesture-, and context-recognition. In Proceedings of the 2013 International Symposium on Wearable Computers (pp. 57-64).

Renuka, H., & Goutam, B. (2014). Hand gesture recognition system to control soft front panels. International Journal of Engineering Research & Technology (IJERT), 3(12), 5-10.

Birk, H., & Moeslund, T. B. (1996). Recognizing gestures from the hand alphabet using principal component analysis. Aalborg Universitet.

