

Human Machine Interaction Based on Hand Tapping Gestures
Abstract
Many studies have addressed text input systems based on image-based hand gesture recognition. However, the hand gesture languages treated in previous work, such as sign languages, finger alphabets, and aerial handwriting, have shortcomings that prevent their common use. Aerial handwriting requires considerable time for both writing and recognition, while sign languages and finger alphabets demand substantial knowledge and practice, which restricts the number of their users. As a solution to these problems, this paper proposes a new character input system based on hand tapping gestures for Japanese hiragana and English characters that facilitates human-computer interaction. Hand tapping gestures are motions that tap keys on aerial virtual keypads with the hands, and they can serve as an effective hand alphabet for anyone, including hearing-impaired individuals. Users can interact with computers through our non-touch input system, which requires only a Kinect sensor and no keyboard, mouse, or body-worn device. We expect that our character input system will open a new channel for human-computer interaction.
Keywords: Non-touch character input, hand gesture, fingertip detection, Kinect sensor