Hand Gesture Recognition for Real Time Human Machine Interaction System

International Journal of Engineering Trends and Technology (IJETT)
  
© 2015 by IJETT Journal
Volume-19 Number-5
Year of Publication: 2015
Authors: Poonam Sonwalkar, Tanuja Sakhare, Ashwini Patil, Sonal Kale
DOI: 10.14445/22315381/IJETT-V19P245

Citation 

Poonam Sonwalkar, Tanuja Sakhare, Ashwini Patil, Sonal Kale, "Hand Gesture Recognition for Real Time Human Machine Interaction System", International Journal of Engineering Trends and Technology (IJETT), V19(5), 262-264, January 2015. ISSN: 2231-5381. www.ijettjournal.org. Published by Seventh Sense Research Group.

Abstract

This paper presents a real-time human-machine interaction system that uses hand gesture recognition to handle mouse events and to control a media player and an image viewer. Without it, users have to repeat the same mouse and keyboard actions, which wastes time. Gestures have long been considered an interaction technique that can deliver a more natural form of interaction. A fast gesture recognition scheme is proposed as an interface for the human-machine interaction (HMI) of such systems. The system relies on low-complexity algorithms and a small set of gestures to reduce recognition complexity and to make it more suitable for controlling real-time computer systems. A webcam captures the image, which is then converted into a binary image. A gesture is defined as a specific combination of hand positions.
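The capture-and-binarize step described in the abstract can be illustrated with a minimal sketch. The snippet below is not the authors' implementation; it assumes Python with OpenCV (cv2) and uses Otsu thresholding as one plausible way to turn webcam frames into the binary image from which gestures would be recognized. The gesture classifier and the mouse/media-player control layer are only indicated by a comment.

import cv2  # OpenCV is assumed here; it is not specified in the paper

# Open the default webcam (device index 0).
cap = cv2.VideoCapture(0)
if not cap.isOpened():
    raise RuntimeError("Could not open webcam")

while True:
    ok, frame = cap.read()                            # capture one frame
    if not ok:
        break

    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)    # convert to grayscale
    blur = cv2.GaussianBlur(gray, (5, 5), 0)          # suppress sensor noise
    # Otsu's method picks the threshold automatically, producing the
    # binary image described in the abstract.
    _, binary = cv2.threshold(blur, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    cv2.imshow("webcam", frame)
    cv2.imshow("binary", binary)

    # A gesture classifier and the mouse / media-player / image-viewer
    # control layer would consume `binary` at this point.

    if cv2.waitKey(1) & 0xFF == ord('q'):             # quit on 'q'
        break

cap.release()
cv2.destroyAllWindows()

Running this shows the live frame next to its binary counterpart; how the binary hand shape is mapped to specific mouse or application events is outside the scope of this sketch.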


Keywords
Gesture Recognition, Human Machine Interaction System, Webcam.