Gesture Based Interaction NUI: An Overview

International Journal of Engineering Trends and Technology (IJETT)
  
© 2014 by IJETT Journal
Volume-9 Number-12
Year of Publication : 2014
Authors : Dr. Manju Kaushik, Rashmi Jain
DOI : 10.14445/22315381/IJETT-V9P319

Citation 

Dr. Manju Kaushik, Rashmi Jain. "Gesture Based Interaction NUI: An Overview", International Journal of Engineering Trends and Technology (IJETT), V9(12), 633-636, March 2014. ISSN: 2231-5381. www.ijettjournal.org. Published by Seventh Sense Research Group.

Abstract

Touch, face, and voice recognition and movement sensors are all part of an emerging field of computing often called the natural user interface, or NUI. Interacting with technology in these humanistic ways is no longer limited to high-tech secret agents. Gesture recognition is the process by which gestures formed by a user are made known to the system. In completely immersive VR environments, a keyboard is generally not included; instead, the technology incorporates face, voice, gesture, and object recognition to give users a variety of ways to interact with the console, all without needing a controller. This paper focuses on this emerging mode of human-computer interaction, the concept of gesture recognition, and the types of gestures.
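
As a concrete illustration of the recognition process mentioned above, the short Python sketch below (an illustrative addition, not taken from the paper; the function name and threshold are hypothetical) classifies a 2D touch trajectory as one of four swipe gestures from its net displacement. Real NUI systems extract far richer features (hand pose, optical flow, trained classifiers), but the basic pipeline of sensing points, extracting a feature, and mapping it to a gesture label is the same.

    # Minimal rule-based gesture recognizer (hypothetical sketch):
    # classify a touch trajectory as a swipe by its net displacement.

    def classify_swipe(points, min_distance=50.0):
        """Classify a list of (x, y) samples as a swipe direction, or None."""
        if len(points) < 2:
            return None
        dx = points[-1][0] - points[0][0]  # net horizontal movement
        dy = points[-1][1] - points[0][1]  # net vertical movement
        if max(abs(dx), abs(dy)) < min_distance:
            return None  # movement too small to count as a gesture
        if abs(dx) > abs(dy):
            return "swipe-right" if dx > 0 else "swipe-left"
        return "swipe-down" if dy > 0 else "swipe-up"

    # Example: a roughly rightward drag is recognized as "swipe-right".
    trajectory = [(10, 100), (60, 104), (130, 98), (200, 101)]
    print(classify_swipe(trajectory))  # swipe-right

The design choice here mirrors the paper's framing: the sensor produces raw samples, a feature (net displacement) is extracted, and a decision rule maps the feature to a named gesture that the system can act on.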

Keywords
Natural User Interface, Gesture Recognition, Human-Computer Interaction.