MEMS Based Gesture Controlled Robot Using Wireless Communication
Citation
N. V. Maruthi Sagar, D. V. R. Sai Manikanta Kumar, N. Geethanjali. "MEMS Based Gesture Controlled Robot Using Wireless Communication", International Journal of Engineering Trends and Technology (IJETT), V14(4), 185-188, Aug 2014. ISSN: 2231-5381. www.ijettjournal.org. Published by Seventh Sense Research Group.
Abstract
This paper describes a robust MEMS-based gesture-controlled robot: a robot that is controlled by hand gestures rather than by ordinary switches or a keypad. In the future, robots are expected to interact with humans in a natural manner, so our interest is in hand-motion-based gesture interfaces. An algorithm for gesture recognition is developed to identify the distinct action signs made through hand movement. A MEMS accelerometer is used to sense these gestures, and an ultrasonic sensor supports reliable operation. To meet these requirements, a program has been written and executed on a microcontroller system. Experimental results show that our gesture recognition algorithm is effective, that it supports a natural style of interaction, and that it can be assembled in a simple hardware circuit.
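The gesture-to-motion mapping outlined in the abstract can be sketched as follows. This is a minimal illustration only: the axis conventions, the threshold value, and the command names are assumptions for the sake of the example, not details taken from the paper.

```python
# Hypothetical sketch: threshold a 3-axis MEMS accelerometer's x/y tilt
# readings (in g) into one of five robot motion commands, with a dead-band
# around the level hand position. All constants here are illustrative.

TILT_THRESHOLD = 0.35  # assumed dead-band half-width, in g


def gesture_to_command(ax, ay):
    """Map hand tilt (accelerometer x/y components) to a motion command."""
    if ay > TILT_THRESHOLD:
        return "FORWARD"
    if ay < -TILT_THRESHOLD:
        return "BACKWARD"
    if ax > TILT_THRESHOLD:
        return "RIGHT"
    if ax < -TILT_THRESHOLD:
        return "LEFT"
    return "STOP"  # hand held level: no tilt exceeds the dead-band


if __name__ == "__main__":
    print(gesture_to_command(0.0, 0.8))   # hand tilted forward -> FORWARD
    print(gesture_to_command(-0.6, 0.1))  # hand tilted left -> LEFT
    print(gesture_to_command(0.05, 0.0))  # hand level -> STOP
```

In a real system of this kind, the command string would be sent over the wireless link to the microcontroller driving the motors; the recognition logic itself reduces to comparing filtered sensor readings against calibrated thresholds.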
References
[1] T. H. Speeter (1992), "Transforming human hand motion for telemanipulation," Presence, vol. 1, no. 1, pp. 63–79.
[2] S. Zhou, Z. Dong, W. J. Li, and C. P. Kwong (2008), "Hand-written character recognition using MEMS motion sensing technology," in Proc. IEEE/ASME Int. Conf. Advanced Intelligent Mechatronics, pp. 1418–1423.
[3] J. K. Oh, S. J. Cho, W. C. Bang, et al. (2004), "Inertial sensor based recognition of 3-D character gestures with an ensemble of classifiers," presented at the 9th Int. Workshop on Frontiers in Handwriting Recognition.
[4] W. T. Freeman and C. D. Weissman (1995), "TV control by hand gestures," presented at the IEEE Int. Workshop on Automatic Face and Gesture Recognition, Zurich, Switzerland.
[5] L. Bretzner and T. Lindeberg (1998), "Relative orientation from extended sequences of sparse point and line correspondences using the affine trifocal tensor," in Proc. 5th Eur. Conf. Computer Vision, Berlin, Germany, vol. 1406, Lecture Notes in Computer Science, pp. 141–157, Springer-Verlag.
[6] D. Xu (2006), "A neural network approach for hand gesture recognition in virtual reality driving training system of SPG," presented at the 18th Int. Conf. Pattern Recognition.
[7] H. Je, J. Kim, and D. Kim (2007), "Hand gesture recognition to understand musical conducting action," presented at the IEEE Int. Conf. Robot & Human Interactive Communication.
[8] T. Yang and Y. Xu (1994), Hidden Markov Model for Gesture Recognition, CMU-RI-TR-94-10, Robotics Institute, Carnegie Mellon Univ., Pittsburgh, PA.
[9] S. Zhou, Q. Shan, F. Fei, W. J. Li, C. P. Kwong, C. K. Wu, et al. (2009), "Gesture recognition for interactive controllers using MEMS motion sensors," in Proc. IEEE Int. Conf. Nano/Micro Engineered and Molecular Systems, pp. 935–940.
[10] S. Zhang, C. Yuan, and V. Zhang (2008), "Handwritten character recognition using orientation quantization based on 3-D accelerometer," presented at the 5th Annu. Int. Conf. Ubiquitous Systems.
[11] J. S. Lipscomb (1991), "A trainable gesture recognizer," Pattern Recognit., vol. 24, no. 9, pp. 895–907.
[12] W. M. Newman and R. F. Sproull (1979), Principles of Interactive Computer Graphics. New York: McGraw-Hill.
[13] D. H. Rubine (1991), "The Automatic Recognition of Gestures," Ph.D. dissertation, Computer Science Dept., Carnegie Mellon Univ., Pittsburgh, PA.
[14] K. S. Fu (1974), Syntactic Methods in Pattern Recognition. New York: Academic, vol. 112, Mathematics in Science and Engineering.
[15] S. S. Fels and G. E. Hinton (1993), "Glove-Talk: A neural network interface between a data glove and a speech synthesizer," IEEE Trans. Neural Netw., vol. 4, no. 1, pp. 2–8.
[16] C. M. Bishop (2006), Pattern Recognition and Machine Learning, 1st ed. New York: Springer.
[17] T. Schlömer, B. Poppinga, N. Henze, and S. Boll (2008), "Gesture recognition with a Wii controller," in Proc. 2nd Int. Conf. Tangible and Embedded Interaction (TEI'08), Bonn, Germany, pp. 11–14.