Survey on Recurrent Neural Network in Natural Language Processing

International Journal of Engineering Trends and Technology (IJETT)
© 2017 by IJETT Journal
Volume-48 Number-6
Year of Publication : 2017
Authors : Kanchan M. Tarwani, Swathi Edem
DOI :  10.14445/22315381/IJETT-V48P253

Citation 

Kanchan M. Tarwani, Swathi Edem, "Survey on Recurrent Neural Network in Natural Language Processing", International Journal of Engineering Trends and Technology (IJETT), V48(6), 301-304, June 2017. ISSN:2231-5381. www.ijettjournal.org. Published by Seventh Sense Research Group.

Abstract
Natural Language Processing (NLP) is a way for computers to analyze, understand, and derive meaning from human language. Recurrent neural networks (RNNs) have revolutionized the field of NLP. RNNs are used to model units in a sequence. Unlike feedforward neural networks, RNNs have cyclic connections, making them more powerful for modeling sequential inputs. They have been used successfully for sequence labeling and sequence prediction tasks such as handwriting recognition, language modeling, machine translation, and phonetic labeling of acoustic frames. This paper gives an overview of how RNNs are used for, and are capable of dealing with, Natural Language Processing. It also summarizes LSTM-based RNN architectures.
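The cyclic connection described above can be illustrated with a minimal sketch of a vanilla (Elman) RNN step: the hidden state at time t depends on both the current input and the previous hidden state. All layer sizes, weights, and names below are illustrative assumptions, not details from the paper.

```python
import numpy as np

# Illustrative sizes and randomly initialized weights (not from the paper).
rng = np.random.default_rng(0)
input_size, hidden_size = 4, 3

W_xh = rng.normal(scale=0.1, size=(hidden_size, input_size))   # input -> hidden
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # hidden -> hidden (the cyclic connection)
b_h = np.zeros(hidden_size)

def rnn_forward(xs):
    """Run the recurrence h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h) over a sequence."""
    h = np.zeros(hidden_size)          # initial hidden state
    states = []
    for x in xs:
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)  # new state reuses the previous one
        states.append(h)
    return states

sequence = [rng.normal(size=input_size) for _ in range(5)]
states = rnn_forward(sequence)
print(len(states), states[-1].shape)   # one hidden state per input step
```

Because each hidden state feeds into the next, gradients during training must be propagated back through every time step (Back Propagation Through Time), which is where the vanishing-gradient problem motivating LSTM architectures arises.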

 References

[1] A. Graves, M. Liwicki, S. Fernandez, R. Bertolami, H. Bunke, J. Schmidhuber. A Novel Connectionist System for Improved Unconstrained Handwriting Recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 31, no. 5, 2009.
[2] H. Sak, A. W. Senior, F. Beaufays, "Long Short-Term Memory Recurrent Neural Network Architectures for Large Scale Acoustic Modeling", Proc. Interspeech, pp. 338-342, Singapore, Sept. 2014.
[3] S. Hochreiter and J. Schmidhuber, "Long Short-Term Memory", Neural Computation, 9(8):1735-1780, 1997.
[4] F. A. Gers, J. Schmidhuber, F. Cummins, "Learning to Forget: Continual Prediction with LSTM", Technical Report IDSIA-01-99, January 1999.
[5] Grégoire Mesnil, Yann Dauphin, Kaisheng Yao, et al., "Using Recurrent Neural Networks for Slot Filling in Spoken Language Understanding", IEEE/ACM Transactions on Audio, Speech, and Language Processing, vol. 23, no. 3, March 2015.
[6] Andi Hermanto, Teguh Bharata Adji, Noor Akhmad Setiawan, "Recurrent Neural Network Language Model for English-Indonesian Machine Translation: Experimental Study", Proc. 2015 International Conference on Science in Information Technology (ICSITech), IEEE, 2015.
[7] Haşim Sak, Andrew Senior, Françoise Beaufays, "Long Short-Term Memory Based Recurrent Neural Network Architectures for Large Vocabulary Speech Recognition", arXiv preprint, Feb. 2014.
[8] Rijuka Pathak, Somesh Dewangan, "Natural Language Chhattisgarhi: A Literature Survey", International Journal of Engineering Trends and Technology (IJETT), vol. 12, no. 2, 2014.
[9] Roshan R. Karwa, M. B. Chandak, "Word Sense Disambiguation: Hybrid Approach with Annotation Up To Certain Level - A Review", International Journal of Engineering Trends and Technology (IJETT), vol. 18, no. 7, 2014.
[10] K. Brahmani, K. S. Roy, Mahaboob Ali, "ARM7 Based Robotic Arm Control by Electronic Gesture Recognition Unit Using MEMS", International Journal of Engineering Trends and Technology (IJETT), vol. 4, issue 4, 2013.

Keywords
Recurrent Neural Network (RNN), Natural Language Processing (NLP), Back Propagation Through Time (BPTT), Long Short Term Memory (LSTM).