Optimizing Educational Interaction: An Advanced Real-Time Attention Recognition in Online Education Environment

© 2024 by IJETT Journal
Volume-72 Issue-8
Year of Publication : 2024
Author : Namita Shinde, Mayur Dilip Jakhete, Shekhar Shinde, Archana Vyas, Rajesh Ramdas Karhe, Neeru Malik, Pooja Deshmukh, Kedarnath Chaudhary
DOI : 10.14445/22315381/IJETT-V72I8P135

How to Cite?
Namita Shinde, Mayur Dilip Jakhete, Shekhar Shinde, Archana Vyas, Rajesh Ramdas Karhe, Neeru Malik, Pooja Deshmukh, Kedarnath Chaudhary, "Optimizing Educational Interaction: An Advanced Real-Time Attention Recognition in Online Education Environment," International Journal of Engineering Trends and Technology, vol. 72, no. 8, pp. 378-388, 2024. Crossref, https://doi.org/10.14445/22315381/IJETT-V72I8P135

Abstract
Recognizing a student's emotional state is important in both traditional and virtual learning settings. To address this challenge, this research proposes a novel method that infers emotion and measures learner engagement from patterns of eye and head movement. By applying sophisticated emotion recognition algorithms, the approach aims to improve system efficacy, value, and user interaction, and to assess how deeply a student is involved in educational activities. In online learning environments, the proposed system not only detects and tracks students' attention in real time but also establishes a feedback mechanism for improved material delivery. The degree of a student's focus is measured by closely examining their head and eye movements; the system classifies this data and displays it as graphs, providing insight into student interest and involvement. This information is then relayed to educators as feedback, enabling them to optimize the learning environment for a more tailored and effective educational experience.
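As a rough illustration of the kind of head-pose-based attention measurement the abstract describes, the sketch below scores a window of frames as the fraction whose head orientation stays near frontal. The thresholds, function names, and scoring rule are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch: score attention from per-frame head-pose angles.
# Frame-level yaw/pitch (degrees) are thresholded -- a student roughly
# facing the screen counts as "attentive" -- and the attentive fraction
# over a window becomes the engagement score fed back to the educator.

def frame_is_attentive(yaw, pitch, yaw_limit=25.0, pitch_limit=20.0):
    """A frame counts as attentive when the head stays near frontal."""
    return abs(yaw) <= yaw_limit and abs(pitch) <= pitch_limit

def engagement_score(poses, yaw_limit=25.0, pitch_limit=20.0):
    """Fraction of frames in `poses` [(yaw, pitch), ...] that are attentive."""
    if not poses:
        return 0.0
    attentive = sum(
        frame_is_attentive(y, p, yaw_limit, pitch_limit) for y, p in poses
    )
    return attentive / len(poses)

# Example window: mostly frontal poses with two look-aways.
window = [(2, -3), (5, 1), (40, 0), (3, 2), (-1, 30), (0, 0)]
score = engagement_score(window)  # 4 of 6 frames attentive
```

In a real pipeline the (yaw, pitch) pairs would come from a head-pose estimator running on webcam frames, and the per-window scores would be plotted over time, matching the graph-based reporting the abstract mentions.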

Keywords
Image recognition, Face recognition, Education tools, Convolutional neural networks, Computational intelligence.

References
[1] Qing Li et al., “A Learning Attention Monitoring System via Photoplethysmogram Using Wearable Wrist Devices,” Artificial Intelligence Supported Educational Technologies, Advances in Analytics for Learning and Teaching, pp. 133-150, 2020.
[2] Feng-Cheng Lin et al., “Student Behavior Recognition System for the Classroom Environment Based on Skeleton Pose Estimation and Person Detection,” Sensors, vol. 21, no. 16, pp. 1-20, 2021.
[3] Xin Zhang et al., “Analyzing Students’ Attention in Class Using Wearable Devices,” 2017 IEEE 18th International Symposium on a World of Wireless, Mobile and Multimedia Networks, Macau, China, pp. 1-9, 2017.
[4] Marcela Hernandez-de-Menendez, Carlos Escobar Díaz, and Ruben Morales-Menendez, “Technologies for the Future of Learning: State of the Art,” International Journal on Interactive Design and Manufacturing, vol. 14, pp. 683-695, 2020.
[5] Bui Ngoc Anh et al., “A Computer-Vision Based Application for Student Behavior Monitoring in Classroom,” Applied Sciences, vol. 9, no. 22, pp. 1-17, 2019.
[6] David M. Broussard et al., “An Interface for Enhanced Teacher Awareness of Student Actions and Attention in a VR Classroom,” 2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops, Lisbon, Portugal, pp. 284-290, 2021.
[7] Zhang Hong-yu et al., “Depth Image-Based Gesture Recognition for Multiple Learners,” Computer Science, vol. 42, no. 9, pp. 299-302, 2015.
[8] K. Nosu, and T. Kurokawa, “A Multi-Modal Emotion-Diagnosis System to Support E-Learning,” First International Conference on Innovative Computing, Information and Control - Volume I, Beijing, China, pp. 274-278, 2006.
[9] Kumiko Fujisawa, and Kenro Aihara, “Estimation of User Interest from Face Approaches Captured by Webcam,” Virtual Mixed Reality, Lecture Notes in Computer Science, vol. 5622, pp. 51-59, 2009.
[10] Liu Yuanyuan, “Research and Application of Head Pose Estimation Method in Natural Environment,” Ph.D. Thesis, Central China Normal University, Wuhan, Hubei, China, pp. 1-98, 2015.
[11] Liping Shen, Minjuan Wang, and Ruimin Shen, “Affective E-Learning: Using ‘Emotional’ Data to Improve Learning in Pervasive Learning Environment,” Educational Technology & Society, vol. 12, no. 2, pp. 176-189, 2009.
[12] Yichuan Tang, “Deep Learning Using Linear Support Vector Machines,” arXiv, pp. 1-6, 2013.
[13] Hanh Phan-Xuan, Thuong Le-Tien, and Sy Nguyen-Tan, “FPGA Platform Applied for Facial Expression Recognition System Using Convolutional Neural Networks,” Procedia Computer Science, vol. 151, pp. 651-658, 2019.
[14] Mohammed Megahed, and Ammar Mohammed, “Modeling Adaptive E-Learning Environment Using Facial Expressions and Fuzzy Logic,” Expert Systems with Applications, vol. 157, 2020.
[15] Omid Mohamad Nezami et al., “Automatic Recognition of Student Engagement Using Deep Learning and Facial Expression,” Machine Learning and Knowledge Discovery in Databases, Lecture Notes in Computer Science, vol. 11908, pp. 273-289, 2020.
[16] Maritza Bustos-López et al., “Wearables for Engagement Detection in Learning Environments: A Review,” Biosensors, vol. 12, no. 7, pp. 1-30, 2022.
[17] Swadha Gupta, Parteek Kumar, and Rajkumar Tekchandani, “A Machine Learning-Based Decision Support System for Temporal Human Cognitive State Estimation during Online Education Using Wearable Physiological Monitoring Devices,” Decision Analytics Journal, vol. 8, pp. 1-16, 2023.
[18] Jayasankar Santhosh, David Dzsotjan, and Shoya Ishimaru, “Multimodal Assessment of Interest Levels in Reading: Integrating Eye-Tracking and Physiological Sensing,” IEEE Access, vol. 11, pp. 93994-94008, 2023.
[19] Kapil Sethi, and Varun Jaiswal, “PSU-CNN: Prediction of Student Understanding in the Classroom through Student Facial Images Using Convolutional Neural Network,” Materials Today: Proceedings, vol. 62, no. 7, pp. 4957-4964, 2022.
[20] Aya Hassouneh, A.M. Mutawa, and M. Murugappan, “Development of a Real-Time Emotion Recognition System Using Facial Expressions and EEG Based on Machine Learning and Deep Neural Network Methods,” Informatics in Medicine Unlocked, vol. 20, pp. 1-9, 2020.