Deep Web Content Mining for Personalized Web Search: An Application of Optimized Fuzzy Ensemble of CNN Models

© 2025 by IJETT Journal
Volume-73 Issue-10
Year of Publication : 2025
Author : Suruchi Chawla
DOI : 10.14445/22315381/IJETT-V73I10P118

How to Cite?
Suruchi Chawla, "Deep Web Content Mining for Personalized Web Search: An Application of Optimized Fuzzy Ensemble of CNN Models," International Journal of Engineering Trends and Technology, vol. 73, no. 10, pp. 227-238, 2025. Crossref, https://doi.org/10.14445/22315381/IJETT-V73I10P118

Abstract
A novel design is proposed for personalized web search based on web document classification using a fuzzy ensemble of CNN models with Genetic Algorithm (GA)-based hyperparameter optimization. The hyperparameters of each CNN are optimized with a GA to improve classification accuracy. The proposed model combines the advantages of ensembling with a fuzzy weighted combination of deep learning models, classifying web documents into five classes, namely politics, sports, technology, entertainment, and business. The performance of the fuzzy ensemble of CNN classifiers with GA-based hyperparameter optimization was compared with other state-of-the-art models, and the proposed approach shows a significant improvement in web document classification accuracy. Web documents classified by the proposed method are then grouped into clusters for personalized web search, significantly improving the average precision of the search results.
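The fuzzy weighted combination of CNN classifiers described above can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the function name, the example probability vectors, and the choice of fuzzy weights (e.g. derived from each model's validation accuracy) are assumptions made for the sketch.

```python
# Hypothetical sketch of a fuzzy weighted ensemble vote over the five
# web-document classes. Probabilities and weights are illustrative.

CLASSES = ["politics", "sports", "technology", "entertainment", "business"]

def fuzzy_ensemble_predict(model_probs, fuzzy_weights):
    """Combine per-model class probabilities using normalized fuzzy weights.

    model_probs   -- list of per-model probability vectors (one per CNN)
    fuzzy_weights -- one membership weight per model
    """
    total = sum(fuzzy_weights)
    weights = [w / total for w in fuzzy_weights]  # normalize weights to sum to 1
    # Weighted sum of the models' probability vectors, class by class
    combined = [
        sum(w * probs[c] for w, probs in zip(weights, model_probs))
        for c in range(len(CLASSES))
    ]
    best = max(range(len(CLASSES)), key=lambda c: combined[c])
    return CLASSES[best], combined

# Example: three CNNs scoring one web document
probs = [
    [0.10, 0.05, 0.60, 0.15, 0.10],  # CNN 1 leans "technology"
    [0.20, 0.10, 0.50, 0.10, 0.10],  # CNN 2 agrees
    [0.40, 0.10, 0.30, 0.10, 0.10],  # CNN 3 leans "politics"
]
label, scores = fuzzy_ensemble_predict(probs, fuzzy_weights=[0.9, 0.8, 0.5])
```

In this sketch, the more trusted first two models outvote the third, so the ensemble assigns the document to "technology"; the fuzzy weights let stronger models contribute proportionally more than a plain majority vote would.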

Keywords
Ensemble Deep Learning, Genetic Algorithm (GA), Hyperparameter Optimization, Web Document Recommendation, Classification, Convolutional Neural Network (CNN), Personalized Web Search (PWS).

References
[1] Ammar Mohammed, and Rania Kora, “A Comprehensive Review on Ensemble Deep Learning: Opportunities and Challenges,” Journal of King Saud University-Computer and Information Sciences, vol. 35, no. 2, pp. 757-774, 2023.
[CrossRef] [Google Scholar] [Publisher Link]
[2] Zhu Hong, Jin Wenzhen, and Yang Guocai, “An Effective Text Classification Model based on Ensemble Strategy,” Journal of Physics: Conference Series, vol. 1229, no. 1, pp. 1-8, 2019.
[CrossRef] [Google Scholar] [Publisher Link]
[3] Lingxi Xie, and Alan Yuille, “Genetic CNN,” 2017 IEEE International Conference on Computer Vision (ICCV), Venice, Italy, pp. 1388-1397, 2017.
[CrossRef] [Google Scholar] [Publisher Link]
[4] Martin Wistuba, Nicolas Schilling, and Lars Schmidt-Thieme, “Hyperparameter Optimization Machines,” 2016 IEEE International Conference on Data Science and Advanced Analytics (DSAA), Montreal, QC, Canada, pp. 41-50, 2016.
[CrossRef] [Google Scholar] [Publisher Link]
[5] James Bergstra, and Yoshua Bengio, “Random Search for Hyper-Parameter Optimization,” Journal of Machine Learning Research, vol. 13, no. 2, pp. 281-305, 2012.
[Google Scholar] [Publisher Link]
[6] Rémi Bardenet et al., “Collaborative Hyperparameter Tuning,” Proceedings of the 30th International Conference on Machine Learning, PMLR, vol. 28, no. 2, pp. 199-207, 2013.
[Google Scholar] [Publisher Link]
[7] Erik Bochinski, Tobias Senst, and Thomas Sikora, “Hyper-Parameter Optimization for Convolutional Neural Network Committees Based on Evolutionary Algorithms,” 2017 IEEE International Conference on Image Processing (ICIP), Beijing, China, pp. 3924-3928, 2017.
[CrossRef] [Google Scholar] [Publisher Link]
[8] Humberto Pérez-Espinosa et al., “Tuning the Parameters of a Convolutional Artificial Neural Network by using Covering Arrays,” Research in Computing Science, vol. 121, pp. 69-81, 2016.
[Google Scholar] [Publisher Link]
[9] Xin Yao, and Yong Liu, “A New Evolutionary System for Evolving Artificial Neural Networks,” IEEE Transactions on Neural Networks, vol. 8, no. 3, pp. 694-713, 1997.
[CrossRef] [Google Scholar] [Publisher Link]
[10] Frank Hutter, Holger H. Hoos, and Kevin Leyton-Brown, “Sequential Model-Based Optimization for General Algorithm Configuration,” International Conference on Learning and Intelligent Optimization, Berlin, Heidelberg, pp. 507-523, 2011.
[CrossRef] [Google Scholar] [Publisher Link]
[11] Anwar Ali Yahya, Ramlan Mahmod, and Abd Rahman Ramli, “Dynamic Bayesian Networks and Variable Length Genetic Algorithm for Designing Cue-Based Model for Dialogue Act Recognition,” Computer Speech & Language, vol. 24, no. 2, pp. 190-218, 2010.
[CrossRef] [Google Scholar] [Publisher Link]
[12] Raúl Rojas, Neural Networks: A Systematic Introduction, 1st ed., Springer, 1996.
[Google Scholar] [Publisher Link]
[13] Mario Juez-Gil et al., “Experimental Evaluation of Ensemble Classifiers for Imbalance in Big Data,” Applied Soft Computing, vol. 108, pp. 1-14, 2021.
[CrossRef] [Google Scholar] [Publisher Link]
[14] Olusola O. Abayomi-Alli et al., “An Ensemble Learning Model for Covid-19 Detection from Blood Test Samples,” Sensors, vol. 22, no. 6, pp. 1-18, 2022.
[CrossRef] [Google Scholar] [Publisher Link]
[15] Emmanuel Ileberi, Yanxia Sun, and Zenghui Wang, “A Machine Learning Based Credit Card Fraud Detection using the GA Algorithm for Feature Selection,” Journal of Big Data, vol. 9, no. 1, pp. 1-17, 2022.
[CrossRef] [Google Scholar] [Publisher Link]
[16] Xiong Kewei et al., “A Hybrid Deep Learning Model for Online Fraud Detection,” 2021 IEEE International Conference on Consumer Electronics and Computer Engineering (ICCECE), Guangzhou, China, pp. 431-434, 2021.
[CrossRef] [Google Scholar] [Publisher Link]
[17] Venkatachalam Kandasamy et al., “Sentimental Analysis of COVID-19 Related Messages in Social Networks by Involving an N-Gram Stacked Autoencoder Integrated in an Ensemble Learning Scheme,” Sensors, vol. 21, no. 22, pp. 1-15, 2021.
[CrossRef] [Google Scholar] [Publisher Link]
[18] Szu-Yin Lin, Yun-Ching Kung, and Fang-Yie Leu, “Predictive Intelligence in Harmful News Identification by BERT-Based Ensemble Learning Model with Text Sentiment Analysis,” Information Processing & Management, vol. 59, no. 2, 2022.
[CrossRef] [Google Scholar] [Publisher Link]
[19] Jianfan Chen, Zekai Li, and Shaoheng Qin, “Ensemble Learning for Assessing Degree of Humor,” 2022 International Conference on Big Data, Information and Computer Network (BDICN), Sanya, China, pp. 492-498, 2022.
[CrossRef] [Google Scholar] [Publisher Link]
[20] Ashnil Kumar et al., “An Ensemble of Fine-Tuned Convolutional Neural Networks for Medical Image Classification,” IEEE Journal of Biomedical and Health Informatics, vol. 21, no. 1, pp. 31-40, 2016.
[CrossRef] [Google Scholar] [Publisher Link]
[21] Bin Wang, Bing Xue, and Mengjie Zhang, “Particle Swarm Optimisation for Evolving Deep Neural Networks for Image Classification by Evolving and Stacking Transferable Blocks,” 2020 IEEE Congress on Evolutionary Computation (CEC), Glasgow, UK, pp. 1-8, 2020.
[CrossRef] [Google Scholar] [Publisher Link]
[22] Wentao Zhang et al., “Snapshot Boosting: A Fast Ensemble Framework for Deep Neural Networks,” Science China Information Sciences, vol. 63, no. 1, pp. 1-12, 2019.
[CrossRef] [Google Scholar] [Publisher Link]
[23] Peng Guo et al., “Ensemble Deep Learning for Cervix Image Selection toward Improving Reliability in Automated Cervical Precancer Screening,” Diagnostics, vol. 10, no. 7, pp. 1-13, 2020.
[CrossRef] [Google Scholar] [Publisher Link]
[24] Erdal Tasci, Caner Uluturk, and Aybars Ugur, “A Voting-Based Ensemble Deep Learning Method Focusing on Image Augmentation and Preprocessing Variations for Tuberculosis Detection,” Neural Computing and Applications, vol. 33, no. 22, pp. 15541-15555, 2021.
[CrossRef] [Google Scholar] [Publisher Link]
[25] Aditya Khamparia et al., “A Novel Deep Learning-Based Multi-Model Ensemble Method for the Prediction of Neuromuscular Disorders,” Neural Computing and Applications, vol. 32, no. 15, pp. 11083-11095, 2020.
[CrossRef] [Google Scholar] [Publisher Link]
[26] Junhao Zhang et al., “Grasp for Stacking via Deep Reinforcement Learning,” 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, pp. 2543-2549, 2020.
[CrossRef] [Google Scholar] [Publisher Link]
[27] David E. Goldberg, Genetic Algorithms in Search, Optimization, and Machine Learning, Addison-Wesley, Reading, MA, 1989.
[Google Scholar] [Publisher Link]
[28] Zewen Li et al., “A Survey of Convolutional Neural Networks: Analysis, Applications, and Prospects,” IEEE Transactions on Neural Networks and Learning Systems, vol. 33, no. 12, pp. 6999-7019, 2021.
[CrossRef] [Google Scholar] [Publisher Link]
[29] Entesar Hamed I. Eliwa et al., “Utilizing Convolutional Neural Networks to Classify Monkey Pox Skin Lesions,” Scientific Reports, vol. 13, no. 1, pp. 1-20, 2023.
[CrossRef] [Google Scholar] [Publisher Link]
[30] Muhammed Celik, and Ozkan Inik, “Development of Hybrid Models based on Deep Learning and Optimized Machine Learning Algorithms for Brain Tumor Multi-Classification,” Expert Systems with Applications, vol. 238, 2024.
[CrossRef] [Google Scholar] [Publisher Link]
[31] Joseph Redmon, and Ali Farhadi, “YOLOv3: An Incremental Improvement,” arXiv Preprint, pp. 1-6, 2018.
[CrossRef] [Google Scholar] [Publisher Link]
[32] Kaiming He et al., “Mask R-CNN,” Proceedings of the IEEE International Conference on Computer Vision (ICCV), pp. 2961-2969, 2017.
[Google Scholar] [Publisher Link]
[33] Özkan İnik et al., “A New Method for Automatic Counting of Ovarian Follicles on Whole Slide Histological Images based on Convolutional Neural Network,” Computers in Biology and Medicine, vol. 112, 2019.
[CrossRef] [Google Scholar] [Publisher Link]
[34] Olaf Ronneberger, Philipp Fischer, and Thomas Brox, “U-Net: Convolutional Networks for Biomedical Image Segmentation,” International Conference on Medical Image Computing and Computer-Assisted Intervention, Munich, Germany, pp. 234-241, 2015.
[CrossRef] [Google Scholar] [Publisher Link]
[35] Jiazhuo Wang, Jason Xu, and Xuejun Wang, “Combination of Hyperband and Bayesian Optimization for Hyperparameter Optimization in Deep Learning,” arXiv Preprint, pp. 1-10, 2018.
[CrossRef] [Google Scholar] [Publisher Link]
[36] Stefan Falkner, Aaron Klein, and Frank Hutter, “Practical Hyperparameter Optimization for Deep Learning,” OpenReview.net, 2018.
[Google Scholar] [Publisher Link]
[37] Ruoyu Sun, “Optimization for Deep Learning: Theory and Algorithms,” arXiv Preprint, pp. 1-60, 2019.
[CrossRef] [Google Scholar] [Publisher Link]
[38] Ruo-Yu Sun, “Optimization for Deep Learning: An Overview,” Journal of the Operations Research Society of China, vol. 8, no. 2, pp. 249-294, 2020.
[CrossRef] [Google Scholar] [Publisher Link]
[39] James Martens, “Deep Learning via Hessian-Free Optimization,” Proceedings of the 27th International Conference on International Conference on Machine Learning, vol. 27, pp. 735-742, 2010.
[Google Scholar] [Publisher Link]
[40] Mykel J. Kochenderfer, and Tim A. Wheeler, Algorithms for Optimization, MIT Press, 2019.
[Google Scholar] [Publisher Link]
[41] Stefan Falkner, Aaron Klein, and Frank Hutter, “BOHB: Robust and Efficient Hyperparameter Optimization at Scale,” Proceedings of the 35th International Conference on Machine Learning, PMLR, vol. 80, pp. 1437-1446, 2018.
[Google Scholar] [Publisher Link]
[42] Lisha Li et al., “Parallelizing Hyperband for Large-Scale Tuning,” SysML, 2018.
[Google Scholar]
[43] Liam Li et al., “A System for Massively Parallel Hyperparameter Tuning,” arXiv Preprint, pp. 1-17, 2020.
[CrossRef] [Google Scholar] [Publisher Link]
[44] Benteng Ma et al., “Autonomous Deep Learning: A Genetic DCNN Designer for Image Classification,” Neurocomputing, vol. 379, pp. 152-161, 2020.
[CrossRef] [Google Scholar] [Publisher Link]
[45] Pablo Ribalta Lorenzo et al., “Particle Swarm Optimization for Hyper-Parameter Selection in Deep Neural Networks,” Proceedings of the Genetic and Evolutionary Computation Conference, pp. 481-488, 2017.
[CrossRef] [Google Scholar] [Publisher Link]
[46] Yanan Sun et al., “A Particle Swarm Optimization-Based Flexible Convolutional Autoencoder for Image Classification,” IEEE Transactions on Neural Networks and Learning Systems, vol. 30, no. 8, pp. 2295-2309, 2019.
[CrossRef] [Google Scholar] [Publisher Link]
[47] Yanan Sun et al., “Evolving Deep Convolutional Neural Networks for Image Classification,” IEEE Transactions on Evolutionary Computation, vol. 24, no. 2, pp. 394-407, 2020.
[CrossRef] [Google Scholar] [Publisher Link]
[48] Francisco Erivaldo Fernandes Junior, and Gary G. Yen, “Particle Swarm Optimization of Deep Neural Networks Architectures for Image Classification,” Swarm and Evolutionary Computation, vol. 49, pp. 62-74, 2019.
[CrossRef] [Google Scholar] [Publisher Link]
[49] Bin Wang et al., “Evolving Deep Convolutional Neural Networks by Variable-Length Particle Swarm Optimization for Image Classification,” 2018 IEEE Congress on Evolutionary Computation (CEC), Rio de Janeiro, Brazil, pp. 1-8, 2018.
[CrossRef] [Google Scholar] [Publisher Link]
[50] Masanori Suganuma et al., “A Genetic Programming Approach to Designing Convolutional Neural Network Architectures,” Proceedings of the Genetic and Evolutionary Computation Conference, Berlin, Germany, pp. 497-504, 2017.
[CrossRef] [Google Scholar] [Publisher Link]
[51] Hanxiao Liu, Karen Simonyan, and Yiming Yang, “DARTS: Differentiable Architecture Search,” arXiv Preprint, pp. 1-13, 2018.
[CrossRef] [Google Scholar] [Publisher Link]
[52] Risto Miikkulainen et al., “Evolving Deep Neural Networks,” Artificial Intelligence in the Age of Neural Networks and Brain Computing, pp. 269-287, 2024.
[CrossRef] [Google Scholar] [Publisher Link]
[53] Özkan İnik, “CNN Hyper-Parameter Optimization for Environmental Sound Classification,” Applied Acoustics, vol. 202, 2023.
[CrossRef] [Google Scholar] [Publisher Link]