A Comparative Analysis of Deep Belief Networks: Baseline and Optimized Architectures for Fine-Tuning
© 2025 by IJETT Journal
Volume-73 Issue-8
Year of Publication : 2025
Author : Raji. N, S. Manohar
DOI : 10.14445/22315381/IJETT-V73I8P110
How to Cite?
Raji. N, S. Manohar, "A Comparative Analysis of Deep Belief Networks: Baseline and Optimized Architectures for Fine-Tuning," International Journal of Engineering Trends and Technology, vol. 73, no. 8, pp. 117-128, 2025. Crossref, https://doi.org/10.14445/22315381/IJETT-V73I8P110
Abstract
This study presents a comprehensive analysis of Deep Belief Networks (DBNs), comparing unoptimized and optimized architectures and their performance across various datasets. The optimized DBN addresses key limitations of the baseline by applying advanced techniques such as adaptive learning rates, regularization strategies, sparsity constraints, and pruning, which mitigate problems including vanishing gradients, computational inefficiency, and sensitivity to hyperparameters. Empirical results show that the optimized DBN achieves considerable performance improvements, with an accuracy of over 98% against 92% for the unoptimized counterpart; precision, recall, F1-score, AUC, and log-loss also show substantial gains. Evaluations across multiple datasets, including medical imaging and agricultural data, confirm the robustness and generalization of the optimized DBN, and K-fold validation demonstrates the stability of these improvements and the optimized DBN's strength in classification. This work underscores the significance of optimization in improving the performance of DBNs and provides a framework for developing efficient and scalable deep learning models.
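The optimizations named in the abstract (adaptive learning rates, regularization, sparsity, pruning) apply at the level of the DBN's individual restricted Boltzmann machine (RBM) layers. The following is a minimal illustrative sketch, not the authors' implementation: one RBM trained with CD-1, with L2 weight decay, a sparsity penalty toward a target hidden activation, and a decaying learning rate. All function names and hyperparameter values here are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_rbm(data, n_hidden=16, epochs=20, lr0=0.1,
              weight_decay=1e-4, sparsity_target=0.05, sparsity_cost=0.1):
    """Sketch of one DBN layer: an RBM trained with CD-1 plus the
    optimizations discussed in the paper (illustrative values only)."""
    n_visible = data.shape[1]
    W = rng.normal(0, 0.01, (n_visible, n_hidden))
    b_v = np.zeros(n_visible)
    b_h = np.zeros(n_hidden)
    for epoch in range(epochs):
        lr = lr0 / (1 + 0.05 * epoch)          # adaptive (decaying) learning rate
        # Positive phase: hidden probabilities and a stochastic sample.
        p_h = sigmoid(data @ W + b_h)
        h = (rng.random(p_h.shape) < p_h).astype(float)
        # Negative phase (CD-1): one reconstruction step.
        p_v = sigmoid(h @ W.T + b_v)
        p_h2 = sigmoid(p_v @ W + b_h)
        n = data.shape[0]
        grad_W = (data.T @ p_h - p_v.T @ p_h2) / n
        # L2 regularization (weight decay).
        grad_W -= weight_decay * W
        # Sparsity penalty: push mean hidden activation toward the target.
        grad_W -= sparsity_cost * (p_h.mean(0) - sparsity_target)
        W += lr * grad_W
        b_v += lr * (data - p_v).mean(0)
        b_h += lr * (p_h - p_h2).mean(0)
    return W, b_v, b_h

# Toy binary data: two repeated 4-bit patterns.
X = np.array([[1, 1, 0, 0], [0, 0, 1, 1]] * 50, dtype=float)
W, b_v, b_h = train_rbm(X)
print(W.shape)  # (4, 16)
```

In a full DBN, several such layers would be pre-trained greedily (each layer's hidden activations feeding the next) before supervised fine-tuning.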
Keywords
Deep Belief Networks (DBNs), Optimized architectures, Hyperparameter sensitivity, Regularization strategies, K-fold validation.
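The K-fold validation protocol and the reported metrics (accuracy, precision, recall, F1-score, log-loss) can be sketched without any ML framework. This pure-Python example is illustrative only: the "model" is a trivial stand-in for the paper's DBN, and all names are assumptions.

```python
import math

def kfold_indices(n, k):
    """Yield (train, test) index lists for k contiguous folds."""
    fold = n // k
    idx = list(range(n))
    for i in range(k):
        test = idx[i * fold:(i + 1) * fold]
        train = idx[:i * fold] + idx[(i + 1) * fold:]
        yield train, test

def metrics(y_true, p_pred, thresh=0.5):
    """Binary-classification accuracy, precision, recall, F1, and log-loss
    computed from predicted probabilities."""
    y_hat = [1 if p >= thresh else 0 for p in p_pred]
    tp = sum(1 for t, h in zip(y_true, y_hat) if t == 1 and h == 1)
    fp = sum(1 for t, h in zip(y_true, y_hat) if t == 0 and h == 1)
    fn = sum(1 for t, h in zip(y_true, y_hat) if t == 1 and h == 0)
    acc = sum(1 for t, h in zip(y_true, y_hat) if t == h) / len(y_true)
    prec = tp / (tp + fp) if tp + fp else 0.0
    rec = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
    eps = 1e-12  # guard against log(0)
    ll = -sum(t * math.log(p + eps) + (1 - t) * math.log(1 - p + eps)
              for t, p in zip(y_true, p_pred)) / len(y_true)
    return acc, prec, rec, f1, ll

# Toy labels and a stand-in "model" that predicts probability 0.9 for
# class 1 and 0.1 for class 0 on each held-out fold.
y = [0, 1] * 25
for train, test in kfold_indices(len(y), 5):
    probs = [0.9 if y[i] == 1 else 0.1 for i in test]
    acc, prec, rec, f1, ll = metrics([y[i] for i in test], probs)
    assert acc == 1.0  # the stand-in classifier is perfect by construction
print("5-fold evaluation complete")
```

Averaging each metric over the k held-out folds, as done in the study, gives an estimate that is less sensitive to any single train/test split.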