International Journal of Engineering Trends and Technology

Research Article | Open Access
Volume 74 | Issue 1 | Year 2026 | Article Id. IJETT-V74I1P124 | DOI: https://doi.org/10.14445/22315381/IJETT-V74I1P124

Performance Enhancement of GRU-Based Corn Yield Forecasting through MICE Imputation, PCA-Based Feature, and ELU Activation


Lyra K. Nuevas, Marvee Cheska B. Natividad

Received: 15 Aug 2025 | Revised: 06 Jan 2026 | Accepted: 08 Jan 2026 | Published: 14 Jan 2026

Citation:

Lyra K. Nuevas, Marvee Cheska B. Natividad, "Performance Enhancement of GRU-Based Corn Yield Forecasting through MICE Imputation, PCA-Based Feature, and ELU Activation," International Journal of Engineering Trends and Technology (IJETT), vol. 74, no. 1, pp. 307-316, 2026. Crossref, https://doi.org/10.14445/22315381/IJETT-V74I1P124

Abstract

Accurate corn yield forecasting is essential in agriculture for improving farm production and productivity, and it plays an important role in ensuring food security. This study presents an improved Gated Recurrent Unit (GRU) forecasting approach that combines Multiple Imputation by Chained Equations (MICE), Principal Component Analysis (PCA), and the Exponential Linear Unit (ELU) activation function. The approach is novel in that it combines PCA and ELU within a GRU architecture for agricultural forecasting, a configuration rarely explored in the existing literature. MICE imputes missing agronomic data, while PCA addresses multicollinearity and reduces feature dimensionality. The optimized input improves gradient flow during training and mitigates the vanishing gradient problem common in deep recurrent models, and the ELU activation further stabilizes learning by preserving small gradient values. Experiments showed that the model trained with PCA-based dimensionality reduction and ELU activation achieves substantially higher accuracy than baseline GRU models, producing smaller forecasting errors and estimates consistent with actual yield values. These results demonstrate that combining data imputation with the ELU activation function enhances the performance of deep learning models for corn yield forecasting. The approach gives farmers and agricultural planners, whether managing small farms or large operations, the ability to make better data-driven decisions.
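
The abstract above outlines a three-stage pipeline: MICE imputation of missing agronomic data, PCA-based feature reduction, and a GRU regressor with ELU activation. The following is a minimal illustrative sketch, not the authors' implementation: it assumes scikit-learn's IterativeImputer as a MICE-style imputer, a Keras GRU with ELU activation, synthetic placeholder data, and a hypothetical sequence window length.

```python
# Illustrative sketch of the described pipeline (assumptions noted in comments).
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
import tensorflow as tf

# Synthetic placeholder data standing in for agronomic predictors and corn yield.
rng = np.random.default_rng(0)
X_raw = rng.normal(size=(200, 12))
X_raw[rng.random(X_raw.shape) < 0.1] = np.nan   # ~10% missing values
y = rng.normal(loc=5.0, size=200)               # hypothetical yield target

# 1) MICE-style imputation (IterativeImputer is a chained-equations variant).
X_imputed = IterativeImputer(max_iter=10, random_state=0).fit_transform(X_raw)

# 2) Standardize, then PCA to handle multicollinearity and reduce dimensions.
X_scaled = StandardScaler().fit_transform(X_imputed)
X_pca = PCA(n_components=0.95).fit_transform(X_scaled)  # retain 95% variance

# 3) Frame the reduced features as sequences (window length is an assumption).
window = 4
X_seq = np.stack([X_pca[i:i + window] for i in range(len(X_pca) - window)])
y_seq = y[window:]

# 4) GRU regressor using ELU as the recurrent layer's activation.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(window, X_pca.shape[1])),
    tf.keras.layers.GRU(64, activation="elu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X_seq, y_seq, epochs=5, batch_size=16, verbose=0)
```

Layer sizes, the variance threshold, and the window length are placeholders; the paper's actual hyperparameters and data are not reproduced here.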

Keywords

Corn yield forecasting, GRU, Multicollinearity, PCA, Vanishing gradient.
