Hyperparameter optimization of graph neural networks for predicting complex network dynamics using Bayesian meta-learning
DOI: https://doi.org/10.35335/mandiri.v14i2.469

Keywords: Bayesian Meta-Learning, Graph Neural Network, Hyperparameter Optimization, Network Dynamics Prediction, Probabilistic Inference

Abstract
The rapid growth of graph-structured data in domains such as transportation, social networks, and biological systems has increased the demand for more adaptive and efficient Graph Neural Network (GNN) architectures. However, GNN performance remains highly sensitive to hyperparameter configurations, which are often tuned through computationally expensive manual or heuristic methods. This study proposes a novel Bayesian Meta-Learning (BML)-based framework for hyperparameter optimization of GNNs, aimed at improving the prediction accuracy of complex network dynamics. The framework integrates Bayesian optimization with a meta-learning prior adaptation mechanism, enabling the model to learn optimal hyperparameter distributions across multiple graph tasks. Experimental evaluations conducted on three benchmark datasets—Cora, Citeseer, and PubMed—comprising up to 20,000 nodes with diverse structural complexities demonstrate that the proposed BML-GNN framework achieves faster convergence, lower validation loss, and higher predictive accuracy than both a baseline GNN and a traditional Bayesian optimization approach. Quantitatively, the BML-GNN model attains an R² score exceeding 0.97 with a significant reduction in RMSE, confirming its strong generalization capability. Although the method shows notable performance improvements, its computational overhead during meta-training and its reliance on well-defined prior distributions are potential limitations. Overall, the integration of Bayesian Meta-Learning provides a robust, scalable, and uncertainty-aware optimization strategy that advances the development of reliable GNN models for complex network modeling and intelligent system design.
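The core idea—fitting a prior over hyperparameters from the optima found on earlier graph tasks, then using that prior to guide the search on a new task—can be sketched as follows. This is a minimal illustration, not the paper's implementation: the per-task values, the single hyperparameter (log learning rate), and the toy validation-loss function are all invented for the example, and the acquisition step of Bayesian optimization is replaced by prior-guided random sampling.

```python
import math
import random
from statistics import NormalDist

# Hypothetical best log10(learning-rate) values found on previous graph
# tasks; in the paper these would come from the meta-training phase.
best_log_lr_per_task = [-2.1, -2.4, -1.9, -2.3]

# Meta-learning step: fit a Gaussian prior over the hyperparameter
# from the per-task optima (sample mean and standard deviation).
mu = sum(best_log_lr_per_task) / len(best_log_lr_per_task)
sigma = math.sqrt(
    sum((x - mu) ** 2 for x in best_log_lr_per_task)
    / (len(best_log_lr_per_task) - 1)
)
prior = NormalDist(mu, sigma)

def validation_loss(log_lr):
    # Toy stand-in for training a GNN and measuring validation loss;
    # minimized near log_lr = -2.2 in this illustration.
    return (log_lr + 2.2) ** 2

# New-task search: draw candidates from the meta-learned prior instead
# of a uniform range, then keep the best one (a simple prior-guided
# proxy for the acquisition step of Bayesian optimization).
candidates = prior.samples(20, seed=0)
best = min(candidates, key=validation_loss)
print(f"prior mean={mu:.2f}, chosen log_lr={best:.2f}")
```

Because the prior concentrates the candidate pool near regions that worked on earlier tasks, the new task typically reaches a good configuration with far fewer trials than an uninformed search—the intuition behind the framework's faster convergence.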
License
Copyright (c) 2025 Hondor Saragih, Jonson Manurung, Muhammad Azhar Prabukusumo, Eryan Ahmad Firdaus

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.