International Journal of Multidisciplinary Engineering Research & Reviews

Published by Winkley Publication

eISSN: 2945-4565

Optimization of Neural Network Training Using Adaptive Learning Schedules and Meta-Heuristic Algorithms

Published Dec 25, 2025

Abstract

This paper adopts a comprehensive method to evaluate the impact of batch size and learning rate on neural network training performance in terms of accuracy, loss, and the normality of residuals. Experiments conducted with four batch sizes (16, 32, 64, and 128) show that although smaller batch sizes yield higher accuracy and lower loss, they also exhibit greater variability and demand more careful learning-rate tuning. Descriptive statistics and extreme-value analyses highlight model behavior under the various training setups. A Q-Q plot analysis of the standardized residuals supports the normality of the residuals to some extent, confirming the reliability of the regression-based performance evaluation. The findings underscore the importance of hyperparameter tuning, especially of batch size and learning rate, in attaining the best performance for neural network training. The study provides clear guidance on selecting training setups that balance reduced computational burden against improved predictive accuracy. Moreover, the results advocate the use of adaptive learning schedules and meta-heuristic algorithms to continuously and dynamically fine-tune training parameters. This work also lays the groundwork needed to advance neural network optimization across application domains.
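The batch-size comparison described above can be illustrated with a minimal sketch. This is not the paper's code: the dataset, model (a NumPy logistic-regression stand-in for the neural network), and the exponential decay schedule standing in for the "adaptive learning schedule" are all assumptions chosen to keep the example self-contained.

```python
# Hypothetical sketch (not the paper's experiment): train a simple NumPy
# logistic-regression model under each of the four batch sizes from the
# abstract, using an exponentially decaying learning-rate schedule as a
# stand-in for an adaptive schedule.
import numpy as np

def train(batch_size, epochs=50, lr0=0.5, decay=0.95, seed=0):
    rng = np.random.default_rng(seed)
    # Synthetic binary-classification data (assumed; the paper's dataset
    # is not specified in the abstract).
    X = rng.normal(size=(512, 8))
    true_w = rng.normal(size=8)
    y = (X @ true_w + 0.1 * rng.normal(size=512) > 0).astype(float)

    w = np.zeros(8)
    for epoch in range(epochs):
        lr = lr0 * decay**epoch  # decaying learning-rate schedule
        idx = rng.permutation(len(X))
        for start in range(0, len(X), batch_size):
            b = idx[start:start + batch_size]
            p = 1.0 / (1.0 + np.exp(-X[b] @ w))        # sigmoid predictions
            w -= lr * X[b].T @ (p - y[b]) / len(b)     # mini-batch gradient step
    # Final cross-entropy loss over the full dataset.
    p = 1.0 / (1.0 + np.exp(-X @ w))
    return -np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))

for bs in (16, 32, 64, 128):
    print(f"batch={bs:4d}  final loss={train(bs):.4f}")
```

Smaller batches perform more gradient updates per epoch, which is one mechanism behind the accuracy/variability trade-off the abstract reports; varying `seed` exposes the run-to-run variability.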