Adaptive hybrid optimization for backpropagation neural networks in image classification
Keywords:
Hybrid optimization, Backpropagation neural networks, Particle swarm optimization, AdaGrad optimization

Abstract
Image classification is essential in artificial intelligence, with applications in medical diagnostics, autonomous navigation, and industrial automation. Traditional training methods such as stochastic gradient descent (SGD) often suffer from slow convergence and entrapment in local minima. This research presents a hybrid Particle Swarm Optimization (PSO)-Genetic Algorithm (GA)-backpropagation framework to enhance neural network training. By integrating AdaGrad and PSO for weight optimization, GA for refinement, and backpropagation for fine-tuning, the framework combines global exploration with gradient-based local refinement. Results show 97.5% accuracy on MNIST, a 5% improvement over Adam, and 40% faster convergence than SGD. The approach improves efficiency, accuracy, and generalization, making it valuable for high-dimensional AI tasks.
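The three-stage pipeline described in the abstract can be illustrated with a minimal sketch. This is not the authors' code; the toy dataset, particle counts, and all hyperparameters below are illustrative assumptions. It shows the general idea: PSO explores the weight space globally, a GA-style step recombines the best particles, and gradient descent with AdaGrad-style per-weight step sizes fine-tunes the result, here on a one-layer logistic model fit to linearly separable data.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

def loss(w):
    # Logistic loss of weight vector w (2 weights + bias).
    z = np.clip(X @ w[:2] + w[2], -30, 30)
    p = 1.0 / (1.0 + np.exp(-z))
    eps = 1e-9
    return -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

def grad(w):
    # Analytic gradient of the logistic loss (the "backpropagation" step).
    z = np.clip(X @ w[:2] + w[2], -30, 30)
    p = 1.0 / (1.0 + np.exp(-z))
    g = p - y
    return np.array([g @ X[:, 0], g @ X[:, 1], g.sum()]) / len(y)

# --- Stage 1: PSO global search over the weight space ---
n_particles, dim = 20, 3
pos = rng.normal(scale=2.0, size=(n_particles, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([loss(w) for w in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(50):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos += vel
    vals = np.array([loss(w) for w in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

# --- Stage 2: GA-style refinement of the best particles ---
elite = pbest[np.argsort(pbest_val)[:4]]
for _ in range(20):
    a, b = elite[rng.choice(4, 2, replace=False)]
    child = np.where(rng.random(dim) < 0.5, a, b)   # uniform crossover
    child += rng.normal(scale=0.1, size=dim)        # mutation
    if loss(child) < loss(gbest):
        gbest = child

# --- Stage 3: gradient fine-tuning with AdaGrad per-weight step sizes ---
w, G = gbest.copy(), np.zeros(dim)
for _ in range(200):
    g = grad(w)
    G += g * g                                      # accumulated squared grads
    w -= 0.5 * g / (np.sqrt(G) + 1e-8)

acc = np.mean(((X @ w[:2] + w[2]) > 0) == (y > 0.5))
print(f"final loss {loss(w):.4f}, accuracy {acc:.2%}")
```

The staging mirrors the abstract's rationale: the population-based stages reduce the risk of starting gradient descent inside a poor basin, while the AdaGrad-scaled backpropagation stage supplies the fast local convergence that swarm methods alone lack.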

Copyright (c) 2025 Samuel O. Essang, Stephen I. Okeke, Jackson E. Ante, Runyi E. Francis, Sunday E. Fadugba, Augustine O. Ogbaji, Jonathan T. Auta, Chikwe F. Chukwuka, Michael O. Ogar-Abang, Ede M. Aigberemhon (Author)

This work is licensed under a Creative Commons Attribution 4.0 International License.