Research Article

A Hybrid Differential Evolution and Back-Propagation Algorithm for Feedforward Neural Network Training

by Partha Pratim Sarangi, Abhimanyu Sahu, Madhumita Panda
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 84 - Number 14
Year of Publication: 2013
DOI: 10.5120/14641-2943

Partha Pratim Sarangi, Abhimanyu Sahu, Madhumita Panda. A Hybrid Differential Evolution and Back-Propagation Algorithm for Feedforward Neural Network Training. International Journal of Computer Applications. 84, 14 (December 2013), 1-9. DOI=10.5120/14641-2943

@article{ 10.5120/14641-2943,
author = { Partha Pratim Sarangi, Abhimanyu Sahu, Madhumita Panda },
title = { A Hybrid Differential Evolution and Back-Propagation Algorithm for Feedforward Neural Network Training },
journal = { International Journal of Computer Applications },
issue_date = { December 2013 },
volume = { 84 },
number = { 14 },
month = { December },
year = { 2013 },
issn = { 0975-8887 },
pages = { 1-9 },
numpages = {9},
url = { https://ijcaonline.org/archives/volume84/number14/14641-2943/ },
doi = { 10.5120/14641-2943 },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Journal Article
%A Partha Pratim Sarangi
%A Abhimanyu Sahu
%A Madhumita Panda
%T A Hybrid Differential Evolution and Back-Propagation Algorithm for Feedforward Neural Network Training
%J International Journal of Computer Applications
%@ 0975-8887
%V 84
%N 14
%P 1-9
%D 2013
%I Foundation of Computer Science (FCS), NY, USA
Abstract

In this study, a hybrid differential evolution-back-propagation algorithm for optimizing the weights of a feedforward neural network is proposed. The hybrid algorithm achieves faster convergence with higher accuracy. The proposed algorithm, referred to as DE-BP, combines differential evolution (DE) and back-propagation (BP) to train the weights of a feedforward neural network (FNN) by exploiting the global search capability of the DE evolutionary algorithm and the strong local search ability of the BP algorithm. DE explores the search space quickly during the initial stage of a global search, but at the expense of convergence speed. Conversely, the gradient-based BP algorithm may get stuck at local minima when its weights are initialized randomly. In the proposed hybrid algorithm, the global search ability of DE is first used for a few generations to move toward the globally optimal region of the search space and select good starting weights; the precise local gradient search of BP then converges to the optimal solution in that region with increased convergence speed. The performance of the proposed DE-BP algorithm is investigated on a couple of public-domain datasets, and the experimental results are compared with the BP algorithm, the DE evolutionary training algorithm, and a hybrid real-coded GA with back-propagation (GA-BP) algorithm. The results show that the proposed hybrid DE-BP algorithm produces promising results in comparison with the other training algorithms.
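The two-stage scheme the abstract describes (a few generations of DE for global exploration, then gradient-based BP for local refinement) can be sketched as follows. This is an illustrative toy on the XOR problem, assuming NumPy is available; the network size, population size, F, CR, and learning rate are assumed for demonstration and are not the paper's actual settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR toy problem
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

N_IN, N_HID, N_OUT = 2, 4, 1
DIM = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT  # weights + biases, flattened

def unpack(w):
    """Split a flat weight vector into layer weights and biases."""
    i = 0
    W1 = w[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = w[i:i + N_HID]; i += N_HID
    W2 = w[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = w[i:]
    return W1, b1, W2, b2

def forward(w):
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output
    return h, out

def mse(w):
    _, out = forward(w)
    return float(np.mean((out - y) ** 2))

# --- Stage 1: DE (rand/1/bin) global search for a few generations ---
NP, F, CR = 20, 0.7, 0.9
pop = rng.uniform(-1, 1, (NP, DIM))
fit = np.array([mse(ind) for ind in pop])
for gen in range(30):
    for i in range(NP):
        a, b, c = pop[rng.choice([j for j in range(NP) if j != i], 3, replace=False)]
        mutant = a + F * (b - c)                  # DE mutation
        cross = rng.random(DIM) < CR
        cross[rng.integers(DIM)] = True           # guarantee one gene crosses over
        trial = np.where(cross, mutant, pop[i])   # binomial crossover
        f_trial = mse(trial)
        if f_trial <= fit[i]:                     # greedy selection
            pop[i], fit[i] = trial, f_trial

w = pop[np.argmin(fit)].copy()  # best DE individual seeds the BP stage

# --- Stage 2: back-propagation (gradient descent) from the DE solution ---
lr = 0.5
for _ in range(2000):
    W1, b1, W2, b2 = unpack(w)
    h, out = forward(w)
    d_out = (out - y) * out * (1 - out) * (2 / X.shape[0])  # dMSE/dz2
    gW2 = h.T @ d_out
    gb2 = d_out.sum(0)
    d_h = (d_out @ W2.T) * (1 - h ** 2)                     # tanh derivative
    gW1 = X.T @ d_h
    gb1 = d_h.sum(0)
    w -= lr * np.concatenate([gW1.ravel(), gb1, gW2.ravel(), gb2])

print(f"final MSE: {mse(w):.4f}")
```

Because DE's greedy selection never worsens an individual, the best DE weight vector already beats a random initialization before BP starts; BP then only has to descend within the basin DE found, which is the source of the claimed speed-up.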

References
  1. J. Ilonen, J. K. Kamarainen, and J. Lampinen: Differential evolution training algorithm for feed-forward neural networks, Neural Processing Letters, 17:93-105, 2003.
  2. P. P. Sarangi, B. Majhi, and M. Panda, "Performance Analysis of Neural Networks Training using Real Coded Genetic Algorithm", International Journal of Computer Applications, 51(18):30-36, 2012.
  3. H. H. Örkcü and H. Bal, "Comparing performances of backpropagation and genetic algorithms in the data classification", Expert Systems with Applications, Volume 38, Issue 4, Pages 3703-3709, 2011.
  4. Zhang, G., "Neural networks for classification: a survey", IEEE Transactions on Systems, Man, and Cybernetics, Part C, 30(4):451-462, 2000.
  5. S. B. Kotsiantis, "Supervised Machine Learning: A Review of Classification Techniques", Informatica, 31:249-268, 2007.
  6. Curry B, Morgan P. Neural networks: a need for caution. Omega, International Journal of Management Sciences, 1997
  7. D. J. Montana and L. Davis, "Training Feedforward Neural Networks using Genetic Algorithms", Proceedings of the Third International Conference on Genetic Algorithms, Morgan Kaufmann, San Mateo, CA, 379-384, 1989.
  8. Sexton, R., Dorsey, R., and Johanson, J., "Optimization of Neural Networks: A Comparative Analysis of the Genetic Algorithm and Simulated Annealing", European Journal of Operational Research, Volume 114, Issue 3, Pages 589-601, 1999.
  9. Sexton, R., Dorsey, R., and Johanson, J., "Toward a Global Optimization of Neural Networks: A Comparison of the Genetic Algorithm and Backpropagation", forthcoming in Decision Support Systems.
  10. Gupta, J. N. D., and Sexton, R. S., "Comparing backpropagation with a genetic algorithm for neural network training", Omega, 27, 679-684, 1999.
  11. X. Yao, Evolving artificial neural networks, Proc. IEEE 87 (9), 1423-1447, 1999.
  12. Werbos, P., The Roots of Backpropagation: From Ordered Derivatives to Neural Networks and Political Forecasting. New York: John Wiley and Sons, Inc., 1993.
  13. D. E. Rumelhart, G. E. Hinton, and R. J. Williams, "Learning representations by back-propagating errors", Nature, 323:533-536, 1986.
  14. Simon Haykin, "Neural Networks: A comprehensive foundation," Pearson Education Asia, Seventh Indian Reprint, 2004.
  15. D. Whitley, "Applying Genetic Algorithms to Neural Network Problems", International Neural Network Society, pp. 230, 1988.
  16. Prechelt, L., "Proben1: A Set of Neural Network Benchmark Problems and Benchmarking Rules", Technical Report 21, Fakultät für Informatik, Universität Karlsruhe, 76128 Karlsruhe, Germany, 1994.
  17. Schaffer, J. D., D. Whitley, and L. J. Eshelman, "Combinations of Genetic Algorithms and Neural Networks: A Survey of the State of the Art", Proceedings of the IEEE Workshop on Combinations of Genetic Algorithms and Neural Networks.
  18. C. Zhang, H. Shao, and Y. Li, "Particle swarm optimization for evolving artificial neural network", Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics, vol. 4, pp. 2487-2490, 2000.
  19. Zhang, J. R. , Zhang, J. , Lok, T. M. , and Lyu, M. R. (2007). A hybrid particle swarm optimization-back-propagation algorithm for feedforward neural network training. Applied Mathematics and Computation 185, 1026 - 1037.
  20. Zhang, C. , and Shao, H. , An ANN's evolved by a new evolutionary system and its application. In Proc. of the 39th IEEE conf. on decision and control. vol. 4, pp. 3562 - 3563, 2000.
  21. M. Settles and B. Rylander, "Neural network learning using particle swarm optimization", Advances in Information Science and Soft Computing, pp. 224-226, 2002.
  22. R. Storn, and K. Price, "Differential Evolution-A Simple and Efficient Heuristic for Global Optimization over Continuous Spaces, " Journal of Global Optimization, Vol. 11, pp. 341- 359, 1997.
  23. Adam Slowik, and Michal Bialko, "Training of Artificial Neural Networks Using Differential Evolution Algorithm", Krakow, Poland, May 25-27, 2008
  24. UCI repository of machine learning databases, Department of Information and Computer Sciences, University of California, Irvine, http://www.ics.uci.edu/mlearn/MLRepositor.
Index Terms

Computer Science
Information Sciences

Keywords

Differential Evolution, Feedforward Neural Network, Backpropagation Algorithm, Real-Coded Genetic Algorithms