
A Homogeneous Ensemble of Artificial Neural Networks for Time Series Forecasting

International Journal of Computer Applications
© 2011 by IJCA Journal
Number 1 - Article 1
Year of Publication: 2011
Ratnadip Adhikari
R. K. Agrawal

Ratnadip Adhikari and R. K. Agrawal. A Homogeneous Ensemble of Artificial Neural Networks for Time Series Forecasting. International Journal of Computer Applications 32(7):1–8, October 2011.

@article{adhikari2011homogeneous,
	author = {Ratnadip Adhikari and R. K. Agrawal},
	title = {A Homogeneous Ensemble of Artificial Neural Networks for Time Series Forecasting},
	journal = {International Journal of Computer Applications},
	year = {2011},
	volume = {32},
	number = {7},
	pages = {1--8},
	month = {October},
	note = {Full text available}
}


Enhancing the robustness and accuracy of time series forecasting models is an active area of research. Recently, Artificial Neural Networks (ANNs) have found extensive application in many practical forecasting problems. However, the standard backpropagation training algorithm has several critical shortcomings: a slow convergence rate, frequent convergence to local minima on complex error surfaces, and the lack of systematic methods for selecting training parameters. Various improved training methods have been developed in the literature to overcome these drawbacks, but none of them can be guaranteed to be the best for all problems. In this paper, we propose a novel weighted ensemble scheme that intelligently combines multiple training algorithms to increase ANN forecast accuracy. The weight for each training algorithm is determined from the performance of the corresponding ANN model on the validation dataset. Experimental results on four important time series show that our proposed technique reduces the mentioned shortcomings of individual ANN training algorithms to a great extent, and that it achieves significantly better forecast accuracies than two other popular statistical models.
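The weighting scheme described in the abstract can be sketched as follows. This is a minimal, illustrative implementation, not the authors' exact method: the abstract states only that each training algorithm's weight is derived from its ANN's validation performance, so the inverse-error weighting rule and the function names below are assumptions for illustration.

```python
import numpy as np

def validation_weights(val_errors):
    """Assign each model a weight inversely proportional to its
    validation-set error, normalized to sum to 1.

    Note: the inverse-error rule is an assumption for illustration;
    the paper's exact weighting formula is not given in the abstract.
    """
    inv = 1.0 / np.asarray(val_errors, dtype=float)
    return inv / inv.sum()

def ensemble_forecast(val_errors, forecasts):
    """Weighted average of per-model forecasts.

    forecasts: one row per model (e.g. per training algorithm),
    one column per forecast horizon step.
    """
    w = validation_weights(val_errors)
    return w @ np.asarray(forecasts, dtype=float)

# Example: two models; the one with the lower validation error dominates.
w = validation_weights([1.0, 3.0])                        # -> [0.75, 0.25]
f = ensemble_forecast([1.0, 3.0], [[2.0, 4.0], [4.0, 6.0]])  # -> [2.5, 4.5]
```

Because the weights are fixed from held-out validation errors rather than fitted on the test set, a model trained with a poorly suited algorithm for the given series is automatically down-weighted rather than discarded.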


  • G. E. P. Box and G. M. Jenkins, Time Series Analysis: Forecasting and Control, Holden-Day, San Francisco, 1970.
  • G. P. Zhang, “Time series forecasting using a hybrid ARIMA and neural network model,” Neurocomputing 50, pp. 159–175, 2003.
  • G. P. Zhang, “A neural network ensemble method with jittered training data for time series forecasting,” Information Sciences 177, pp. 5329–5346, 2007.
  • G. Zhang, B. E. Patuwo, and M. Y. Hu, “Forecasting with artificial neural networks: The state of the art,” International Journal of Forecasting 14, pp. 35–62, 1998.
  • J. Kamruzzaman, R. Begg, and R. Sarker, Artificial Neural Networks in Finance and Manufacturing, Idea Group Publishing, 2006.
  • M. Adya and F. Collopy, “How effective are neural networks at forecasting and prediction? A review and evaluation,” Journal of Forecasting 17, pp. 481–495, 1998.
  • D. E. Rumelhart, G. E. Hinton, and R. J. Williams, “Learning representations by back-propagating errors,” Nature 323 (6088), pp. 533–536, 1986.
  • M. Hagan and M. Menhaj, “Training feedforward networks with the Marquardt algorithm,” IEEE Transactions on Neural Networks, vol. 5, no. 6, pp. 989–993, November 1994.
  • M. Riedmiller and H. Braun, “A direct adaptive method for faster backpropagation learning: The RPROP algorithm,” in Proceedings of the IEEE International Conference on Neural Networks (ICNN), San Francisco, pp. 586–591, 1993.
  • M. F. Moller, “A scaled conjugate gradient algorithm for fast supervised learning,” Neural Networks 6, pp. 525–533, 1993.
  • R. Battiti, “First- and second-order methods for learning: Between steepest descent and Newton's method,” Neural Computation 4, pp. 141–166, 1992.
  • J. E. Dennis and R. B. Schnabel, Numerical Methods for Unconstrained Optimization and Nonlinear Equations, Prentice-Hall, Englewood Cliffs, NJ, 1983.
  • J. Kennedy, R. C. Eberhart, and Y. Shi, Swarm Intelligence, Morgan Kaufmann, San Francisco, CA, 2001.
  • G. K. Jha, P. Thulasiraman, and R. K. Thulasiram, “PSO based neural network for time series forecasting,” in Proceedings of the IEEE International Joint Conference on Neural Networks, Atlanta, Georgia, USA, pp. 1422–1427, June 14–19, 2009.
  • R. Fletcher, Practical Methods of Optimization, 2nd ed., John Wiley, Chichester, 1987.
  • C. de Groot and D. Wurtz, “Analysis of univariate time series with connectionist nets: A case study of two classical examples,” Neurocomputing 3, pp. 177–192, 1991.
  • J. S. Armstrong, “Combining forecasts,” in Principles of Forecasting: A Handbook for Researchers and Practitioners, J. S. Armstrong, Ed., Kluwer Academic Publishers, Norwell, MA, 2001.
  • H. Demuth, M. Beale, and M. Hagan, Neural Network Toolbox User's Guide, The MathWorks, Natick, MA, 2010.
  • I. Trelea, “The particle swarm optimization algorithm: Convergence analysis and parameter selection,” Information Processing Letters 85, pp. 317–325, 2003.
  • R. J. Hyndman, Time Series Data Library, URL:, January 2010.
  • V. Vapnik, The Nature of Statistical Learning Theory, Springer-Verlag, New York, 1995.
  • J. A. K. Suykens and J. Vandewalle, “Least squares support vector machine classifiers,” Neural Processing Letters, vol. 9, no. 3, pp. 293–300, 1999.
  • Y. Fan, P. Li, and Z. Song, “Dynamic least square support vector machine,” in Proceedings of the 6th World Congress on Intelligent Control and Automation (WCICA), Dalian, China, pp. 4886–4889, June 21–23, 2006.
  • B. Birge, “PSOt – a particle swarm optimization toolbox for use with Matlab,” in Proceedings of the IEEE Swarm Intelligence Symposium, Indianapolis, Indiana, USA, pp. 182–186, 2003.
  • K. W. Hipel and A. I. McLeod, Time Series Modelling of Water Resources and Environmental Systems, Elsevier, Amsterdam, 1994.
  • C. Hamzacebi, “Improving artificial neural networks' performance in seasonal time series forecasting,” Information Sciences 178, pp. 4550–4559, 2008.