Analysis of Randomized Performance of Bias Parameters and Activation Function of Extreme Learning Machine

International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Year of Publication: 2016
Prafull Pandey, Ram Govind Singh

Prafull Pandey and Ram Govind Singh. Article: Analysis of Randomized Performance of Bias Parameters and Activation Function of Extreme Learning Machine. International Journal of Computer Applications 135(1):23-28, February 2016. Published by Foundation of Computer Science (FCS), NY, USA.

@article{pandey2016elm,
	author = {Prafull Pandey and Ram Govind Singh},
	title = {Article: Analysis of Randomized Performance of Bias Parameters and Activation Function of Extreme Learning Machine},
	journal = {International Journal of Computer Applications},
	year = {2016},
	volume = {135},
	number = {1},
	pages = {23-28},
	month = {February},
	note = {Published by Foundation of Computer Science (FCS), NY, USA}
}


In artificial intelligence, classification is the process of assigning entities to classes on the basis of information provided by a dataset. The Extreme Learning Machine (ELM) is one of the more efficient classifiers. An ELM is formed by interconnected layers, each containing many nodes (neurons). The input layer is connected to the hidden layer through randomly assigned weights, and the output layer is produced with the help of an activation (transfer) function. Activation functions are non-linear, and different activation functions may produce different outputs on the same dataset; not every activation function is suited to every type of classification problem. This paper shows how average test accuracy varies with different activation functions. It also shows how much performance varies due to the random selection of the bias parameters between the input and hidden layers of the ELM.
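The mechanism described above can be sketched in a few lines of NumPy. This is an illustrative toy example, not the paper's implementation: the dataset, network size, and function names are assumptions. The hidden-layer weights and biases are drawn at random; only the output weights are solved for, via the Moore-Penrose pseudo-inverse. Re-running the loop with different random seeds shows how accuracy fluctuates with the randomly chosen bias parameters, and the outer loop compares several activation functions on the same data.

```python
import numpy as np

def elm_train(X, T, n_hidden, activation, rng):
    """Train a single-hidden-layer ELM: random W and b, solve for beta."""
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights
    b = rng.standard_normal(n_hidden)                # random bias parameters
    H = activation(X @ W + b)                        # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ T                     # output weights (pseudo-inverse)
    return W, b, beta

def elm_predict(X, W, b, beta, activation):
    return activation(X @ W + b) @ beta

# Toy binary classification problem (an assumption for illustration).
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 4))
y = (X[:, 0] + X[:, 1] ** 2 > 0.5).astype(float)
T = y.reshape(-1, 1)

# Compare activation functions, varying the random weights/biases by seed.
for name, g in [("sigmoid", lambda z: 1.0 / (1.0 + np.exp(-z))),
                ("tanh", np.tanh),
                ("sin", np.sin)]:
    accs = []
    for seed in range(10):
        r = np.random.default_rng(seed)
        W, b, beta = elm_train(X, T, n_hidden=30, activation=g, rng=r)
        pred = (elm_predict(X, W, b, beta, g) > 0.5).astype(float)
        accs.append((pred.ravel() == y).mean())
    print(f"{name}: mean acc {np.mean(accs):.3f}, std {np.std(accs):.3f}")
```

Because only `beta` is learned (in closed form), training is very fast; the trade-off, which motivates the paper's experiments, is that accuracy depends on the random draw of weights and biases and on the choice of activation function.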




Extreme learning machine, feedforward network, neural network, classification