Research Article

Investigation the Effect of Particle Swarm Optimization in Performance of Mixture of Experts

by Dimple Rani, Javad Hatami, Diba Meysamiazad
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 95 - Number 2
Year of Publication: 2014
Authors: Dimple Rani, Javad Hatami, Diba Meysamiazad
DOI: 10.5120/16569-6246

Dimple Rani, Javad Hatami, Diba Meysamiazad. Investigation the Effect of Particle Swarm Optimization in Performance of Mixture of Experts. International Journal of Computer Applications. 95, 2 (June 2014), 28-32. DOI=10.5120/16569-6246

@article{ 10.5120/16569-6246,
author = { Dimple Rani, Javad Hatami, Diba Meysamiazad },
title = { Investigation the Effect of Particle Swarm Optimization in Performance of Mixture of Experts },
journal = { International Journal of Computer Applications },
issue_date = { June 2014 },
volume = { 95 },
number = { 2 },
month = { June },
year = { 2014 },
issn = { 0975-8887 },
pages = { 28-32 },
numpages = { 5 },
url = { https://ijcaonline.org/archives/volume95/number2/16569-6246/ },
doi = { 10.5120/16569-6246 },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Journal Article
%A Dimple Rani
%A Javad Hatami
%A Diba Meysamiazad
%T Investigation the Effect of Particle Swarm Optimization in Performance of Mixture of Experts
%J International Journal of Computer Applications
%@ 0975-8887
%V 95
%N 2
%P 28-32
%D 2014
%I Foundation of Computer Science (FCS), NY, USA
Abstract

Mixture of experts (ME) is one of the most popular and interesting combining methods, with great potential to improve performance in machine learning. ME is built on the divide-and-conquer principle, in which the problem space is divided among several neural network experts supervised by a gating network. Earlier work on ME developed different strategies for dividing the problem space among the experts. Building on these ideas, we introduce a new method that uses the principles of Particle Swarm Optimization (PSO) as a learning step in ME. In this paper, different aspects of the proposed method are compared with the standard version of ME. The results show that the new method is robust to variations in ensemble complexity, both in the number of individual experts and in the number of hidden units.
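To make the combination scheme concrete, the sketch below shows a minimal mixture of experts whose flattened weight vector is tuned by a basic global-best PSO loop instead of gradient-based learning. It is only an illustration of the general idea described in the abstract, not the authors' implementation: the names (MixtureOfExperts, pso_train), the linear experts, the toy data set, and all hyper-parameter values are assumptions made for the example.

# A minimal sketch (not the authors' implementation): a mixture of experts
# whose parameters are trained with a basic particle swarm optimizer.
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

class MixtureOfExperts:
    """Two-class ME: linear experts combined by a softmax gating network."""
    def __init__(self, n_features, n_experts):
        self.n_features, self.n_experts = n_features, n_experts
        # one weight vector per expert plus one per gate output
        self.n_params = 2 * n_experts * n_features

    def unpack(self, flat):
        w = flat.reshape(2 * self.n_experts, self.n_features)
        return w[: self.n_experts], w[self.n_experts :]   # expert weights, gate weights

    def predict_proba(self, flat, X):
        W_exp, W_gate = self.unpack(flat)
        expert_out = 1.0 / (1.0 + np.exp(-X @ W_exp.T))   # each expert's class-1 probability
        gate = softmax(X @ W_gate.T)                       # mixing coefficients per sample
        return (gate * expert_out).sum(axis=1)             # gated combination

    def error(self, flat, X, y):
        p = self.predict_proba(flat, X)
        return np.mean((p > 0.5).astype(int) != y)

def pso_train(model, X, y, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5):
    """Basic global-best PSO over the flattened ME parameter vector."""
    dim = model.n_params
    pos = rng.normal(scale=0.5, size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_err = np.array([model.error(p, X, y) for p in pos])
    gbest = pbest[pbest_err.argmin()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((n_particles, dim)), rng.random((n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        err = np.array([model.error(p, X, y) for p in pos])
        improved = err < pbest_err
        pbest[improved], pbest_err[improved] = pos[improved], err[improved]
        gbest = pbest[pbest_err.argmin()].copy()
    return gbest

# Toy two-cluster classification problem.
X = np.vstack([rng.normal(-1, 1, (100, 2)), rng.normal(1, 1, (100, 2))])
y = np.array([0] * 100 + [1] * 100)
me = MixtureOfExperts(n_features=2, n_experts=3)
best = pso_train(me, X, y)
print("training error:", me.error(best, X, y))

Each particle encodes one complete set of expert and gating weights, so the swarm searches the joint parameter space directly; the published method may partition or initialize this search differently.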

Index Terms

Computer Science
Information Sciences

Keywords

Mixture of Experts, Neural Network, Particle Swarm Optimization, Classification