Research Article

Increasing Classifier Ensemble Efficiency using KSBC Algorithm

by Elham Masoumi Nogorabi, Hedieh Sajedi
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 70 - Number 13
Year of Publication: 2013
Authors: Elham Masoumi Nogorabi, Hedieh Sajedi
DOI: 10.5120/12025-8088

Elham Masoumi Nogorabi, Hedieh Sajedi. Increasing Classifier Ensemble Efficiency using KSBC Algorithm. International Journal of Computer Applications 70, 13 (May 2013), 43-50. DOI=10.5120/12025-8088

@article{ 10.5120/12025-8088,
author = { Elham Masoumi Nogorabi, Hedieh Sajedi },
title = { Increasing Classifier Ensemble Efficiency using KSBC Algorithm },
journal = { International Journal of Computer Applications },
issue_date = { May 2013 },
volume = { 70 },
number = { 13 },
month = { May },
year = { 2013 },
issn = { 0975-8887 },
pages = { 43-50 },
numpages = {9},
url = { https://ijcaonline.org/archives/volume70/number13/12025-8088/ },
doi = { 10.5120/12025-8088 },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Journal Article
%A Elham Masoumi Nogorabi
%A Hedieh Sajedi
%T Increasing Classifier Ensemble Efficiency using KSBC Algorithm
%J International Journal of Computer Applications
%@ 0975-8887
%V 70
%N 13
%P 43-50
%D 2013
%I Foundation of Computer Science (FCS), NY, USA
Abstract

The most challenging issue in constructing an ensemble-based classification system is how to build an appropriate ensemble of base classifiers. This paper introduces a new approach to ensemble construction named KSBC (K-means Based Classifier Selection). The approach uses the Bagging algorithm to produce the base classifiers. The type of the base classifiers, either decision trees or multi-layer neural networks, is fixed and remains unchanged during ensemble construction. After a large pool of base classifiers has been generated, KSBC partitions them using k-means clustering; the final ensemble is then formed by choosing one classifier from each partition. Weighted voting serves as the combining function of the ensemble. In addition, strategies for selecting a classifier from each partition were analyzed, and the effect of the sampling rate on the performance of clustering-based classifier combination was evaluated. Finally, experiments were carried out on a large number of standard machine learning benchmark datasets. The experimental results illustrate the effectiveness of the proposed method compared to other approaches.
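The pipeline described in the abstract — bagging a pool of base classifiers, clustering them with k-means, picking one classifier per cluster, and combining by weighted voting — can be sketched as follows. This is an illustrative sketch only, not the authors' implementation: representing each classifier by its prediction vector on a validation set, selecting the member nearest each cluster centroid, and weighting votes by validation accuracy are all assumptions for the example; the paper's exact selection and weighting criteria may differ.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.cluster import KMeans
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_tr, X_tmp, y_tr, y_tmp = train_test_split(X, y, test_size=0.5, random_state=0)
X_val, X_te, y_val, y_te = train_test_split(X_tmp, y_tmp, test_size=0.5, random_state=0)

# 1) Bagging: train a large pool of base decision trees on bootstrap samples.
n_base, sampling_rate = 50, 0.8   # sampling_rate is the knob the paper evaluates
pool = []
for _ in range(n_base):
    idx = rng.choice(len(X_tr), size=int(sampling_rate * len(X_tr)), replace=True)
    pool.append(DecisionTreeClassifier(random_state=0).fit(X_tr[idx], y_tr[idx]))

# 2) Partition the classifiers with k-means, here clustering their
#    prediction vectors on a held-out validation set (an assumption).
preds = np.array([clf.predict(X_val) for clf in pool])   # shape: (n_base, n_val)
k = 7
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(preds)

# 3) Choose one classifier from each partition: here, the member
#    closest to its cluster centroid (an assumed selection rule).
ensemble = []
for c in range(k):
    members = np.where(km.labels_ == c)[0]
    dists = np.linalg.norm(preds[members] - km.cluster_centers_[c], axis=1)
    ensemble.append(pool[members[np.argmin(dists)]])

# 4) Weighted voting: each member's vote is weighted by its
#    validation accuracy (assumed weighting scheme).
weights = np.array([accuracy_score(y_val, clf.predict(X_val)) for clf in ensemble])
votes = np.array([clf.predict(X_te) for clf in ensemble])  # (k, n_test)
final = np.array([np.bincount(col, weights=weights).argmax() for col in votes.T])
print("test accuracy:", accuracy_score(y_te, final))
```

Clustering prediction vectors groups classifiers that err on the same samples, so taking one representative per cluster tends to keep the ensemble diverse while shrinking it from `n_base` members to `k`.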

Index Terms

Computer Science
Information Sciences

Keywords

Ensemble method, bagging, clustering