
Performance Analysis of Classification Tree Learning Algorithms

International Journal of Computer Applications
© 2012 by IJCA Journal
Volume 55 - Number 6
Year of Publication: 2012
D. L. Gupta
A. K. Malviya
Satyendra Singh

D. L. Gupta, A. K. Malviya and Satyendra Singh. Performance Analysis of Classification Tree Learning Algorithms. International Journal of Computer Applications 55(6):39-44, October 2012. BibTeX:

@article{gupta2012performance,
	author = {D. L. Gupta and A. K. Malviya and Satyendra Singh},
	title = {Performance Analysis of Classification Tree Learning Algorithms},
	journal = {International Journal of Computer Applications},
	year = {2012},
	volume = {55},
	number = {6},
	pages = {39-44},
	month = {October}
}


Classification is a supervised learning approach that maps a data item into one of a set of predefined classes. Various classification algorithms have been proposed in the literature. In this paper, the authors apply four classification tree algorithms, J48, Random Forest (RF), Reduced Error Pruning (REP) and Logistic Model Tree (LMT), to the open-source "WEATHER NOMINAL" data set. The Waikato Environment for Knowledge Analysis (WEKA) toolkit is used for the experiments, and the results show that, for this specific data set, Random Forest classifies the data better than the other algorithms. Classifier performance is evaluated using a 5-fold cross-validation test.
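The evaluation protocol described above can be illustrated outside WEKA as well. The following is a minimal sketch (not the authors' setup, which used WEKA's Java implementations) that runs 5-fold cross-validation on the classic 14-instance weather nominal data set with scikit-learn analogues of two of the four learners: a CART decision tree standing in for J48, and Random Forest.

```python
# Sketch: 5-fold cross-validation of two tree learners on the
# "weather nominal" data set, using scikit-learn as a stand-in for WEKA.
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import OneHotEncoder
from sklearn.tree import DecisionTreeClassifier

# The 14 instances of the weather nominal data set:
# outlook, temperature, humidity, windy -> play
data = [
    ("sunny", "hot", "high", "false", "no"),
    ("sunny", "hot", "high", "true", "no"),
    ("overcast", "hot", "high", "false", "yes"),
    ("rainy", "mild", "high", "false", "yes"),
    ("rainy", "cool", "normal", "false", "yes"),
    ("rainy", "cool", "normal", "true", "no"),
    ("overcast", "cool", "normal", "true", "yes"),
    ("sunny", "mild", "high", "false", "no"),
    ("sunny", "cool", "normal", "false", "yes"),
    ("rainy", "mild", "normal", "false", "yes"),
    ("sunny", "mild", "normal", "true", "yes"),
    ("overcast", "mild", "high", "true", "yes"),
    ("overcast", "hot", "normal", "false", "yes"),
    ("rainy", "mild", "high", "true", "no"),
]
X_raw = [row[:-1] for row in data]
y = [row[-1] for row in data]

# One-hot encode the nominal attributes so the learners can consume them.
X = OneHotEncoder().fit_transform(X_raw)

results = {}
for name, clf in [
    ("decision tree", DecisionTreeClassifier(random_state=0)),
    ("random forest", RandomForestClassifier(n_estimators=100, random_state=0)),
]:
    # cross_val_score with cv=5 performs stratified 5-fold cross-validation.
    scores = cross_val_score(clf, X, y, cv=5)
    results[name] = scores.mean()
    print(f"{name}: mean accuracy {scores.mean():.3f}")
```

With only 14 instances each fold holds just two or three examples, so accuracy varies widely between runs; this mirrors why the paper reports cross-validated rather than single-split results.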


  • J. R. Quinlan, "C4.5: Programs for Machine Learning", San Mateo, CA: Morgan Kaufmann Publishers, 1993.
  • L. Breiman, "Random Forests", Machine Learning, Vol. 45(1), pp. 5-32, 2001.
  • F. Esposito, D. Malerba, and G. Semeraro, "A Comparative Analysis of Methods for Pruning Decision Trees", IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 19(5), pp. 476-491, 1997.
  • J. Han and M. Kamber, "Data Mining: Concepts and Techniques", Morgan Kaufmann Publishers, 2004.
  • WEKA: http://www.cs.waikato.ac.nz/ml/weka.
  • T. K. Ho, "The Random Subspace Method for Constructing Decision Forests", IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 20(8), pp. 832-844, 1998.
  • Random Forests by Leo Breiman and Adele Cutler: http://www.stat.berkeley.edu/~breiman/RandomForests/cc_home.htm.
  • G. Biau, L. Devroye, and G. Lugosi, "Consistency of Random Forests and Other Averaging Classifiers", Journal of Machine Learning Research, 2008.
  • J. R. Quinlan, "Induction of Decision Trees", Machine Learning, Vol. 1, pp. 81-106, 1986.
  • F. Livingston, "Implementation of Breiman's Random Forest Machine Learning Algorithm", Machine Learning Journal, 2008.
  • J. R. Quinlan, "Simplifying Decision Trees", International Journal of Human-Computer Studies, Vol. 51, pp. 497-510, 1999.
  • N. Landwehr, M. Hall, and E. Frank, "Logistic Model Trees", Machine Learning, Vol. 59(1-2), pp. 161-205, 2005.
  • N. Lavesson and P. Davidsson, "Multi-dimensional Measures Function for Classifier Performance", 2nd IEEE International Conference on Intelligent Systems, pp. 508-513, 2004.