
Radial Basis Function (RBF) Neural Network Classification based on Consistency Evaluation Measure

International Journal of Computer Applications
© 2012 by IJCA Journal
Volume 54 - Number 15
Year of Publication: 2012
Authors:
Aye Mya Thandar
Myo Kay Khine
DOI: 10.5120/8642-2463

Aye Mya Thandar and Myo Kay Khine. Radial Basis Function (RBF) Neural Network Classification based on Consistency Evaluation Measure. International Journal of Computer Applications 54(15):20-23, September 2012. BibTeX:

@article{key:article,
	author = {Aye Mya Thandar and Myo Kay Khine},
	title = {Radial Basis Function (RBF) Neural Network Classification based on Consistency Evaluation Measure},
	journal = {International Journal of Computer Applications},
	year = {2012},
	volume = {54},
	number = {15},
	pages = {20-23},
	month = {September}
}

Abstract

Many researchers have applied artificial neural networks to clinical diagnosis, image analysis, signal analysis and interpretation, and various other classification problems. Among artificial neural networks, the RBF neural network has a single hidden layer and is used to classify complex problems, whereas an MLP may have one or more hidden layers. Feature selection has become an important preprocessing step for improving training performance and accuracy before classification, and consistency-based feature selection is an important category of feature selection research. This paper presents RBF neural network classification based on a consistency measure for medical datasets. Medical datasets contain irrelevant features, and removing these unnecessary features makes the RBF network easier to train. The paper therefore demonstrates higher accuracy, better network performance, and lower time complexity when the RBF classifier is combined with consistency-based feature selection.
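The pipeline the abstract describes — score feature subsets by an inconsistency rate, drop features the measure deems irrelevant, then train a single-hidden-layer RBF network on what remains — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the names `inconsistency_rate`, `consistency_select`, and `RBFNet` are invented for the example, the feature search is a simple greedy backward elimination, and the output weights are fit by ordinary least squares.

```python
import numpy as np
from collections import Counter, defaultdict

def inconsistency_rate(X, y):
    """Fraction of samples that disagree with the majority class of the
    group sharing their exact (discretized) feature pattern."""
    groups = defaultdict(list)
    for pattern, label in zip(map(tuple, X), y):
        groups[pattern].append(label)
    incon = sum(len(labels) - max(Counter(labels).values())
                for labels in groups.values())
    return incon / len(y)

def consistency_select(X, y):
    """Greedy backward elimination: drop a feature whenever removing it
    does not raise the inconsistency rate above the full-set baseline."""
    base = inconsistency_rate(X, y)
    keep = list(range(X.shape[1]))
    for f in range(X.shape[1]):
        trial = [g for g in keep if g != f]
        if trial and inconsistency_rate(X[:, trial], y) <= base:
            keep = trial
    return keep

class RBFNet:
    """Single-hidden-layer RBF network: Gaussian activations around centers
    sampled from the training data, linear output weights via least squares."""
    def __init__(self, n_centers=10, sigma=1.0, seed=0):
        self.n_centers, self.sigma, self.seed = n_centers, sigma, seed

    def _phi(self, X):
        # Squared distances from every sample to every center.
        d2 = ((X[:, None, :] - self.centers[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-d2 / (2.0 * self.sigma ** 2))

    def fit(self, X, y):
        rng = np.random.default_rng(self.seed)
        idx = rng.choice(len(X), size=min(self.n_centers, len(X)),
                         replace=False)
        self.centers = X[idx]
        self.classes = np.unique(y)
        targets = (y[:, None] == self.classes[None, :]).astype(float)  # one-hot
        self.W, *_ = np.linalg.lstsq(self._phi(X), targets, rcond=None)
        return self

    def predict(self, X):
        return self.classes[np.argmax(self._phi(X) @ self.W, axis=1)]
```

Continuous features would need discretizing (e.g. binning) before the consistency measure applies, since the measure groups samples by identical feature patterns; the RBF network itself is then trained on the original continuous values of the selected features.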

References

  • Antonio Arauzo-Azofra and Jose Manuel Benitez, "Consistency measures for feature selection", J Intell Inf Syst, 30:273-292, DOI 10.1007/s10844-0037-0, (2008).
  • C. N. Hsu, H. J. Huang and S. Dietrich, "The ANNIGMA-Wrapper Approach to Fast Feature Selection for Neural Nets", IEEE Transactions on Systems, Man and Cybernetics, Part B, vol. 32, no. 2, pp. 207-212, (2004).
  • Chotirat Ann Ratanamahatana and Dimitrios Gunopulos, "Scaling up the Naïve Bayesian Classifier: Using Decision Trees for Feature Selection", University of California, (2004).
  • Dr. M. G. R., Dr. Jay B. Simha, "Evaluation of Feature Selection Methods for Predictive Modeling Using Neural Networks in Credits Scoring", Int. J. Advanced Networking and Applications, Volume 02, Issue 03, pp. 714-718, (2010).
  • Huan Liu and Hiroshi Motoda, "Feature Selection: An Ever Evolving Frontier in Data Mining", JMLR: Workshop and Conference Proceedings 10:4-13, The Fourth Workshop on Feature Selection in Data Mining, (2010).
  • H. Liu, H. Motoda, and M. Dash, "A monotonic measure for optimal feature selection", In Proceedings of the European Conference on Machine Learning, (1998).
  • Z. Pawlak, Rough Sets: Theoretical Aspects of Reasoning about Data, Kluwer Academic Publishers, (1991).
  • Sultan Norman, Siti Mariyam Shamsuddin, and Aboul Ella Hassanien, "Hybrid Learning Enhancement of RBF Network with Particle Swarm Optimization", Foundations of Comput. Intel. Vol. 1, SCI 201, pp. 381-397, Springer-Verlag Berlin Heidelberg, (2009).
  • Shifei Ding, Xinzheng Xu, Hong Zhu, "Studies on Optimization Algorithms for Some Artificial Neural Networks Based on Genetic Algorithm (GA)", Journal of Computers, Vol. 6, No. 5, May (2011).
  • K. Shin and X. M. Xu, "Consistency-based feature selection", In 13th International Conference on Knowledge-Based and Intelligent Information & Engineering Systems, (2009).
  • Anderson, J. A. (2003). An Introduction to Neural Networks. Prentice Hall.
  • Cheng, B. and Titterington, D. M. (1994). Neural networks: A review from a statistical perspective. Statistical Science, 9, 2-54.