Research Article

Combining Akaike’s Information Criterion (AIC) and the Golden-Section Search Technique to find Optimal Numbers of K-Nearest Neighbors

by Asha Gowda Karegowda, M.A.Jayaram, A.S. Manjunath
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 2 - Number 1
Year of Publication: 2010
Authors: Asha Gowda Karegowda, M.A.Jayaram, A.S. Manjunath
DOI: 10.5120/609-859

Asha Gowda Karegowda, M.A.Jayaram, A.S. Manjunath. Combining Akaike’s Information Criterion (AIC) and the Golden-Section Search Technique to find Optimal Numbers of K-Nearest Neighbors. International Journal of Computer Applications. 2, 1 (May 2010), 80-87. DOI=10.5120/609-859

@article{ 10.5120/609-859,
author = { Asha Gowda Karegowda, M.A.Jayaram, A.S. Manjunath },
title = { Combining Akaike’s Information Criterion (AIC) and the Golden-Section Search Technique to find Optimal Numbers of K-Nearest Neighbors },
journal = { International Journal of Computer Applications },
issue_date = { May 2010 },
volume = { 2 },
number = { 1 },
month = { May },
year = { 2010 },
issn = { 0975-8887 },
pages = { 80-87 },
numpages = {9},
url = { https://ijcaonline.org/archives/volume2/number1/609-859/ },
doi = { 10.5120/609-859 },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Journal Article
%A Asha Gowda Karegowda
%A M.A.Jayaram
%A A.S. Manjunath
%T Combining Akaike’s Information Criterion (AIC) and the Golden-Section Search Technique to find Optimal Numbers of K-Nearest Neighbors
%J International Journal of Computer Applications
%@ 0975-8887
%V 2
%N 1
%P 80-87
%D 2010
%I Foundation of Computer Science (FCS), NY, USA
Abstract

K-nearest neighbor (KNN) is a well-established classification tool, and classification is one of the foremost machine-learning tasks in the field of medical data mining. However, one of the most complicated tasks in developing a KNN classifier is determining the optimal number of nearest neighbors, which is usually obtained by repeated experiments over different values of K until the minimum error rate is achieved. This paper describes a novel approach to finding the optimal number of nearest neighbors for the KNN classifier by combining Akaike’s information criterion (AIC) and the golden-section search technique. The optimal model so developed was used for categorization of a variety of medical data garnered from the UC Irvine Machine Learning Repository.
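The abstract only outlines the procedure; the short Python sketch below illustrates the general idea under stated assumptions. It is not the authors' exact formulation: the AIC-style score (built from cross-validated class probabilities, with n/K used as a rough effective-parameter count), the scikit-learn KNeighborsClassifier, and the breast-cancer dataset are illustrative assumptions, and the golden-section loop is simplified to round candidate points to integer K.

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_predict
from sklearn.neighbors import KNeighborsClassifier

def aic_for_k(k, X, y):
    """AIC-style score for a KNN classifier with k neighbors.

    Uses -2 * cross-validated log-likelihood plus a penalty term; n/k is
    an illustrative stand-in for the effective number of parameters.
    """
    model = KNeighborsClassifier(n_neighbors=k)
    proba = cross_val_predict(model, X, y, cv=5, method="predict_proba")
    eps = 1e-12
    log_lik = np.sum(np.log(proba[np.arange(len(y)), y] + eps))
    effective_params = len(y) / k
    return -2.0 * log_lik + 2.0 * effective_params

def golden_section_k(X, y, k_low=1, k_high=50, tol=1):
    """Golden-section search for the integer k minimizing aic_for_k.

    Simplified sketch: both interior points are re-evaluated each
    iteration, and the criterion is assumed roughly unimodal in k.
    """
    phi = (np.sqrt(5) - 1) / 2          # inverse golden ratio, ~0.618
    a, b = k_low, k_high
    while b - a > tol:
        k1 = int(round(b - phi * (b - a)))   # lower interior point
        k2 = int(round(a + phi * (b - a)))   # upper interior point
        if aic_for_k(k1, X, y) < aic_for_k(k2, X, y):
            b = k2                            # minimum lies in [a, k2]
        else:
            a = k1                            # minimum lies in [k1, b]
    return min(range(a, b + 1), key=lambda k: aic_for_k(k, X, y))

if __name__ == "__main__":
    X, y = load_breast_cancer(return_X_y=True)   # a UCI-style medical dataset
    print("AIC-selected number of neighbors:", golden_section_k(X, y))

Because each golden-section iteration shrinks the search interval by the inverse golden ratio, the criterion is evaluated for far fewer values of K than an exhaustive sweep, provided the AIC curve is approximately unimodal in K.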

References
  1. Akaike, H. (1974). A New Look at the Statistical Model Identification. IEEE Transactions on Automatic Control, AC-19, 716-723.
  2. Akaike, H. (1973). Information theory as an extension of the maximum likelihood principle. Second International Symposium on Information Theory, 267-281.
  3. Asha Gowda Karegowda and M. A. Jayaram. (2009). Cascading GA & CFS for feature subset selection in medical data mining. IEEE International Advance Computing Conference, Patiala, India.
  4. Goldberg, D. (1989). Genetic Algorithms in Search, Optimization, and Machine Learning. Addison-Wesley.
  5. Han, J. and Kamber, M. (2001). Data Mining: Concepts and Techniques. San Francisco: Morgan Kaufmann Publishers.
  6. http://en.wikipedia.org/wiki/Golden_section_search
  7. http://www1.ics.uci.edu/~mlearn/MLSummary.html
  8. Ren, Liqun and Zhao, Zhiye. (2002). An optimal neural network and concrete strength modeling. Advances in Engineering Software, 33, 117-130.
Index Terms

Computer Science
Information Sciences

Keywords

Medical data mining, K-nearest neighbor (KNN), Akaike’s information criterion (AIC), Golden-section ratio