Research Article

Improved KNN with Feedback Support

by Shubham Mishra, Harshali Patil
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 177 - Number 1
Year of Publication: 2017
Authors: Shubham Mishra, Harshali Patil
10.5120/ijca2017914732

Shubham Mishra, Harshali Patil. Improved KNN with Feedback Support. International Journal of Computer Applications 177, 1 (Nov 2017), 1-3. DOI=10.5120/ijca2017914732

@article{ 10.5120/ijca2017914732,
author = { Shubham Mishra, Harshali Patil },
title = { Improved KNN with Feedback Support },
journal = { International Journal of Computer Applications },
issue_date = { Nov 2017 },
volume = { 177 },
number = { 1 },
month = { Nov },
year = { 2017 },
issn = { 0975-8887 },
pages = { 1-3 },
numpages = { 3 },
url = { https://ijcaonline.org/archives/volume177/number1/28587-2017914732/ },
doi = { 10.5120/ijca2017914732 },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
Abstract

This paper introduces a new method for enhancing the performance of the K-Nearest Neighbor algorithm, which classifies new data using its K nearest neighbors. The new classification method is called Improved K-Nearest Neighbor (IKNN). Inspired by the traditional KNN algorithm, its main idea is to provide feedback: each iteration also takes the results of previous classifications into account. In addition to this feedback mechanism, the distance formula is modified. The method initializes a weight vector over the class labels, and in each iteration this weight vector plays a major role in classifying the data. Experiments show an improvement in the accuracy of the IKNN algorithm.
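The abstract's idea of a class-label weight vector updated by feedback can be sketched in plain Python. This is an illustrative reconstruction, not the paper's exact method: the feedback update rule, the learning-rate parameter `lr`, and the inverse-distance vote weighting are assumptions, since the paper's formulas are not given in the abstract.

```python
import math

def iknn_predict(train_X, train_y, queries, k=3, lr=0.1):
    """Sketch of a feedback-weighted KNN ("IKNN") classifier.

    Each class label carries a weight, initialized to 1.0. A query is
    classified by weighting each neighbor's vote by its class weight
    divided by its distance. After each prediction the winning class's
    weight is nudged upward, so earlier classifications feed back into
    later ones, as the abstract describes.
    """
    classes = set(train_y)
    weights = {c: 1.0 for c in classes}  # class-label weight vector
    preds = []
    for q in queries:
        # indices of the k nearest training points (Euclidean distance)
        nn = sorted(range(len(train_X)),
                    key=lambda i: math.dist(train_X[i], q))[:k]
        votes = {c: 0.0 for c in classes}
        for i in nn:
            d = math.dist(train_X[i], q)
            votes[train_y[i]] += weights[train_y[i]] / (d + 1e-9)
        winner = max(votes, key=votes.get)
        weights[winner] += lr  # feedback: reinforce the predicted class
        preds.append(winner)
    return preds
```

With two well-separated clusters, e.g. class 0 near the origin and class 1 near (5, 5), queries close to either cluster are assigned the corresponding label, and each prediction slightly raises that class's weight for subsequent queries.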

Index Terms

Computer Science
Information Sciences

Keywords

IKNN, KNN, Classification, Improved K-Nearest Neighbor, Feedback