Research Article

Dimensionality Reduction: An Effective Technique for Feature Selection

by Swati A Sonawale, Roshani Ade
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 117 - Number 3
Year of Publication: 2015
Authors: Swati A Sonawale, Roshani Ade
10.5120/20535-2893

Swati A Sonawale and Roshani Ade. Dimensionality Reduction: An Effective Technique for Feature Selection. International Journal of Computer Applications 117, 3 (May 2015), 18–23. DOI=10.5120/20535-2893

@article{10.5120/20535-2893,
  author     = {Swati A Sonawale and Roshani Ade},
  title      = {Dimensionality Reduction: An Effective Technique for Feature Selection},
  journal    = {International Journal of Computer Applications},
  issue_date = {May 2015},
  volume     = {117},
  number     = {3},
  month      = {May},
  year       = {2015},
  issn       = {0975-8887},
  pages      = {18-23},
  numpages   = {6},
  url        = {https://ijcaonline.org/archives/volume117/number3/20535-2893/},
  doi        = {10.5120/20535-2893},
  publisher  = {Foundation of Computer Science (FCS), NY, USA},
  address    = {New York, USA}
}
%0 Journal Article
%A Swati A Sonawale
%A Roshani Ade
%T Dimensionality Reduction: An Effective Technique for Feature Selection
%J International Journal of Computer Applications
%@ 0975-8887
%V 117
%N 3
%P 18-23
%D 2015
%I Foundation of Computer Science (FCS), NY, USA
Abstract

Dimensionality reduction is a significant technique for knowledge discovery. Real-world datasets are frequently high-dimensional and large in size; using such a dataset directly for classification can produce misleading results and demands more in terms of both storage and processing capability. Many of the features present are redundant or inconsistent and degrade performance, so these duplicate and inconsistent features must be removed to increase the effectiveness of classification. In this research we introduce a new method for dealing with the problem of dimensionality reduction. By removing irrelevant and redundant features, or by effectively merging the original features into a smaller set of features with more discriminative power, dimensionality reduction methods deliver the immediate benefits of faster data mining algorithms, better performance, and a less ambiguous data model.
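
The abstract distinguishes two routes to a lower-dimensional dataset: selecting a subset of the original features, and extracting a smaller set of merged features. As a minimal illustrative sketch (not the method proposed in the paper), the snippet below contrasts the two families using scikit-learn; the digits dataset, the mutual-information score function, and the choice of 10 output dimensions are all assumptions made here for demonstration.

```python
# Illustrative sketch only: contrasts feature selection (keep a subset of the
# original features) with feature extraction (merge features into new ones).
# The dataset and the target dimensionality of 10 are arbitrary assumptions.
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest, mutual_info_classif

X, y = load_digits(return_X_y=True)  # 64 pixel features per sample

# Feature selection: keep the 10 original features most informative about y,
# discarding irrelevant/redundant ones.
selector = SelectKBest(mutual_info_classif, k=10)
X_selected = selector.fit_transform(X, y)
print("selected:", X_selected.shape)  # (n_samples, 10)

# Feature extraction: merge all 64 features into 10 new components (PCA),
# each a linear combination of the originals.
pca = PCA(n_components=10)
X_extracted = pca.fit_transform(X)
print("extracted:", X_extracted.shape,
      "explained variance:", round(float(pca.explained_variance_ratio_.sum()), 3))
```

Either route shrinks storage and speeds up a downstream classifier; selection keeps the surviving features interpretable, while extraction can pack more discriminative power into fewer dimensions.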

References
  1. Z. Zhao and H. Liu, Spectral Feature Selection for Data Mining, USA: Chapman & Hall/CRC, 2012.
  2. I. T. Jolliffe, Principal Component Analysis, USA: Springer, 2002.
  3. X. He and P. Niyogi, "Locality preserving projections," in Proc. NIPS, 2004.
  4. M. Belkin and P. Niyogi, "Laplacian eigenmaps and spectral techniques for embedding and clustering," in Proc. NIPS, 2002.
  5. I. Guyon and A. Elisseeff, "An introduction to variable and feature selection," J. Mach. Learn. Res., vol. 3, pp. 1157–1182, Mar. 2003.
  6. J. G. Dy and C. E. Brodley, "Feature selection for unsupervised learning," J. Mach. Learn. Res., vol. 5, pp. 845–889, Aug. 2004.
  7. R. O. Duda, P. E. Hart, and D. G. Stork, Pattern Classification, New York, NY, USA: Wiley Interscience, 2000.
  8. M. Robnik-Sikonja and I. Kononenko, "Theoretical and empirical analysis of Relief and ReliefF," Mach. Learn., vol. 53, no. 1–2, pp. 23–69, 2003.
  9. L. Yu and H. Liu, "Efficient feature selection via analysis of relevance and redundancy," J. Mach. Learn. Res., vol. 5, pp. 1205–1224, Oct. 2004.
  10. Z. Zhao and H. Liu, "Spectral feature selection for supervised and unsupervised learning," in Proc. 24th Int. Conf. Mach. Learn., Corvallis, OR, USA, 2007.
  11. X. He, D. Cai, and P. Niyogi, "Laplacian score for feature selection," in Proc. NIPS, Vancouver, Canada, 2005.
  12. L. Song, A. Smola, A. Gretton, J. Bedo, and K. Borgwardt, "Feature selection via dependence maximization," J. Mach. Learn. Res., vol. 13, no. 1, pp. 1393–1434, Jan. 2012.
  13. Z. Zhao and H. Liu, "Semi-supervised feature selection via spectral analysis," in Proc. SIAM Int. Conf. Data Mining, Tempe, AZ, USA, 2007, pp. 641–646.
  14. D. Zhang, Z. Zhou, and S. Chen, "Semi-supervised dimensionality reduction," in Proc. SIAM Int. Conf. Data Mining, Pittsburgh, PA, USA, 2007.
  15. O. Chapelle, B. Schölkopf, and A. Zien, Eds., Semi-Supervised Learning, Cambridge, MA: MIT Press, 2006.
  16. S. Basu, M. Bilenko, and R. Mooney, "A probabilistic framework for semi-supervised clustering," in KDD'04, Seattle, WA, 2004, pp. 59–68.
  17. U. Brefeld, T. Gärtner, T. Scheffer, and S. Wrobel, "Efficient co-regularized least squares regression," in ICML'06, Pittsburgh, PA, 2006, pp. 137–144.
  18. K. Wagstaff, C. Cardie, S. Rogers, and S. Schroedl, "Constrained k-means clustering with background knowledge," in ICML'01, Williamstown, MA, 2001, pp. 577–584.
  19. T. Zhang and R. K. Ando, "Analysis of spectral kernel design based semi-supervised learning," in NIPS 18, MIT Press, Cambridge, MA, 2006, pp. 1601–1608.
  20. Z.-H. Zhou and M. Li, "Semi-supervised learning with co-training," in IJCAI'05, Edinburgh, Scotland, 2005.
  21. X. Zhu, "Semi-supervised learning literature survey," Tech. Rep. 1530, Department of Computer Sciences, University of Wisconsin at Madison, Madison, WI, 2006. http://www.cs.wisc.edu/~jerryzhu/pub/ssl_survey.pdf
  22. R. O. Duda, P. E. Hart, and D. G. Stork. Pattern Classification. John Wiley & Sons, New York, 2nd edition, 2001.
  23. J. G. Dy et al. Unsupervised feature selection applied to content-based retrieval of lung images. IEEE Transactions on Pattern Analysis and Machine Intelligence, 25(3):373–378, 2003.
  24. B. Efron, T. Hastie, I. Johnstone, and R. Tibshirani. Least angle regression. Annals of Statistics, 32(2):407–499, 2004.
  25. G. Forman. An extensive empirical study of feature selection metrics for text classification. Journal of Machine Learning Research, 3:1289–1305, 2003.
  26. A. Gretton, O. Bousquet, A. Smola, and B. Schölkopf. Measuring statistical dependence with Hilbert-Schmidt norms. In Proceedings of ALT, 2005.
  27. I. Guyon and A. Elisseeff. An introduction to variable and feature selection. Journal of Machine Learning Research, 3:1157–1182, 2003.
  28. X. He, D. Cai, and P. Niyogi. Laplacian score for feature selection. In Advances in Neural Information Processing Systems 18, 2005.
  29. F. Nie, S. Xiang, Y. Jia, C. Zhang, and S. Yan. Trace ratio criterion for feature selection. In Proceedings of Conference on Artificial Intelligence (AAAI), 2008.
  30. Y. Saeys et al. A review of feature selection techniques in bioinformatics. Bioinformatics, 23(19):2507–2517, 2007.
  31. M. Robnik-Sikonja and I. Kononenko. Theoretical and empirical analysis of Relief and ReliefF. Machine Learning, 53:23–69, 2003.
  32. L. Song, A. Smola, A. Gretton, J. Bedo, and K. Borgwardt. Feature selection via dependence maximization. Journal of Machine Learning Research, 13:1393–1434, 2012.
  33. L. Yu and H. Liu. Feature selection for high-dimensional data: A fast correlation-based filter solution. In Proceedings of the ICML, 2003.
  34. Z. Zhao and H. Liu. Spectral feature selection for supervised and unsupervised learning. In Proceedings of the ICML, 2007.
  35. H. Li, T. Jiang, and K. Zhang, "Efficient and Robust Feature Extraction by Maximum Margin Criterion," Proc. Conf. Advances in Neural Information Processing Systems, 2004.
  36. Jun Yan, Benyu Zhang, Ning Liu, Shuicheng Yan, Qiansheng Cheng, Weiguo Fan, Qiang Yang, Wensi Xi, and Zheng Chen, "Effective and Efficient Dimensionality Reduction for Large-Scale and Streaming Data Preprocessing," Mar. 2006.
Index Terms

Computer Science
Information Sciences

Keywords

Dimension reduction, Fuzzy ARTMAP, Feature selection, Feature extraction, Supervised and unsupervised techniques, Semi-supervised techniques.