Research Article

Incremental Feature Subsetting useful for Big Feature Space Problems

by P. Nagabhushan, Preeti Mahadev
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 97 - Number 12
Year of Publication: 2014
Authors: P. Nagabhushan, Preeti Mahadev
DOI: 10.5120/17057-7392

P. Nagabhushan, Preeti Mahadev. Incremental Feature Subsetting useful for Big Feature Space Problems. International Journal of Computer Applications. 97, 12 (July 2014), 8-17. DOI=10.5120/17057-7392

@article{ 10.5120/17057-7392,
author = { P. Nagabhushan and Preeti Mahadev },
title = { Incremental Feature Subsetting useful for Big Feature Space Problems },
journal = { International Journal of Computer Applications },
issue_date = { July 2014 },
volume = { 97 },
number = { 12 },
month = { July },
year = { 2014 },
issn = { 0975-8887 },
pages = { 8-17 },
numpages = {9},
url = { https://ijcaonline.org/archives/volume97/number12/17057-7392/ },
doi = { 10.5120/17057-7392 },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Journal Article
%A P. Nagabhushan
%A Preeti Mahadev
%T Incremental Feature Subsetting useful for Big Feature Space Problems
%J International Journal of Computer Applications
%@ 0975-8887
%V 97
%N 12
%P 8-17
%D 2014
%I Foundation of Computer Science (FCS), NY, USA
Abstract

Dimensionality reduction is, in general, a means of overcoming the curse of dimensionality. When all features are available together, it is a way to extract knowledge from a population in a big feature space. Dimensionality reduction becomes more intriguing, however, when updates to the feature space arrive as a stream, and the question arises whether one could reduce the feature space as and when features become available instead of waiting for all of them to arrive. This would not only enable the creation of knowledge that incrementally aligns with the incremental access to the feature space, but would also facilitate local decision making at every incremental stage. While supporting these local decisions, it would eventually generate the optimal reduced feature space. Moving in this direction, the possibility of implementing feature subsetting in an incremental framework is explored. The incremental streaming could be due to the temporal arrival of features or due to features arriving from distributed sources. In this paper, the adoption of an incremental dimensionality reduction model is also explored to observe the reduction in complexity of working with a big feature space. The speciality of the proposed incremental framework is that dimensionality reduction is performed to obtain a cumulative reduced feature space at every stage without having to look back at the earlier features.
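The abstract's core idea can be sketched in a few lines of code. The paper's actual subsetting criterion is not reproduced here; this hypothetical sketch substitutes a simple greedy correlation filter as a stand-in, and the function names (`select_subset`, `incremental_subset`) are illustrative only. What it does show is the incremental framework itself: each arriving feature batch is reduced against the cumulative retained set, and features discarded at an earlier stage are never revisited.

```python
import numpy as np

def select_subset(X, corr_threshold=0.95):
    """Greedy correlation filter (a stand-in for the paper's criterion):
    drop any feature highly correlated with an already-kept feature."""
    keep = []
    for j in range(X.shape[1]):
        redundant = any(
            abs(np.corrcoef(X[:, j], X[:, k])[0, 1]) > corr_threshold
            for k in keep
        )
        if not redundant:
            keep.append(j)
    return keep

def incremental_subset(batches, corr_threshold=0.95):
    """Process feature batches as they stream in (temporally or from
    distributed sources). At each stage only the cumulative reduced
    feature space and the new batch are examined; earlier discarded
    features are never looked at again."""
    retained = None  # cumulative reduced feature space (n_samples x k)
    for batch in batches:
        pool = batch if retained is None else np.hstack([retained, batch])
        keep = select_subset(pool, corr_threshold)
        retained = pool[:, keep]  # local decision at this stage
    return retained
```

Because the retained set is carried forward and re-screened together with each new batch, the reduced feature space stays cumulative at every stage, which is the property the abstract emphasizes.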

References
  1. Edilson Delgado-Trejos et al., Dimensionality Reduction Oriented Toward the Feature Visualization for Ischemia Detection, IEEE Transactions on Information Technology in Biomedicine, Vol. 13, No. 4, 2009.
  2. Chowdhury, F. et al., Single-pass incremental and interactive mining for weighted frequent patterns, 2012.
  3. J. Han, M. Kamber, "Data Mining: Concepts and Techniques," Third Edition, Elsevier Inc., Rajkamal Electric Press, 2011.
  4. S. Santhosh Kumar et al., "Development of an Efficient Clustering Technique for Colon Dataset," IJEIT, Vol. 1, Issue 5, May 2012.
  5. Bing Liu, Wynne Hsu, Yiming Ma, Integrating Classification and Association Rule Mining, Proceedings of KDD, AAAI, 1998.
  6. Francesco Palumbo, Alfonso Iodice D'Enza, Clustering and Dimensionality Reduction to Discover Interesting Patterns in Binary Data, Proceedings of the 32nd Annual Conference of the Gesellschaft für Klassifikation e.V., 2008.
  7. Pádraig Cunningham, Dimension Reduction, Machine Learning Techniques for Multimedia, Cognitive Technologies, pp. 91-112, 2008.
  8. Francesco Palumbo, Alfonso Iodice D'Enza, Clustering and Dimensionality Reduction to Discover Interesting Patterns in Binary Data, Proceedings of the 32nd Annual Conference of the Gesellschaft für Klassifikation e.V., 2008.
  9. http://en.wikipedia.org/wiki/Dimensionality_reduction
  10. http://www.cs.binghamton.edu/~lyu/SDM07/DR-SDM07.pdf
  11. Syed Zakir Ali, P. Nagabhushan, Pradeep Kumar R, Incremental Data Mining using Clustering: Intelligent Methods of Fusing the Knowledge During Incremental Learning via Clustering in a Distributed Environment, PhD Thesis, 2010.
  12. J. Bailey, E. Loekito, Efficient incremental mining of contrast patterns in changing data, 2010.
  13. T. Gharib et al., An efficient algorithm for incremental mining of temporal association rules, 2010.
  14. Syed Zakir Ali, P. Nagabhushan, Pradeep Kumar R, Regression based Incremental Learning through Cluster Analysis of Temporal Data, International Conference on Data Mining (DMIN), 2009.
  15. D. Dudek, RMAIN: Association rules maintenance without reruns through data, 2009.
  16. C. Hsu, Y. Huang, Incremental clustering of mixed data based on distance hierarchy, 2008.
  17. F. Masseglia, P. Poncelet, M. Teisseire, Incremental mining of sequential patterns in large databases, 2003.
  18. Zhang, M. et al., Efficient algorithms for incremental update of frequent sequences, 2002.
  19. N. Sarda, N. V. Srinivas, An adaptive algorithm for incremental mining of association rules, in: Proceedings of the 9th International Workshop on Database and Expert Systems Applications, Indian Institute of Technology, Bombay, 1998.
  20. Jieping Ye, IDR/QR: An incremental dimension reduction algorithm via QR decomposition, IEEE Transactions on Knowledge and Data Engineering, Vol. 17, Issue 9, 2005.
  21. Martin H. C. Law, Anil K. Jain, Incremental Nonlinear Dimensionality Reduction by Manifold Learning, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 28, No. 3, 2006.
  22. Sheng-Uei Guan et al., Incremental Learning with Respect to New Incoming Input Features, Neural Processing Letters 14, 2001.
  23. courses.cs.tamu.edu/rgutier/cs790_w02/l5.pdf
  24. http://dss.princeton.edu/training
  25. Guan, S. U., Li, S., Incremental learning with respect to new incoming input attributes, Neural Processing Letters 14(3):241-260, 2001.
  26. T. Wang and S. U. Guan, "Feature ordering for neural incremental attribute learning based on Fisher's linear discriminant," in Proceedings of the 5th International Conference on Intelligent Human-Machine Systems and Cybernetics (IHMSC '13), Hangzhou, China, 2013.
  27. Murat Karabatak et al., A new feature selection method based on association rules for diagnosis of erythemato-squamous diseases, Expert Systems with Applications, Vol. 36, Issue 10, 2009.
  28. Hua-Liang Wei, S. A. Billings, Feature Subset Selection and Ranking for Data Dimensionality Reduction, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 29, Issue 1, 2007.
  29. Bartosz Krawczyk, Pawel Filipczuk, Michal Wozniak, Adaptive Splitting and Selection Algorithm for Classification of Breast Cytology Images, ICCCI (1), 2012.
  30. Gulsen Taskin Kaya et al., Feature selection by high dimensional model representation and its application to remote sensing, Geoscience and Remote Sensing Symposium (IGARSS), IEEE International, 2012.
  31. http://www.ime.unicamp.br/~wanderson/Artigos/feature_selection_in_mining.pdf
  32. Minyoung Kim, Correlation-based incremental visual tracking, Pattern Recognition (The Journal of the Pattern Recognition Society), Vol. 45, Issue 3, 2012.
  33. Liu, Yu, Toward integrating feature selection algorithms for classification and clustering, IEEE Transactions on Knowledge and Data Engineering, Vol. 17, Issue 4.
  34. T. Wang, Feature Ordering for Neural Incremental Attribute Learning Based on Fisher's Linear Discriminant, Intelligent Human-Machine Systems and Cybernetics (IHMSC), 5th International Conference on, Vol. 2, 2013.
  35. Huan Liu et al., Incremental Feature Selection, Applied Intelligence, Vol. 9, Issue 3, pp. 217-230, 1998.
  36. Michal Wozniak, Bartosz Krawczyk, Combined classifier based on feature space partitioning, Int. J. Appl. Math. Comput. Sci., Vol. 22, No. 4, 2012.
  37. S. Kotsiantis, K. Patriarcheas, and M. Xenos, A combinational incremental ensemble of classifiers as a technique for predicting students' performance in distance education, 2010.
  38. D. Brauckhoff et al., Anomaly extraction in backbone networks using association rules, in Proceedings of IMC, 2009.
  39. Murat Karabatak, M. Cevdet Ince, An expert system for detection of breast cancer based on association rules and neural network, Expert Systems with Applications 36 (2009) 3465-3469, 2009.
  40. R. Agrawal, H. Mannila, R. Srikant, H. Toivonen, and A. Verkamo, Fast discovery of association rules, in U. Fayyad, G. Piatetsky-Shapiro, P. Smyth, and R. Uthurusamy, editors, Advances in Knowledge Discovery and Data Mining, pages 307-328, AAAI/MIT Press, 1996.
  41. Minyoung Kim, Correlation-based incremental visual tracking, Pattern Recognition (The Journal of the Pattern Recognition Society), Vol. 45, Issue 3, 2012.
  42. Flip Korn et al., Quantifiable data mining using PCA, VLDB Journal: Very Large Data Bases, 1997.
  43. Anil K. Jain's talk: Clustering Big Data, University of Notre Dame, Nov. 29, 2012.
  44. http://www.coffeechoiceguide.co.uk/coffee-regions.htm
  45. http://www.fusion2014.org/tutorials/t14-mtpa
  46. Yates, F.; Mather, K., "Ronald Aylmer Fisher 1890-1962," Biographical Memoirs of Fellows of the Royal Society 9: 91-129, 1963.
  47. P. Nagabhushan, An efficient method for classifying remotely sensed data (incorporating dimensionality reduction), Ph.D. thesis, University of Mysore, 1988.
  48. Rangarajan, Lalitha, and P. Nagabhushan, "Dimensionality reduction of multidimensional temporal data through regression," Pattern Recognition Letters 25.8 (2004): 899-910.
  49. http://archive.ics.uci.edu/ml/
  50. P. Nagabhushan, H. N. Meenakshi, "Target Class Supervised Feature Subsetting," International Journal of Computer Applications, Volume 91, Issue 12, 2014.
  51. http://www2.cs.uregina.ca/~dbd/cs831/notes/confusion_matrix/confusion_matrix.html
Index Terms

Computer Science
Information Sciences

Keywords

Big Feature Space, Incremental Dimensionality Reduction, Feature Selection, Optimal Feature Subset, Local Knowledge