Research Article

Semi-Supervised Feature Selection with Constraint Sets

Published in March 2017 by Prajakta Kulkarni, S. M. Kamalapur
Emerging Trends in Computing
Foundation of Computer Science USA
ETC2016 - Number 2

Prajakta Kulkarni and S. M. Kamalapur. Semi-Supervised Feature Selection with Constraint Sets. Emerging Trends in Computing, ETC2016(2), March 2017, 31-34.

@article{kulkarni2017semisupervised,
  author     = { Prajakta Kulkarni and S. M. Kamalapur },
  title      = { Semi-Supervised Feature Selection with Constraint Sets },
  journal    = { Emerging Trends in Computing },
  issue_date = { March 2017 },
  volume     = { ETC2016 },
  number     = { 2 },
  month      = { March },
  year       = { 2017 },
  issn       = { 0975-8887 },
  pages      = { 31-34 },
  numpages   = { 4 },
  url        = { /proceedings/etc2016/number2/27312-6265/ },
  publisher  = { Foundation of Computer Science (FCS), NY, USA },
  address    = { New York, USA }
}
%0 Proceeding Article
%1 Emerging Trends in Computing
%A Prajakta Kulkarni
%A S. M. Kamalapur
%T Semi-Supervised Feature Selection with Constraint Sets
%J Emerging Trends in Computing
%@ 0975-8887
%V ETC2016
%N 2
%P 31-34
%D 2017
%I International Journal of Computer Applications

Classification and recognition are crucial tasks in machine learning. An object is recognized by the features associated with it, yet only some of those features help classify it correctly. Feature selection detects such specific features: it is the process of selecting a subset of features to reduce dimensionality. Semi-supervised feature selection is difficult because labeled samples are scarce. Here a constraint-based approach is proposed to select features efficiently from semi-supervised data; it is chosen because it incorporates supervised information into the processing. In the absence of labels, features can be evaluated by their locality-preserving ability. Hence, for semi-supervised data, properties of both labeled and unlabeled data are combined to choose good features. A constraint-based Laplacian score is used to weight features. To eliminate redundant features, mutual information between features is computed and a graph-based method removes the redundant ones. Classification accuracy on different datasets is measured to evaluate the system's performance.
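The constraint-based Laplacian score described in the abstract can be sketched as follows. This is a minimal illustration only, not the paper's exact formulation: the way constraints are folded into the similarity graph, the RBF kernel width `sigma`, and the function name are all assumptions. As in the classic Laplacian score, a lower score indicates a feature that better preserves locality (and, here, better respects the constraints).

```python
import numpy as np

def constrained_laplacian_score(X, must_link, cannot_link, sigma=1.0):
    """Score each feature of X (n samples x d features) using a similarity
    graph over all samples, adjusted by pairwise constraints from the
    labeled subset. Lower score = better feature. Hypothetical sketch."""
    n, d = X.shape
    # RBF similarity graph over all (labeled + unlabeled) samples.
    dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)
    S = np.exp(-dists / (2.0 * sigma ** 2))
    # Inject supervised information: one simple encoding (assumption) is to
    # boost similarity for must-link pairs and zero it for cannot-link pairs.
    for i, j in must_link:
        S[i, j] = S[j, i] = 1.0
    for i, j in cannot_link:
        S[i, j] = S[j, i] = 0.0
    deg = S.sum(axis=1)          # degree vector (diagonal of D)
    L = np.diag(deg) - S         # graph Laplacian
    scores = np.empty(d)
    for r in range(d):
        f = X[:, r].astype(float)
        # Degree-weighted centering, as in the classic Laplacian score.
        f = f - (f @ deg) / deg.sum()
        # f^T L f measures local variation; f^T D f normalizes by variance.
        scores[r] = (f @ L @ f) / max(f @ (deg * f), 1e-12)
    return scores
```

A feature that is nearly constant within each neighborhood (and across must-link pairs) gets a small numerator relative to its overall variance, hence a low score; redundancy elimination via mutual information would then run on the top-ranked features as a separate step.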

Index Terms

Computer Science
Information Sciences


Constraints, Feature Selection, Redundant, Relevant, Semi-supervised