Research Article

JMIM: A Feature Selection Technique using Joint Mutual Information Maximization Approach

Published in March 2017 by Saner Rajlakshmi Sanjay and S. S. Sane
Emerging Trends in Computing
Foundation of Computer Science USA
ETC2016 - Number 4
March 2017
Authors: Saner Rajlakshmi Sanjay, S. S. Sane

Saner Rajlakshmi Sanjay, S. S. Sane. JMIM: A Feature Selection Technique using Joint Mutual Information Maximization Approach. Emerging Trends in Computing. ETC2016, 4 (March 2017), 5-10.

@article{ saner2017jmim,
author = { Saner Rajlakshmi Sanjay, S. S. Sane },
title = { JMIM: A Feature Selection Technique using Joint Mutual Information Maximization Approach },
journal = { Emerging Trends in Computing },
issue_date = { March 2017 },
volume = { ETC2016 },
number = { 4 },
month = { March },
year = { 2017 },
issn = { 0975-8887 },
pages = { 5-10 },
numpages = { 6 },
url = { /proceedings/etc2016/number4/27321-6274/ },
publisher = { Foundation of Computer Science (FCS), NY, USA },
address = { New York, USA }
}
Abstract

Feature selection is generally used to reduce the size of a dataset, to overcome the problem of overfitting, and to increase classifier efficiency. We propose the use of JMIM, i.e. the Joint Mutual Information Maximization algorithm, to select features and to build a feature subset efficiently. The algorithm is based on joint mutual information and follows a maximum-of-the-minimum strategy. In this paper, our aim is to apply the JMIM algorithm and to compare the resulting outcomes with the problems previously highlighted in existing feature selection systems. By processing the feature subset selection in parallel, we expect the overall execution time to be reduced. As a part of our contribution, the selection process is distributed over different clouds, which helps in execution and triggers the process.
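The maximum-of-the-minimum criterion mentioned in the abstract can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the authors' implementation: it assumes discrete (pre-discretized) features, estimates mutual information from raw frequency counts, and greedily adds the candidate feature f that maximizes the minimum, over already-selected features s, of the joint mutual information I((f, s); C), following the JMIM criterion of Bennasar et al. [1]. The function names `mutual_information` and `jmim_select` are hypothetical.

```python
from collections import Counter
from math import log2

def mutual_information(x, y):
    """Estimate I(X; Y) in bits for two discrete sequences from counts."""
    n = len(x)
    px, py, pxy = Counter(x), Counter(y), Counter(zip(x, y))
    return sum((c / n) * log2((c / n) / ((px[a] / n) * (py[b] / n)))
               for (a, b), c in pxy.items())

def jmim_select(features, labels, k):
    """Greedy JMIM: features is a list of discrete columns.

    Seed with the single feature of highest I(f; C); then repeatedly add
    the candidate f maximizing min over selected s of I((f, s); C).
    Returns the selected column indices in selection order.
    """
    remaining = list(range(len(features)))
    first = max(remaining, key=lambda i: mutual_information(features[i], labels))
    selected = [first]
    remaining.remove(first)
    while len(selected) < k and remaining:
        best = max(remaining, key=lambda i: min(
            mutual_information(list(zip(features[i], features[s])), labels)
            for s in selected))
        selected.append(best)
        remaining.remove(best)
    return selected
```

For example, given a redundant copy of an informative feature and a complementary one, the criterion prefers the complementary feature, since pairing it with the already-selected feature yields higher joint mutual information with the class; ties are broken by the candidate's position in the list.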

References
  1. Mohamed Bennasar, Yulia Hicks and Rossitza Setchi, Feature Selection using Joint Mutual Information Maximisation, Expert Systems with Applications, Volume 42, Issue 22, 1 December 2015.
  2. Roberto Battiti, Using Mutual Information for Selecting Features in Supervised Neural Net Learning, IEEE Transactions on Neural Networks, Vol. 5, No. 4, July 1994.
  3. James Dougherty, Ron Kohavi and Mehran Sahami, Supervised and Unsupervised Discretization of Continuous Features, 1995.
  4. Anil K. Jain, Robert P. W. Duin, and Jianchang Mao, Statistical Pattern Recognition: A Review, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 22, No. 1, January 2000.
  5. Isabelle Guyon and André Elisseeff, An Introduction to Variable and Feature Selection, Journal of Machine Learning Research 3 (2003) 1157-1182.
  6. Chris Ding and Hanchuan Peng, Minimum Redundancy Feature Selection from Microarray Gene Expression Data, Proceedings of the Computational Systems Bioinformatics Conference (CSB'03), 2003.
  7. François Fleuret, Fast Binary Feature Selection with Conditional Mutual Information, Journal of Machine Learning Research 5 (2004) 1531-1555.
  8. Ali El Akadi, Abdeljalil El Ouardighi, and Driss Aboutajdine, A Powerful Feature Selection Approach Based on Mutual Information, IJCSNS International Journal of Computer Science and Network Security, Vol. 8, No. 4, April 2008.
  9. Patrick Emmanuel Meyer, Colas Schretter, and Gianluca Bontempi, Information-Theoretic Feature Selection in Microarray Data Using Variable Complementarity, IEEE Journal of Selected Topics in Signal Processing, Vol. 2, No. 3, June 2008.
  10. Asha Gowda Karegowda, M. A. Jayaram, and A. S. Manjunath, Feature Subset Selection Problem using Wrapper Approach in Supervised Learning, International Journal of Computer Applications (0975-8887), 2010.
  11. Harold W. Kuhn, The Hungarian Method for the Assignment Problem, 2010.
  12. Hongrong Cheng, Zhiguang Qin, Chaosheng Feng, Yong Wang, and Fagen Li, Conditional Mutual Information-Based Feature Selection Analyzing for Synergy and Redundancy, ETRI Journal, Volume 33, Number 2, April 2011.
  13. Gavin Brown, Adam Pocock, Ming-Jie Zhao and Mikel Luján, Conditional Likelihood Maximisation: A Unifying Framework for Information Theoretic Feature Selection, Journal of Machine Learning Research 13 (2012) 27-66.
  14. Verónica Bolón-Canedo, Noelia Sánchez-Maroño and Amparo Alonso-Betanzos, A Review of Feature Selection Methods on Synthetic Data, DOI 10.1007/s10115-012-0487-8, 2013.
  15. Girish Chandrashekar, Ferat Sahin, A Survey on Feature Selection Methods, Computers and Electrical Engineering 40 (2014) 16-28.
  16. Cecille Freeman, Dana Kulić, Otman Basir, An Evaluation of Classifier-Specific Filter Measure Performance for Feature Selection, Pattern Recognition, 2014.
  17. https://archive.ics.uci.edu/ml/machine-learning-databases/
Index Terms

Computer Science
Information Sciences

Keywords

Mutual Information, Feature Selection, Classification, Joint Mutual Information, Parallel Computing.