Feature Selection using Modified Particle Swarm Optimization

International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Year of Publication: 2017
Khushboo Jain, Anuradha Purohit

Khushboo Jain and Anuradha Purohit. Feature Selection using Modified Particle Swarm Optimization. International Journal of Computer Applications 161(7):8-12, March 2017.

BibTeX:

@article{ijca2017913229,
	author = {Khushboo Jain and Anuradha Purohit},
	title = {Feature Selection using Modified Particle Swarm Optimization},
	journal = {International Journal of Computer Applications},
	issue_date = {March 2017},
	volume = {161},
	number = {7},
	month = {Mar},
	year = {2017},
	issn = {0975-8887},
	pages = {8-12},
	numpages = {5},
	url = {},
	doi = {10.5120/ijca2017913229},
	publisher = {Foundation of Computer Science (FCS), NY, USA},
	address = {New York, USA}
}


Feature selection is the process of selecting relevant features from large datasets in order to improve the performance of classification systems. Various approaches are used for feature selection, such as soft computing and hill climbing. Particle Swarm Optimization (PSO) is a popular soft computing technique for feature selection owing to its search ability, simplicity, and low computational cost. However, the main problem with PSO is premature convergence, which in turn degrades classification performance. In this paper, a modified Particle Swarm Optimization is proposed for feature selection. To handle the problem of premature convergence, a flipping operator is introduced before the velocity and position of each particle are updated. The fitness of each particle is computed using a Support Vector Machine based fitness function. To establish the effectiveness of the proposed approach, it is tested on various benchmark datasets such as wine, zoo, and sonar. Results obtained on these datasets are compared with the standard approach, and satisfactory improvements are observed.
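The idea in the abstract can be sketched as a binary PSO over feature masks with a bit-flipping step applied before each velocity/position update. The sketch below is an illustration, not the paper's exact algorithm: the flip probability, inertia/acceleration constants, and the `toy_fitness` function are all assumptions, and a simple placeholder fitness stands in for the SVM-based fitness function the paper uses.

```python
import math
import random

def binary_pso(fitness, n_features, n_particles=10, n_iters=30,
               flip_prob=0.1, seed=0):
    """Binary PSO feature selection with a pre-update flipping operator.

    fitness: maps a 0/1 mask (tuple) to a score to be maximized.
    flip_prob: per-bit chance of flipping before the velocity/position
    update, intended to counter premature convergence (hypothetical value).
    """
    rng = random.Random(seed)

    def sigmoid(v):
        # Squash velocity into a probability of the bit being 1.
        return 1.0 / (1.0 + math.exp(-v))

    # Random initial positions (feature masks) and velocities.
    pos = [[rng.randint(0, 1) for _ in range(n_features)] for _ in range(n_particles)]
    vel = [[rng.uniform(-1, 1) for _ in range(n_features)] for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_fit = [fitness(tuple(p)) for p in pos]
    g = max(range(n_particles), key=lambda i: pbest_fit[i])
    gbest, gbest_fit = pbest[g][:], pbest_fit[g]

    w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration constants (assumed)
    for _ in range(n_iters):
        for i in range(n_particles):
            # Flipping operator: random bit flips BEFORE the update step,
            # injecting diversity to fight premature convergence.
            for d in range(n_features):
                if rng.random() < flip_prob:
                    pos[i][d] ^= 1
            # Standard binary PSO velocity and position update.
            for d in range(n_features):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = 1 if rng.random() < sigmoid(vel[i][d]) else 0
            f = fitness(tuple(pos[i]))
            if f > pbest_fit[i]:
                pbest[i], pbest_fit[i] = pos[i][:], f
                if f > gbest_fit:
                    gbest, gbest_fit = pos[i][:], f
    return gbest, gbest_fit

# Toy fitness standing in for the SVM-based one: reward selecting the
# first three "relevant" features and penalize larger subsets.
def toy_fitness(mask):
    return sum(mask[:3]) - 0.1 * sum(mask)

best_mask, best_fit = binary_pso(toy_fitness, n_features=8)
```

In a full implementation, `toy_fitness` would be replaced by cross-validated SVM accuracy on the subset of columns selected by the mask, which is the expensive step that motivates keeping the swarm and iteration counts modest.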


  1. M. Dash and H. Liu, “Feature selection for classification,” Intelligent Data Analysis, vol. 1, no. 1–4, pp. 131–156, 1997.
  2. A. Unler and A. Murat, “A discrete particle swarm optimization method for feature selection in binary classification problems,” European Journal of Operational Research, vol. 206, no. 3, pp. 528–539, Nov. 2010.
  3. Xiaodong Zhu, Yuanning Liu, Gang Yang, Hao Dong, Sujing Wang, and Huiling Chen, “An improved particle swarm optimization for feature selection,” Journal of Bionic Engineering, vol. 8, no. 2, pp. 191–200, June 2011.
  4. Isabelle Guyon and André Elisseeff, “An introduction to variable and feature selection,” Journal of Machine Learning Research, vol. 3, pp. 1157–1182, March 2003.
  5. Jiliang Tang, Salem Alelyani, and Huan Liu, “Feature selection for classification: A review,” pp. 1–29, 2014.
  6. Stanislaw Osowski, Robert Siroic, Tomasz Markiewicz, and Krzysztof Siwek, “Application of support vector machine and genetic algorithm for improved blood cell recognition,” IEEE Transactions on Instrumentation and Measurement, vol. 58, no. 7, pp. 2159–2166, July 2009.
  7. Bing Xue, Mengjie Zhang, and Will N. Browne, “Particle swarm optimisation for feature selection in classification: Novel initialisation and updating mechanisms,” Applied Soft Computing, vol. 18, pp. 261–276, May 2014.
  8. Chung-Jui Tu et al., “Feature selection using PSO-SVM,” IAENG International Journal of Computer Science, Feb. 2007.
  9. J. Kennedy and R. C. Eberhart, “Particle swarm optimization,” in Proceedings of the 1995 IEEE International Conference on Neural Networks, vol. 4, pp. 1942–1948, IEEE Press, Piscataway, NJ, 1995.
  10. V. Vapnik, The Nature of Statistical Learning Theory, Statistics for Engineering and Information Science, Springer-Verlag, New York, NY, 1995.


Feature Selection, Particle Swarm Optimization, Classification, Support Vector Machine.