Research Article

Geometric-inspired Particle Swarm Optimization (PSO) for Classification Tasks

by Enoch Opanin Gyamfi, Zhiguang Qin, Juliana Mantebea Danso, Daniel Adu-Gyamfi, Nelson Opoku-Mensah, Noble Arden Elorm Kuadey, Daniel Konin
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 183 - Number 53
Year of Publication: 2022
DOI: 10.5120/ijca2022921954

Enoch Opanin Gyamfi, Zhiguang Qin, Juliana Mantebea Danso, Daniel Adu-Gyamfi, Nelson Opoku-Mensah, Noble Arden Elorm Kuadey, Daniel Konin. Geometric-inspired Particle Swarm Optimization (PSO) for Classification Tasks. International Journal of Computer Applications 183, 53 (Feb 2022), 32-40. DOI=10.5120/ijca2022921954

@article{10.5120/ijca2022921954,
  author     = {Enoch Opanin Gyamfi and Zhiguang Qin and Juliana Mantebea Danso and Daniel Adu-Gyamfi and Nelson Opoku-Mensah and Noble Arden Elorm Kuadey and Daniel Konin},
  title      = {Geometric-inspired Particle Swarm Optimization (PSO) for Classification Tasks},
  journal    = {International Journal of Computer Applications},
  issue_date = {Feb 2022},
  volume     = {183},
  number     = {53},
  month      = {Feb},
  year       = {2022},
  issn       = {0975-8887},
  pages      = {32-40},
  numpages   = {9},
  url        = {https://ijcaonline.org/archives/volume183/number53/32292-2022921954/},
  doi        = {10.5120/ijca2022921954},
  publisher  = {Foundation of Computer Science (FCS), NY, USA},
  address    = {New York, USA}
}
%0 Journal Article
%A Enoch Opanin Gyamfi
%A Zhiguang Qin
%A Juliana Mantebea Danso
%A Daniel Adu-Gyamfi
%A Nelson Opoku-Mensah
%A Noble Arden Elorm Kuadey
%A Daniel Konin
%T Geometric-inspired Particle Swarm Optimization (PSO) for Classification Tasks
%J International Journal of Computer Applications
%@ 0975-8887
%V 183
%N 53
%P 32-40
%D 2022
%I Foundation of Computer Science (FCS), NY, USA
Abstract

The ultimate performance of particle swarm optimization (PSO) is influenced by hyper-parameters such as the inertia, cognitive and social coefficients. These hyper-parameters have a significant effect on the search capability of the algorithm. Of the previous studies carried out to calculate these coefficients, none has been inspired by geometric techniques that illustrate the influence of these components on the realization of best positions. This article presents a geometric approach showing how the allocation of social, cognitive and inertia regions on a search space enables particles to move to their best positions at every iteration. In experiments and benchmark tests, the study validates the applicability of the proposed approach to a classification problem using the EMNIST dataset. The modified PSO approach gives successful results in separating data into appropriate classes, which confirms that the proposed method is highly competitive in guiding the directional movement of the particles towards their best positions.
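
For readers unfamiliar with the role these coefficients play, the sketch below shows the standard PSO velocity and position update in which the inertia weight w, cognitive coefficient c1 and social coefficient c2 appear. It is a minimal illustration of baseline PSO only, not the geometric region-allocation scheme proposed in this paper; the sphere objective and all parameter values are illustrative assumptions.

# Minimal sketch of canonical PSO, showing where the inertia (w), cognitive (c1)
# and social (c2) coefficients act. This is NOT the paper's geometric variant;
# the sphere objective and parameter values are illustrative assumptions.
import numpy as np

def sphere(x):
    # Simple benchmark objective: minimize the sum of squares.
    return np.sum(x ** 2, axis=-1)

def pso(objective, dim=2, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-5.0, 5.0, size=(n_particles, dim))  # particle positions
    vel = np.zeros_like(pos)                                # particle velocities
    pbest = pos.copy()                                      # personal best positions
    pbest_val = objective(pbest)                            # personal best fitness
    gbest = pbest[np.argmin(pbest_val)].copy()              # global best position

    for _ in range(iters):
        r1 = rng.random(pos.shape)  # random weights for the cognitive term
        r2 = rng.random(pos.shape)  # random weights for the social term
        # Velocity update: inertia term + cognitive pull + social pull.
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        # Update personal and global bests.
        val = objective(pos)
        improved = val < pbest_val
        pbest[improved] = pos[improved]
        pbest_val[improved] = val[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, pbest_val.min()

if __name__ == "__main__":
    best_pos, best_val = pso(sphere)
    print("best position:", best_pos, "best value:", best_val)

In the paper's modified PSO, the relative influence of these three terms is instead tied to the geometric allocation of inertia, cognitive and social regions on the search space described in the abstract.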

Index Terms

Computer Science
Information Sciences

Keywords

Particle swarm optimization, geometric, classification, sub-swarm