Call for Papers
April Edition
IJCA solicits high-quality original research papers for the upcoming April edition of the journal. The last date for research paper submission is 20 March 2026.

Research Article

Dimensionality Reduction with Ensemble Learning using Growing Hierarchical Adaptive Self-Organizing Map

by Chiraz Jlassi, Ameni Filali, Najet Arous
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 187 - Number 82
Year of Publication: 2026
DOI: 10.5120/ijca2026926428

Chiraz Jlassi, Ameni Filali, Najet Arous. Dimensionality Reduction with Ensemble Learning using Growing Hierarchical Adaptive Self-Organizing Map. International Journal of Computer Applications. 187, 82 (Feb 2026), 17-23. DOI=10.5120/ijca2026926428

@article{ 10.5120/ijca2026926428,
author = { Chiraz Jlassi, Ameni Filali, Najet Arous },
title = { Dimensionality Reduction with Ensemble Learning using Growing Hierarchical Adaptive Self-Organizing Map },
journal = { International Journal of Computer Applications },
issue_date = { Feb 2026 },
volume = { 187 },
number = { 82 },
month = { Feb },
year = { 2026 },
issn = { 0975-8887 },
pages = { 17-23 },
numpages = { 7 },
url = { https://ijcaonline.org/archives/volume187/number82/dimensionality-reduction-with-ensemble-learning-using-growing-hierarchical-adaptive-self-organizing-maprg/ },
doi = { 10.5120/ijca2026926428 },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Journal Article
%A Chiraz Jlassi
%A Ameni Filali
%A Najet Arous
%T Dimensionality Reduction with Ensemble Learning using Growing Hierarchical Adaptive Self-Organizing Map
%J International Journal of Computer Applications
%@ 0975-8887
%V 187
%N 82
%P 17-23
%D 2026
%I Foundation of Computer Science (FCS), NY, USA
Abstract

Feature selection reduces the complexity of learning models by retaining only the most relevant features, thereby enhancing model interpretability while maintaining strong generalization performance. This study addresses the challenge of choosing a subset of the most important features for each cluster within a dataset. The proposed method extends the Random Forests approach by incorporating a growing hierarchical adaptive self-organizing map (GH_AdSOM) variant for unlabelled data. It assesses out-of-bag feature importance across multiple partitions, each generated from a different bootstrap sample and a random subset of features. GH_AdSOM is a neural network architecture that combines the benefits of two key enhancements to the self-organizing map: dynamic growth and hierarchical structure. This allows the map size to adapt to the data and imposes a layered organization, yielding a powerful and flexible neural network model.
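The ensemble procedure sketched in the abstract — bootstrap resampling, a random feature subset per round, an unsupervised learner fitted on each sample, and feature scoring on the out-of-bag rows — can be illustrated as below. This is a hedged sketch only: scikit-learn's `KMeans` stands in for the GH_AdSOM networks of the paper (no public GH_AdSOM implementation is assumed), and the permutation-based out-of-bag score is an assumed proxy for the authors' importance measure, not their exact formula.

```python
import numpy as np
from sklearn.cluster import KMeans

def oob_feature_importance(X, n_rounds=20, n_clusters=3, feat_frac=0.6, seed=0):
    """Ensemble out-of-bag feature importance for clustering (illustrative).

    Each round: draw a bootstrap sample of rows and a random subset of
    features, fit a clustering model (KMeans here, as a stand-in for
    GH_AdSOM), then score each selected feature by how often permuting it
    on the out-of-bag rows changes their nearest-centroid assignment.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    importance = np.zeros(d)   # accumulated permutation scores per feature
    counts = np.zeros(d)       # rounds in which each feature was selected
    for _ in range(n_rounds):
        boot = rng.integers(0, n, size=n)                 # bootstrap rows
        oob = np.setdiff1d(np.arange(n), boot)            # out-of-bag rows
        if oob.size == 0:
            continue
        feats = rng.choice(d, size=max(1, int(feat_frac * d)), replace=False)
        km = KMeans(n_clusters=n_clusters, n_init=5, random_state=0)
        km.fit(X[boot][:, feats])
        base = km.predict(X[oob][:, feats])               # reference labels
        for j_idx, j in enumerate(feats):
            Xp = X[oob][:, feats].copy()
            Xp[:, j_idx] = rng.permutation(Xp[:, j_idx])  # break feature j
            perm = km.predict(Xp)
            importance[j] += np.mean(perm != base)        # assignment churn
            counts[j] += 1
    return importance / np.maximum(counts, 1)
```

Features whose permutation frequently reassigns out-of-bag points to a different cluster receive high scores; irrelevant features score near zero, so a recursive-elimination loop can discard the lowest-scoring features per cluster.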

Index Terms

Computer Science
Information Sciences

Keywords

Growing hierarchical adaptive self-organizing map; random forest; feature selection; recursive feature elimination