Research Article

An Integrated Interpretable Performance Prediction Model with Dynamically Optimized Attributes

by Meenakshi Devi, Rakesh Kumar
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 187 - Number 88
Year of Publication: 2026
Authors: Meenakshi Devi, Rakesh Kumar
10.5120/ijca2026926521

Meenakshi Devi, Rakesh Kumar. An Integrated Interpretable Performance Prediction Model with Dynamically Optimized Attributes. International Journal of Computer Applications. 187, 88 (Mar 2026), 1-8. DOI=10.5120/ijca2026926521

@article{ 10.5120/ijca2026926521,
author = { Meenakshi Devi, Rakesh Kumar },
title = { An Integrated Interpretable Performance Prediction Model with Dynamically Optimized Attributes },
journal = { International Journal of Computer Applications },
issue_date = { Mar 2026 },
volume = { 187 },
number = { 88 },
month = { Mar },
year = { 2026 },
issn = { 0975-8887 },
pages = { 1-8 },
numpages = { 8 },
url = { https://ijcaonline.org/archives/volume187/number88/an-integrated-interpretable-performance-prediction-model-with-dynamically-optimized-attributes/ },
doi = { 10.5120/ijca2026926521 },
publisher = { Foundation of Computer Science (FCS), NY, USA },
address = { New York, USA }
}
%0 Journal Article
%A Meenakshi Devi
%A Rakesh Kumar
%T An Integrated Interpretable Performance Prediction Model with Dynamically Optimized Attributes
%J International Journal of Computer Applications
%@ 0975-8887
%V 187
%N 88
%P 1-8
%D 2026
%I Foundation of Computer Science (FCS), NY, USA
Abstract

The increasing availability of educational data has opened new possibilities for applying machine learning to improve student learning outcomes. This paper examines an integrated machine learning approach on an educational dataset, focusing on the key modeling attributes that influence student performance. The emphasis is on selecting only meaningful attributes so that irrelevant information is reduced. To ensure stable and reliable modeling, model validation is incorporated into the integrated approach: evaluation with cross-validation establishes confidence in the reliability of the results. Cross-validation is followed by an optimization technique that searches for the relevant attributes, thereby simplifying the model and making it easier to interpret. In addition to the predictive performance metrics, model interpretation is carried out for the selected attributes; this analysis enables a better understanding of the relevant attributes across different models. SHAP is used to illustrate the contribution of individual attributes to the prediction outcomes. The findings show that combining validation, optimization, and interpretability enhances model performance on the educational data.
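The validation-plus-attribute-search loop the abstract describes can be sketched in plain Python. This is an illustrative stand-in, not the paper's implementation: greedy forward selection and a nearest-centroid scorer are assumed here in place of the paper's actual optimization technique and classifiers, and every function name is hypothetical.

```python
import random

def k_fold_indices(n, k, seed=0):
    """Shuffle indices 0..n-1 and deal them into k roughly equal folds."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    return [idx[i::k] for i in range(k)]

def cross_val_score(fit_score, X, y, k=5):
    """Mean held-out score of a fit-and-score callable over k folds."""
    folds = k_fold_indices(len(X), k)
    scores = []
    for fold in folds:
        held = set(fold)
        tr = [j for j in range(len(X)) if j not in held]
        scores.append(fit_score([X[j] for j in tr], [y[j] for j in tr],
                                [X[j] for j in fold], [y[j] for j in fold]))
    return sum(scores) / k

def nearest_centroid(Xtr, ytr, Xte, yte):
    """Toy model: fit per-class centroids, return accuracy on the held-out fold."""
    cents = {}
    for c in set(ytr):
        rows = [x for x, t in zip(Xtr, ytr) if t == c]
        cents[c] = [sum(col) / len(rows) for col in zip(*rows)]
    def predict(x):
        return min(cents, key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(x, cents[c])))
    return sum(predict(x) == t for x, t in zip(Xte, yte)) / len(yte)

def forward_select(fit_score, X, y, n_features, k=5):
    """Greedy attribute search: keep the attribute that most improves CV score."""
    selected, remaining, best = [], list(range(n_features)), float("-inf")
    improved = True
    while improved and remaining:
        improved, pick = False, None
        for f in remaining:
            cand = selected + [f]
            score = cross_val_score(fit_score,
                                    [[row[c] for c in cand] for row in X], y, k)
            if score > best:
                best, pick, improved = score, f, True
        if improved:
            selected.append(pick)
            remaining.remove(pick)
    return selected, best
```

On a toy dataset where only the first attribute carries signal and the second is noise, `forward_select` keeps the informative attribute and stops, mirroring the stated goal of discarding irrelevant information before interpretation.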

References
  1. J.-M. Trujillo-Torres, H. Hossein-Mohand, M. Gómez-García, H. Hossein-Mohand, and F.-J. Hinojo-Lucena, “Estimating the academic performance of secondary education mathematics students: A gain lift predictive model,” Mathematics, vol. 8, no. 12, p. 2101, 2020.
  2. Y. Zhang, Y. Yun, R. An, J. Cui, H. Dai, and X. Shang, “Educational data mining techniques for student performance prediction: method review and comparison analysis,” Frontiers in Psychology, vol. 12, p. 698490, 2021.
  3. A. S. Pinto, A. Abreu, E. Costa, and J. Paiva, “How machine learning (ml) is transforming higher education: A systematic literature review,” Journal of Information Systems Engineering and Management, vol. 8, no. 2, 2023.
  4. S. Hussain and M. Q. Khan, “Student-performulator: Predicting students' academic performance at secondary and intermediate level using machine learning,” Annals of Data Science, vol. 10, no. 3, pp. 637–655, 2023.
  5. N. R. Yadav and S. S. Deshmukh, “Prediction of student performance using machine learning techniques: A review,” in International Conference on Applications of Machine Intelligence and Data Analytics (ICAMIDA 2022). Atlantis Press, 2023, pp. 735–741.
  6. R. Umer, T. Susnjak, A. Mathrani, and L. Suriadi, “Current stance on predictive analytics in higher education: opportunities, challenges and future directions,” Interactive Learning Environments, pp. 1–26, 2021.
  7. C. Herodotou, B. Rienties, A. Boroowa, Z. Zdrahal, and M. Hlosta, “A large-scale implementation of predictive learning analytics in higher education: the teachers' role and perspective,” Educational Technology Research and Development, vol. 67, no. 5, pp. 1273–1306, 2019.
  8. K. A. Bird, B. L. Castleman, Z. Mabel, and Y. Song, “Bringing transparency to predictive analytics: A systematic comparison of predictive modeling methods in higher education,” AERA Open, vol. 7, p. 23328584211037630, 2021.
  9. M. Adnan, A. Habib, J. Ashraf, S. Mussadiq, A. A. Raza, M. Abid, M. Bashir, and S. U. Khan, “Predicting at-risk students at different percentages of course length for early intervention using machine learning models,” IEEE Access, vol. 9, pp. 7519–7539, 2021.
  10. J. Kabathova and M. Drlik, “Towards predicting students' dropout in university courses using different machine learning techniques,” Applied Sciences, vol. 11, no. 7, p. 3130, 2021.
  11. K. Alalawi, R. Athauda, and R. Chiong, “Contextualizing the current state of research on the use of machine learning for student performance prediction: A systematic literature review,” Engineering Reports, vol. 5, no. 12, p. e12699, 2023.
  12. K. Alalawi, R. Athauda, and R. Chiong, “An extended learning analytics framework integrating machine learning and pedagogical approaches for student performance prediction and intervention,” International Journal of Artificial Intelligence in Education, pp. 1–49, 2024.
  13. P. Cortez, “Student Performance,” UCI Machine Learning Repository, 2014, DOI: https://doi.org/10.24432/C5TG7T.
  14. S. Helal, J. Li, L. Liu, E. Ebrahimie, S. Dawson, D. J. Murray, and Q. Long, “Predicting academic performance by considering student heterogeneity,” Knowledge-Based Systems, vol. 161, pp. 134–146, 2018.
  15. S. Rai, K. A. Shastry, S. Pratap, S. Kishore, P. Mishra, and H. Sanjay, “Machine learning approach for student academic performance prediction,” in Evolution in Computational Intelligence: Frontiers in Intelligent Computing: Theory and Applications (FICTA 2020), Volume 1. Springer, 2021, pp. 611–618.
  16. D. Wang, D. Lian, Y. Xing, S. Dong, X. Sun, and J. Yu, “Analysis and prediction of influencing factors of college student achievement based on machine learning,” Frontiers in Psychology, vol. 13, p. 881859, 2022.
  17. H. Pallathadka, A. Wenda, E. Ramirez-Asís, M. Asís-López, J. Flores-Albornoz, and K. Phasinam, “Classification and prediction of student performance data using various machine learning algorithms,” Materials Today: Proceedings, vol. 80, pp. 3782–3785, 2023.
  18. P. Houngue, M. Hountondji, and T. Dagba, “An effective decision-making support for student academic path selection using machine learning,” International Journal of Advanced Computer Science and Applications, vol. 13, no. 11, 2022.
  19. S. Fadili, M. Ertel, A. Mengad, and S. Amali, “Predicting optimal learning approaches for nursing students in Morocco,” International Journal of Advanced Computer Science and Applications, vol. 15, no. 4, 2024.
  20. J. H. Holland, “Genetic algorithms,” Scientific American, vol. 267, no. 1, pp. 66–73, 1992.
  21. D. Beasley, D. R. Bull, and R. R. Martin, “An overview of genetic algorithms: Part 2, research topics,” University Computing, vol. 15, no. 4, pp. 170–181, 1993.
Index Terms

Computer Science
Information Sciences

Keywords

Classification, Cross-Validation, Explainable AI, Machine Learning, SHAP, Student Performance Prediction