Call for Papers: November Edition
IJCA solicits high-quality original research papers for the upcoming November edition of the journal. The last date for research paper submission is 20 October 2025.

Research Article

Emotionally Intelligent Chatbots in Mental Health: A Review of Psychological, Ethical, and Developmental Impacts

by Ruwini Herath
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 187 - Number 29
Year of Publication: 2025
Authors: Ruwini Herath
DOI: 10.5120/ijca2025925507

Ruwini Herath. Emotionally Intelligent Chatbots in Mental Health: A Review of Psychological, Ethical, and Developmental Impacts. International Journal of Computer Applications 187, 29 (Aug 2025), 49–56. DOI=10.5120/ijca2025925507

@article{10.5120/ijca2025925507,
  author     = {Ruwini Herath},
  title      = {Emotionally Intelligent Chatbots in Mental Health: A Review of Psychological, Ethical, and Developmental Impacts},
  journal    = {International Journal of Computer Applications},
  issue_date = {Aug 2025},
  volume     = {187},
  number     = {29},
  month      = {Aug},
  year       = {2025},
  issn       = {0975-8887},
  pages      = {49-56},
  numpages   = {8},
  url        = {https://ijcaonline.org/archives/volume187/number29/emotionally-intelligent-chatbots-in-mental-health-a-review-of-psychological-ethical-and-developmental-impacts/},
  doi        = {10.5120/ijca2025925507},
  publisher  = {Foundation of Computer Science (FCS), NY, USA},
  address    = {New York, USA}
}
%0 Journal Article
%A Ruwini Herath
%T Emotionally Intelligent Chatbots in Mental Health: A Review of Psychological, Ethical, and Developmental Impacts
%J International Journal of Computer Applications
%@ 0975-8887
%V 187
%N 29
%P 49-56
%D 2025
%I Foundation of Computer Science (FCS), NY, USA
Abstract

The use of emotionally intelligent chatbots is increasing in mental health settings, where they provide support by recognizing and reacting to users’ emotions. This review takes a closer look at 59 peer-reviewed studies published between 2017 and 2024, with a focus on systems such as Woebot and Wysa. It maps out how affective computing, psychological frameworks such as cognitive behavioral therapy (CBT), and human-computer interaction theories shape these systems. While there is early evidence of benefits such as reduced anxiety and better emotional self-awareness, many issues remain unresolved, including weak long-term evidence, cultural bias in emotion recognition, and potential over-dependence on AI. We also highlight the risks of collecting and using emotional data without sufficient oversight. On this basis, we suggest that future research move toward multicultural, longer-term, and ethically grounded studies. The goal should be to create emotionally intelligent systems that support, rather than replace, genuine human connection, especially in vulnerable populations.
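
As an illustration of the core loop these systems share (recognize an emotion in the user's message, then respond empathically), here is a minimal, hypothetical sketch in Python. The keyword-based recognizer and canned CBT-flavored replies are deliberate simplifications for exposition; deployed systems such as Woebot and Wysa use trained affective-computing models and clinically grounded content.

# Minimal sketch of an emotion-aware chatbot turn. Everything here is
# hypothetical and illustrative; it is not how any reviewed system works.
from dataclasses import dataclass

@dataclass
class Turn:
    user_text: str
    detected_emotion: str
    reply: str

# Hypothetical keyword cues standing in for a trained emotion classifier.
EMOTION_CUES = {
    "anxious": ("worried", "nervous", "anxious", "scared"),
    "sad": ("sad", "down", "hopeless", "empty"),
}

# CBT-flavored empathic templates, keyed by detected emotion.
REPLIES = {
    "anxious": "It sounds like you're feeling anxious. What thought is going through your mind right now?",
    "sad": "I'm sorry you're feeling low. Could we look at one small thing that felt manageable today?",
    "neutral": "Thanks for sharing. How are you feeling about that?",
}

def detect_emotion(text: str) -> str:
    # Return the first emotion whose cue words appear in the message.
    lowered = text.lower()
    for emotion, cues in EMOTION_CUES.items():
        if any(cue in lowered for cue in cues):
            return emotion
    return "neutral"

def respond(user_text: str) -> Turn:
    # One chatbot turn: classify the emotion, then pick the matching reply.
    emotion = detect_emotion(user_text)
    return Turn(user_text, emotion, REPLIES[emotion])

if __name__ == "__main__":
    turn = respond("I have been so worried about work lately.")
    print(turn.detected_emotion)  # -> anxious
    print(turn.reply)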

References
  1. N. Kallivalappil, K. D’souza, A. Deshmukh, C. Kadam, and N. Sharma, “Empath.ai: A context-aware chatbot for emotional detection and support,” in Proc. 14th Int. Conf. Comput. Commun. Netw. Technol. (ICCCNT), 2023, pp. 1–7. doi:10.1109/ICCCNT56998.2023.10306584
  2. T. Spring, J. Casas, K. Daher, E. Mugellini, and O. A. Khaled, “Empathic response generation in chatbots,” CONVERSATIONS Workshop, Amsterdam, 2019. [Online]. Available: https://arxiv.org/abs/1911.12315
  3. S. Devaram, “Empathic Chatbot: Emotional Intelligence for Mental Health Well-being,” in IEEE ICAC3, Bournemouth University, UK, 2020. [Online]. Available: https://arxiv.org/abs/2012.09130
  4. S. B. Velagaleti, “Empathetic algorithms: The role of AI in understanding and enhancing human emotional intelligence,” J. Electr. Syst., vol. 20, no. 3s, pp. 2051–2060, 2024. doi:10.52783/jes.1806
  5. S. Zeb, N. FNU, N. Abbasi, and M. Fahad, “AI in Healthcare: Revolutionizing Diagnosis and Therapy,” Int. J. Multidiscip. Sci. Arts, vol. 3, no. 3, 2024. doi:10.47709/ijmdsa.v3i3.4546
  6. S. Tahir, S. A. Shah, and J. Abu-Khalaf, “Artificial Empathy Classification: A Survey,” arXiv preprint, arXiv:2310.00010, 2023. [Online]. Available: https://arxiv.org/abs/2310.00010
  7. S. Cao et al., “Pain recognition and pain empathy from a human-centered AI perspective,” iScience, vol. 27, no. 8, p. 110570, 2024. doi:10.1016/j.isci.2024.110570
  8. M. Shvo and S. A. McIlraith, “Towards Empathetic Planning and Plan Recognition,” in Proc. AIES ’19, 2019, pp. 525–526. doi:10.1145/3306618.3314307
  9. G. Bilquise, S. Ibrahim, and K. Shaalan, “Emotionally intelligent chatbots: A systematic review,” Hum. Behav. Emerg. Technol., pp. 1–23, 2022. doi:10.1155/2022/9601630
  10. A. Ghandeharioun, D. McDuff, M. Czerwinski, and K. Rowan, “Towards understanding emotional intelligence for behavior change chatbots,” arXiv preprint, arXiv:1907.10664, 2019. doi:10.48550/arXiv.1907.10664
  11. M. Rostami and S. Navabinejad, “Artificial empathy: User experiences with emotionally intelligent chatbots,” AI & Tech. Behav. Soc. Sci., vol. 1, no. 3, pp. 19–27, 2023. doi:10.61838/kman.aitech.1.3.4
  12. P. Borele and D. A. Borikar, “An approach to sentiment analysis using artificial neural networks,” IOSR J. Comput. Eng., vol. 18, no. 2, pp. 64–69, 2016. doi:10.9790/0661-1802056469
  13. P. Chakriswaran et al., “Emotion AI-driven sentiment analysis,” Appl. Sci., vol. 9, no. 24, p. 5462, 2019. doi:10.3390/app9245462
  14. H. S. Yang et al., “AI chatbots in clinical laboratory medicine,” Clin. Chem., vol. 69, no. 11, pp. 1238–1246, 2023. doi:10.1093/clinchem/hvad106
  15. A. R. Mathew, A. Al Hajj, and A. Al Abri, “Human-computer interaction (HCI): An overview,” in IEEE Int. Conf. Comput. Sci. Autom. Eng., 2011, pp. 99–100. doi:10.1109/CSAE.2011.5953178
  16. B. Myers et al., “Strategic directions in human-computer interaction,” ACM Comput. Surv., vol. 28, no. 4, pp. 794–809, 1996. doi:10.1145/242223.246855
  17. R. J. Lee-Won, Y. K. Joo, and S. G. Park, “Media Equation,” Int. Encycl. Media Psychol., 2020. doi:10.1002/9781119011071.iemp0158
  18. C. Bartneck, C. Rosalia, R. Menges, and I. Deckers, “Robot abuse – A limitation of the media equation,” Eindhoven Univ. Technol., n.d. [Online]. Available: http://www.bartneck.de
  19. D. Johnson and J. Gardner, “The media equation and team formation,” Int. J. Hum.-Comput. Stud., vol. 65, no. 2, pp. 111–124, 2007. doi:10.1016/j.ijhcs.2006.08.007
  20. O. Gillath et al., “Attachment and trust in AI,” Comput. Hum. Behav., vol. 115, p. 106607, 2021. doi:10.1016/j.chb.2020.106607
  21. T. Xie and I. Pentina, “Attachment theory for chatbot relationships: A case study of Replika,” in Proc. HICSS, 2022. doi:10.24251/HICSS.2022.258
  22. D. Petters and E. Waters, “AI, attachment theory, and secure base simulation,” AISB 2010 Convention, 2010.
  23. L. Kambeitz-Ilankovic et al., “Review of digital and face-to-face CBT for depression,” npj Digit. Med., vol. 5, p. 144, 2022. doi:10.1038/s41746-022-00677-8
  24. H. M. Jackson et al., “Skill enactment in digital CBT,” J. Med. Internet Res., vol. 25, p. e44673, 2023. doi:10.2196/44673
  25. G. R. Thew, A. Rozental, and H. D. Hadjistavropoulos, “Advances in digital CBT,” Cogn. Behav. Ther., vol. 15, p. e44, 2022. doi:10.1017/S1754470X22000423
  26. L. Lawlor-Savage and J. L. Prentice, “Digital CBT in Canada: Ethical considerations,” Can. Psychol., vol. 55, no. 4, pp. 231–239, 2014. doi:10.1037/a0037861
  27. M. Farzan et al., “AI-powered CBT chatbots: A review,” Iran. J. Psychiatry, 2024. doi:10.18502/ijps.v20i1.17395
  28. B. Maples et al., “GPT3-enabled chatbots and suicide prevention,” npj Ment. Health Res., vol. 3, p. 4, 2024. doi:10.1038/s44184-023-00047-6
  29. E. Gabarron, D. Larbi, K. Denecke, and E. Årsand, “Chatbots in public health,” Stud. Health Technol. Inform., IOS Press, 2020.
  30. V. K. Voola et al., “AI chatbots in clinical trials,” Int. J. Res. Publ. Seminar, vol. 13, no. 5, pp. 323–337, 2022. doi:10.36676/jrps.v13.i5.1505
  31. L. T. Car et al., “Conversational agents in health care: Scoping review and conceptual analysis,” J. Med. Internet Res., vol. 22, no. 8, p. e17158, 2020. doi:10.2196/17158
  32. M. Laymouna et al., “Roles, users, benefits, and limitations of chatbots in health care: Rapid review (preprint),” 2024. doi:10.2196/preprints.56930
  33. D. S. Parikh and H. Raval, “Limitations of existing chatbots: An analytical survey,” Int. J. Innov. Res. Sci. Eng. Technol., vol. 7, no. 2, 2020.
  34. V. S. Barletta et al., “Clinical-chatbot AHP evaluation based on ‘quality in use’ of ISO/IEC 25010,” Int. J. Med. Inform., vol. 170, p. 104951, 2023. doi:10.1016/j.ijmedinf.2022.104951
  35. H. S. Yang et al., “AI chatbots in clinical laboratory medicine: Foundations and trends,” Clin. Chem., vol. 69, no. 11, pp. 1238–1246, 2023. doi:10.1093/clinchem/hvad106
  36. E. Ortega-Ochoa et al., “The effectiveness of empathic chatbot feedback in online higher education,” Internet Things, vol. 25, p. 101101, 2024. doi:10.1016/j.iot.2024.101101
  37. M. Rostami and S. Navabinejad, “Artificial empathy: User experiences with emotionally intelligent chatbots,” AI Tech Behav. Soc. Sci., vol. 1, no. 3, pp. 19–27, 2023. doi:10.61838/kman.aitech.1.3.4
  38. R. Indellicato, “Artificial intelligence and social-emotional learning: What relationship?” J. Mod. Sci., vol. 60, no. 6, pp. 460–470, 2024. doi:10.13166/jms/196765
  39. S. S. Sethi and K. Jain, “AI technologies for social-emotional learning,” J. Res. Innov. Teach. Learn., vol. 17, no. 2, pp. 213–225, 2024. doi:10.1108/JRIT-03-2024-0073
  40. M. I. Gómez-León, “Development of empathy through socioemotional AI,” Papeles del Psicólogo, vol. 43, no. 3, p. 218, 2022. doi:10.23923/pap.psicol.2996
  41. K. Heljakka, P. Ihamäki, and A. I. Lamminen, “Empathic responses to robot dogs vs. real dogs in learning,” in Proc. CHI PLAY '20, pp. 262–266, 2020. doi:10.1145/3383668.3419900
  42. C. Akbulut et al., “All too human? Mapping and mitigating risks from anthropomorphic AI,” AIES Conf., vol. 7, pp. 13–26, 2024. doi:10.1609/aies.v7i1.31613
  43. C. Montemayor, J. Halpern, and A. Fairweather, “In principle, there are obstacles for empathic AI in healthcare,” AI & Society, vol. 37, no. 4, pp. 1353–1359, 2022. doi:10.1007/s00146-021-01230-z
  44. M. Rubin, H. Arnon, J. D. Huppert, and A. Perry, “Considering human empathy in AI-driven therapy (preprint),” 2024. doi:10.2196/preprints.56529
  45. R. Agrawal and N. Pandey, “Developing rapport with emotionally intelligent AI assistants,” Int. J. Res. Appl. Sci. Eng. Technol., vol. 12, no. 3, pp. 1473–1480, 2024. doi:10.22214/ijraset.2024.59015
  46. A. McStay, “Emotional AI and privacy,” Big Data & Society, vol. 7, no. 1, p. 205395172090438, 2020. doi:10.1177/2053951720904386
  47. K. Roemmich, F. Schaub, and N. Andalibi, “Emotion AI at work,” in Proc. CHI Conf. Hum. Factors Comput. Syst., pp. 1–20, 2023. doi:10.1145/3544548.3580950
  48. E. Sedenberg and J. Chuang, “Smile for the camera: Privacy implications of emotion AI,” UC Berkeley School of Information, n.d. [Online]. Available: https://www.ischool.berkeley.edu/research/publications/smile-camera-privacy-and-policy-implications-emotion-ai
  49. L. Rhue, “Racial influence on automated perceptions of emotions,” SSRN Electron. J., 2018. doi:10.2139/ssrn.3281765
  50. M. Yoshie and D. A. Sauter, “Cultural norms in nonverbal emotion expression,” Emotion, vol. 20, no. 3, pp. 513–517, 2020. doi:10.1037/emo0000580
  51. M. Mattioli and F. Cabitza, “Ethics in automatic face emotion recognition,” Mach. Learn. Knowl. Extr., vol. 6, pp. 2201–2231, 2024. doi:10.3390/make6040109
  52. M. Nagata and K. Okajima, “Observer culture and facial expression recognition,” PLoS ONE, vol. 19, no. 10, p. e0313029, 2024. doi:10.1371/journal.pone.0313029
  53. I. Dominguez-Catena, D. Paternain, and M. Galar, “Metrics for dataset demographic bias in facial expression recognition,” arXiv preprint, arXiv:2303.15889, 2024. [Online]. Available: https://arxiv.org/abs/2303.15889
  54. G. Benitez-Garcia, T. Nakamura, and M. Kaneko, “Facial expression recognition with Fourier descriptors,” J. Signal Inf. Process., vol. 8, no. 3, 2017. doi:10.4236/jsip.2017.83009
  55. R. Pusztahelyi and I. Stefán, “Social robots and data protection,” Acta Univ. Sapientiae, Legal Studies, vol. 11, no. 1, pp. 95–118, 2022. doi:10.47745/AUSLEG.2022.11.1.06
  56. E. Schwitzgebel, “AI systems must not mislead about sentience,” Patterns, vol. 4, no. 8, p. 100818, 2023. doi:10.1016/j.patter.2023.100818
  57. M. S. Farahani and G. Ghasemi, “Artificial intelligence and inequality,” Qeios, 2024. doi:10.32388/7HWUZ2
  58. A. Hagerty and I. Rubinov, “Global AI ethics: Review of social impacts,” arXiv preprint, arXiv:1907.07892, 2019. [Online]. Available: https://arxiv.org/abs/1907.07892
  59. P. Choudhury, R. T. Allen, and M. G. Endres, “ML for pattern discovery in management research,” Strat. Manag. J., vol. 42, no. 1, pp. 30–57, 2021. doi:10.1002/smj.3215
Index Terms

Computer Science
Information Sciences

Keywords

Empathic AI, Affective computing, Mental-health chatbots, Artificial empathy, Human–computer interaction, Emotion recognition