Research Article

Facial Emotion Recognition and Eye-tracking based Expressive Communication Framework: Review and Recommendations

by Pradeep Kumar Kaushik, Shivam Pandey, Sushil Singh Rauthan
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 184 - Number 39
Year of Publication: 2022
Authors: Pradeep Kumar Kaushik, Shivam Pandey, Sushil Singh Rauthan
10.5120/ijca2022922494

Pradeep Kumar Kaushik, Shivam Pandey, Sushil Singh Rauthan. Facial Emotion Recognition and Eye-tracking based Expressive Communication Framework: Review and Recommendations. International Journal of Computer Applications 184, 39 (Dec 2022), 20-28. DOI=10.5120/ijca2022922494

@article{10.5120/ijca2022922494,
  author     = {Pradeep Kumar Kaushik and Shivam Pandey and Sushil Singh Rauthan},
  title      = {Facial Emotion Recognition and Eye-tracking based Expressive Communication Framework: Review and Recommendations},
  journal    = {International Journal of Computer Applications},
  issue_date = {Dec 2022},
  volume     = {184},
  number     = {39},
  month      = {Dec},
  year       = {2022},
  issn       = {0975-8887},
  pages      = {20-28},
  numpages   = {9},
  url        = {https://ijcaonline.org/archives/volume184/number39/32571-2022922494/},
  doi        = {10.5120/ijca2022922494},
  publisher  = {Foundation of Computer Science (FCS), NY, USA},
  address    = {New York, USA}
}
Abstract

This paper reviews the application of facial emotion recognition and eye-tracking technology for persons who are mentally fit but severely speech-disabled. It also reviews augmentative and alternative communication (AAC), previously developed emotion recognition systems and frameworks, and a selection of current speech-disability support devices found on the internet. Building on these techniques, the paper advocates the design of a simplex-based, automatic, real-time expressive communication system to facilitate effective interpretation of the thoughts of speech-disabled persons. Although several devices are available, some with eye-tracking capability, there is still a need for a more versatile and cost-effective device or framework, and the challenge remains to achieve human-like speech from an AAC-based device.
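As a rough illustration of the kind of pipeline the abstract describes, the sketch below shows how facial emotion recognition, gaze-based phrase selection, and speech synthesis could be chained into a one-way (simplex) communication loop. This is not the authors' implementation: the helpers classify_emotion() and select_phrase_by_gaze() are hypothetical placeholders, and the use of OpenCV and pyttsx3 is an assumption about one possible realisation.

```python
# Illustrative sketch only; placeholder functions stand in for real
# emotion-recognition and eye-tracking components.
import cv2        # face detection via a pre-trained Haar cascade (assumed available)
import pyttsx3    # offline text-to-speech (assumed available)

# Frontal-face Haar cascade shipped with OpenCV.
FACE_CASCADE = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

# A tiny phrase board; a real AAC device would expose a richer, user-specific set.
PHRASES = ["I am hungry", "I need help", "I feel fine", "Please call someone"]

# Crude mapping from recognised emotion to speaking rate, so the synthesised
# output carries some expressiveness rather than a flat delivery.
RATE_BY_EMOTION = {"happy": 190, "neutral": 160, "sad": 120}


def classify_emotion(face_roi) -> str:
    """Placeholder for a facial-emotion classifier (e.g. a CNN over the face ROI)."""
    return "neutral"


def select_phrase_by_gaze(frame) -> str:
    """Placeholder for eye-tracking based selection of a phrase from the board."""
    return PHRASES[0]


def run_simplex_loop() -> None:
    """One pass of the simplex loop: camera frame in, expressive speech out."""
    cam = cv2.VideoCapture(0)
    tts = pyttsx3.init()
    try:
        ok, frame = cam.read()
        if not ok:
            return
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = FACE_CASCADE.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        if len(faces) == 0:
            return
        x, y, w, h = faces[0]
        emotion = classify_emotion(gray[y:y + h, x:x + w])
        phrase = select_phrase_by_gaze(frame)
        tts.setProperty("rate", RATE_BY_EMOTION.get(emotion, 160))
        tts.say(phrase)
        tts.runAndWait()
    finally:
        cam.release()


if __name__ == "__main__":
    run_simplex_loop()
```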

Index Terms

Computer Science
Information Sciences

Keywords

Facial Emotion Recognition, Augmentative and Alternative Communication, Eye-tracking Technology, Speech Disability