Research Article

An HMM based Model for Prediction of Emotional Composition of a Facial Expression using both Significant and Insignificant Action Units and Associated Gender Differences

by Suvashis Das, Koichi Yamada
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 45 - Number 11
Year of Publication: 2012
Authors: Suvashis Das, Koichi Yamada
DOI: 10.5120/6823-9277

Suvashis Das, Koichi Yamada. An HMM based Model for Prediction of Emotional Composition of a Facial Expression using both Significant and Insignificant Action Units and Associated Gender Differences. International Journal of Computer Applications 45, 11 (May 2012), 11-18. DOI=10.5120/6823-9277

@article{ 10.5120/6823-9277,
author = { Suvashis Das and Koichi Yamada },
title = { An HMM based Model for Prediction of Emotional Composition of a Facial Expression using both Significant and Insignificant Action Units and Associated Gender Differences },
journal = { International Journal of Computer Applications },
issue_date = { May 2012 },
volume = { 45 },
number = { 11 },
month = { May },
year = { 2012 },
issn = { 0975-8887 },
pages = { 11-18 },
numpages = {9},
url = { https://ijcaonline.org/archives/volume45/number11/6823-9277/ },
doi = { 10.5120/6823-9277 },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Journal Article
%A Suvashis Das
%A Koichi Yamada
%T An HMM based Model for Prediction of Emotional Composition of a Facial Expression using both Significant and Insignificant Action Units and Associated Gender Differences
%J International Journal of Computer Applications
%@ 0975-8887
%V 45
%N 11
%P 11-18
%D 2012
%I Foundation of Computer Science (FCS), NY, USA
Abstract

The problem of emotion prediction from the face is twofold: first, the facial Action Units (AUs) and their intensities must be identified; second, the recorded AUs and their intensities must be interpreted as emotions. This work focuses on developing an accurate model for predicting emotions from Facial Action Coding System (FACS) coded facial image data based on a Hidden Markov Model (HMM) approach. The novelty of this work is threefold. 1) A new and more accurate model for emotion prediction from AU data is proposed by assigning a set of N HMMs to every AU, where N is the number of emotions considered, whereas conventional studies have assigned at most one HMM per AU, or fewer, such as six emotion-specific HMMs for the entire set of AUs [3-6]. Assigning N HMMs per AU removes the errors that can creep in when insignificant or absent AUs are ignored: the probability contribution of every single AU in the AU set towards each emotion is calculated separately and later used to compute the mean probability of each emotion over all AUs together. 2) A percentage score of each emotion composing the subject's facial expression is predicted, rather than merely identifying the lead or prominent emotion by maximum-probability considerations, as in the majority of similar studies. 3) Gender differences in the facial depiction of emotion are discussed.
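
To make the proposed scoring scheme concrete, the following is a minimal Python sketch of the arrangement the abstract describes: one HMM per (AU, emotion) pair, a forward-procedure likelihood for each pair, a mean over all AUs per emotion, and a normalization into a percentage composition. The toy random HMMs, sequence lengths, state/symbol counts, and softmax-style normalization are illustrative assumptions, not the authors' implementation.

import numpy as np

N_EMOTIONS = 6   # assumed size of the emotion set (the paper's N)
N_AUS = 4        # assumed number of tracked Action Units (illustrative)

def forward_log_likelihood(obs, start_p, trans_p, emit_p):
    """log P(obs | HMM) via the scaled forward procedure."""
    alpha = start_p * emit_p[:, obs[0]]
    c = alpha.sum()
    log_lik = np.log(c)
    alpha = alpha / c
    for o in obs[1:]:
        alpha = (alpha @ trans_p) * emit_p[:, o]
        c = alpha.sum()
        log_lik += np.log(c)
        alpha = alpha / c
    return log_lik

rng = np.random.default_rng(0)

def toy_hmm(n_states=3, n_symbols=5):
    """Random stand-in for a trained (AU, emotion) HMM."""
    norm = lambda a: a / a.sum(axis=-1, keepdims=True)
    return (norm(rng.random(n_states)),               # initial probabilities
            norm(rng.random((n_states, n_states))),   # transition matrix
            norm(rng.random((n_states, n_symbols))))  # emission matrix

# N_EMOTIONS HMMs for every AU, as the abstract proposes.
hmms = [[toy_hmm() for _ in range(N_EMOTIONS)] for _ in range(N_AUS)]

# Toy intensity sequences, one per AU, coded as discrete symbols 0-4;
# insignificant or absent AUs are scored too rather than being dropped.
seqs = [rng.integers(0, 5, size=10) for _ in range(N_AUS)]

# Score every AU against every emotion, then take the mean over AUs.
scores = np.array([[forward_log_likelihood(seqs[a], *hmms[a][e])
                    for e in range(N_EMOTIONS)]
                   for a in range(N_AUS)])
mean_log = scores.mean(axis=0)

# Report a percentage composition of emotions, not just the arg-max.
w = np.exp(mean_log - mean_log.max())
composition = 100.0 * w / w.sum()
print(np.round(composition, 1))   # one percentage score per emotion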

References
  1. Ekman P. & Friesen W. V., 1978. Facial Action Coding System: Investigator's Guide, Consulting Psychologists Press, Palo Alto, CA.
  2. Ekman P., Friesen W. V. & Hager J. C., 2002. The New Facial Action Coding System (FACS), Research Nexus division of Network Information Research Corporation.
  3. Hu T., De Silva L. C. & Sengupta K., 2002. A hybrid approach of NN and HMM for facial emotion classification, Pattern Recognition Letters, vol. 23, issue 11, pp. 1303-1310.
  4. Cohen I., Garg A. & Huang T. S., 2000. Emotion Recognition from Facial Expressions using Multilevel HMM, Science and Technology, Citeseer.
  5. Mase K., 1991. Recognition of facial expression from optical flow, IEICE Transactions, E74(10), pp. 3474-3483.
  6. Otsuka T. & Ohya J., 1996. Recognition of Facial Expressions Using HMM with Continuous Output Probabilities, Proceedings International Workshop on Robot and Human Communication, pp. 323-328.
  7. Darwin C., 1898. The Expression of the Emotions in Man and Animals, D. Appleton & Co., New York.
  8. Plutchik R., 2002. Emotions and Life: Perspectives from Psychology, Biology, and Evolution, American Psychological Association, Washington, DC.
  9. Kanade T., Cohn J. & Tian Y. L., 2000. Comprehensive database for facial expression analysis, Proceedings of the Fourth IEEE International Conference on Automatic Face and Gesture Recognition (FG'00), Grenoble, France, pp. 46-53.
  10. Lucey P., Cohn J. F., Kanade T., Saragih J., Ambadar Z. & Matthews I., 2010. The Extended Cohn-Kanade Dataset (CK+): A complete expression dataset for action unit and emotion-specified expression, Proceedings of the Third International Workshop on CVPR for Human Communicative Behavior Analysis, San Francisco, USA, pp. 94-101.
  11. Matsumoto D., 1992. More evidence for the universality of a contempt expression, Motivation & Emotion, vol. 16, pp. 363-368.
  12. Ekman P. & Heider K. G., 1988. The universality of a contempt expression: A replication, Motivation & Emotion, vol. 12, pp. 303-308.
  13. Dai K., Fell H. J. & MacAuslan J., 2008. Recognizing emotion in speech using neural networks, Proceedings of the IASTED International Conference on Telehealth/Assistive Technologies, Ronald Merrell (Ed.), ACTA Press, Anaheim, CA, USA, pp. 31-36.
  14. Wilhelm F. H., Pfaltz M. C. & Grossman P., 2006. Continuous electronic data capture of physiology, behavior and experience in real life: towards ecological momentary assessment of emotion, Interacting with Computers, vol. 18, issue 2, pp. 171-186.
  15. Nusseck M., Cunningham D. W., Wallraven C. & Bulthoff H. H., 2008. The contribution of different facial regions to the recognition of conversational expressions, Journal of Vision.
  16. Black M. J. & Yacoob Y., 1997. Recognizing facial expressions in image sequences using local parameterized models of image motion, International Journal of Computer Vision, vol. 25, issue 1, pp. 23-48.
  17. Essa I. A. & Pentland A. P., 1997. Coding, analysis, interpretation, and recognition of facial expressions, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 19, issue 7, pp. 757-763.
  18. Kimura S. & Yachida M., 1997. Facial expression recognition and its degree estimation, Proceedings of the 1997 Conference on Computer Vision and Pattern Recognition, Washington, DC, USA: IEEE Computer Society, p. 295.
  19. Hong H., Neven H. & von der Malsburg C., 1998. Online facial expression recognition based on personalized galleries, Proceedings of the 3rd International Conference on Face & Gesture Recognition, Washington, DC, USA: IEEE Computer Society, p. 354.
  20. Ekman P., 1999. Facial expressions, The Handbook of Cognition and Emotion, UK: John Wiley & Sons Ltd., pp. 301-320.
  21. Fasel B. & Luettin J., 2003. Automatic facial expression analysis: A survey, Pattern Recognition, vol. 36, issue 1, pp. 259-275.
  22. Pantic M. & Rothkrantz L. J. M., 2000. Automatic Analysis of Facial Expressions: The State of the Art, IEEE Transactions on Pattern Analysis and Machine Intelligence, pp. 1424-1445.
  23. Azcarate A., Hageloh F., van de Sande K. & Valenti R., 2005. Automatic facial emotion recognition, Universiteit van Amsterdam.
  24. Khademi M. et al., 2010. Recognizing Combinations of Facial Action Units with Different Intensity Using a Mixture of Hidden Markov Models and Neural Network, Proceedings of the 9th International Workshop on Multiple Classifier Systems (MCS 2010), Springer LNCS, vol. 5997, pp. 304-313.
  25. Das S. & Yamada K., 2011. A Hidden Markov Model Based Approach to Identify Emotion from Facial Expression Using a Combination of Emotion Probability Classifier and Facial Action Unit Intensity Classifier, 12th International Symposium on Advanced Intelligent Systems, Suwon, Korea.
  26. Browndyke J. N., 2002. Neuropsychological factors in emotion recognition: Facial expressions, www.NeuropsychologyCentral.Com.
  27. Belk S. S. & Snell Jr. W. E., 1986. Beliefs about women: Components and correlates, Personality and Social Psychology Bulletin, vol. 12, pp. 403-413.
  28. Hess U. et al., 2000. Emotional expressivity in men and women: Stereotypes and self-perceptions, Cognition & Emotion, vol. 14, pp. 609-642.
  29. Fabes R. A. & Martin C. L., 1991. Gender and age stereotypes of emotionality, Personality and Social Psychology Bulletin, vol. 17, pp. 532-540.
  30. Fischer A. H., 1993. Sex differences in emotionality: Fact or stereotype?, Feminism & Psychology, vol. 3, pp. 303-318.
  31. Grossman M. & Wood W., 1993. Sex differences in intensity of emotional experience: A social role interpretation, Journal of Personality and Social Psychology, vol. 65, pp. 1010-1022.
  32. Barrett L. F., Robin L., Pietromonaco P. R. & Eyssell K. M., 1998. Are women the 'more emotional' sex? Evidence from emotional experiences in social context, Cognition & Emotion, vol. 12, pp. 555-578.
  33. Robinson M. D., Johnson J. T. & Shields S. A., 1998. The gender heuristic and the database: Factors affecting the perception of gender-related differences in the experience and display of emotions, Basic and Applied Social Psychology, vol. 20, pp. 206-219.
  34. Fujita F. et al., 1991. Gender differences in negative affect and well-being: The case for emotional intensity, Journal of Personality and Social Psychology, vol. 61, pp. 427-434.
  35. Plant E. A., Hyde J. S., Keltner D. & Devine P. G., 2000. The gender stereotyping of emotions, Psychology of Women Quarterly, vol. 24, pp. 81-92.
  36. Timmers M., Fischer A. & Manstead A., 2003. Ability versus vulnerability: Beliefs about men's and women's emotional behavior, Cognition & Emotion, vol. 17, pp. 41-63.
  37. Shields S. A., 2003. Speaking from the Heart: Gender and the Social Meaning of Emotion, New York: Cambridge University Press.
  38. Algoe S. B., Buswell B. N. & DeLamater J. D., 2000. Gender and job status as contextual cues for the interpretation of facial expression of emotion, Sex Roles, vol. 42, pp. 183-208.
  39. Hess U., Adams Jr. R. B. & Kleck R. E., 2004. Facial appearance, gender, and emotion expression, Emotion, vol. 4, pp. 378-388.
  40. Plant E. A., Kling K. C. & Smith G. L., 2004. The influence of gender and social role on the interpretation of facial expressions, Sex Roles, vol. 51, pp. 187-196.
  41. Simon D., Craig K. D., Gosselin F., Belin P. & Rainville P., 2008. Recognition and discrimination of prototypical dynamic expressions of pain and emotions, Pain, vol. 135(1-2), pp. 55-64.
  42. Crowder M., Davis M. & Giampieri G., 2005. A Hidden Markov Model of Default Interaction, Second International Conference on Credit Risk, Citeseer.
  43. Pearl J., 1988. Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference, Morgan Kaufmann, San Francisco.
  44. Rabiner L. R., 1989. A tutorial on Hidden Markov Models and selected applications in speech recognition, Proceedings of the IEEE, 77(2), pp. 257-286.
  45. Welch L. R., 2003. Hidden Markov Models and the Baum-Welch Algorithm, The Shannon Lecture, IEEE Information Theory Society Newsletter.
  46. Song M. et al., 2010. Image Ratio Features for Facial Expression Recognition Application, IEEE Transactions on Systems, Man & Cybernetics, Part B, 40(3), pp. 779-788.
Index Terms

Computer Science
Information Sciences

Keywords

FACS, Action Units, Hidden Markov Model, Plutchik's Wheel of Emotions, Baum-Welch Algorithm, Forward-Backward Procedure, CK+ Database