
Human Affect Recognition System based on Survey of Recent Approaches

International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Year of Publication: 2017
Authors:
Shweta Malwatkar, Rekha Sugandhi, Anjali R. Mahajan
10.5120/ijca2017912801

Shweta Malwatkar, Rekha Sugandhi and Anjali R Mahajan. Human Affect Recognition System based on Survey of Recent Approaches. International Journal of Computer Applications 158(6):10-17, January 2017. BibTeX

@article{10.5120/ijca2017912801,
	author = {Shweta Malwatkar and Rekha Sugandhi and Anjali R. Mahajan},
	title = {Human Affect Recognition System based on Survey of Recent Approaches},
	journal = {International Journal of Computer Applications},
	issue_date = {January 2017},
	volume = {158},
	number = {6},
	month = {Jan},
	year = {2017},
	issn = {0975-8887},
	pages = {10-17},
	numpages = {8},
	url = {http://www.ijcaonline.org/archives/volume158/number6/26911-2017912801},
	doi = {10.5120/ijca2017912801},
	publisher = {Foundation of Computer Science (FCS), NY, USA},
	address = {New York, USA}
}

Abstract

In recent years, the analysis of human affective behavior has attracted many researchers. Automatic analysis of affect is useful in various fields such as psychology, computer science, linguistics, and neuroscience. Affective computing aims to develop standard systems and devices for recognizing and interpreting human facial expressions and gestures. Emotions are commonly categorized as anger, disgust, fear, happiness, sadness, and surprise. An emotion recognition system involves three main steps: face detection, feature extraction, and facial expression classification. Hence, there is a need for standard approaches that enable machines to understand human affective behavior. This survey paper presents some recent approaches to recognizing human affective behavior, along with their advantages and limitations. It also describes basic classifiers used for emotion classification, such as SVM, ANN, KNN, and HMM, and audiovisual databases with their emotion categories. Based on the survey, an affect recognition system that adopts a cognitive semi-supervised approach is proposed.
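The three-step pipeline the abstract describes (face detection, feature extraction, facial expression classification) can be sketched in Python. This is an illustrative toy, not the system proposed in the paper: the detector and feature extractor are placeholders, and the classifier is a minimal KNN (one of the classifiers the survey covers) run on synthetic constant-intensity "face" images.

```python
import numpy as np

# The six basic emotion categories named in the abstract.
EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

def detect_face(image):
    # Step 1 (face detection) placeholder: a real system would use e.g.
    # a skin-colour or connected-component based detector. Here the whole
    # image is treated as the face region.
    return image

def extract_features(face):
    # Step 2 (feature extraction) placeholder: real systems use Gabor
    # filters, contourlets, or geometric feature points. Here the face is
    # reduced to a crude 8-bin intensity histogram.
    hist, _ = np.histogram(face, bins=8, range=(0.0, 1.0), density=True)
    return hist

def knn_classify(feature, train_features, train_labels, k=1):
    # Step 3 (classification) with a minimal K-nearest-neighbour vote.
    dists = np.linalg.norm(train_features - feature, axis=1)
    nearest = np.argsort(dists)[:k]
    votes = [train_labels[i] for i in nearest]
    return max(set(votes), key=votes.count)

# Synthetic training data: one constant-intensity 16x16 "face" per emotion,
# so each emotion lands in a different histogram bin.
train_images = [np.full((16, 16), (i + 1) / 7.0) for i in range(len(EMOTIONS))]
train_features = np.array(
    [extract_features(detect_face(img)) for img in train_images]
)

pred = knn_classify(train_features[3], train_features, EMOTIONS)
print(pred)  # -> happiness
```

In a real system, only the feature extractor and classifier would change; the pipeline shape (detect, extract, classify) is the same one the surveyed approaches share.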

References

  1. Wu, Chung-Hsien, Jen-Chun Lin, and Wen-Li Wei, "Survey on audiovisual emotion recognition: databases, features, and data fusion strategies," APSIPA Transactions on Signal and Information Processing, vol. 3, 2014.
  2. Mishra, Swati, and Avinash Dhole, "A Survey on Facial Expression Recognition Techniques," International Journal of Science and Research (IJSR), 2013.
  3. Affective Computing. In Wikipedia. Retrieved Sept 29, 2016, from https://en.wikipedia.org/wiki/Affective_computing.
  4. Suresh, R., and S. Audithan, "Contourlets for facial expression analysis using one nearest neighbor classifier," Current Trends in Engineering and Technology (ICCTET), 2nd International Conference on, IEEE, 2014.
  5. Tang, Hao, and Thomas S. Huang, "3D facial expression recognition based on properties of line segments connecting facial feature points," IEEE International Conference on Automatic Face & Gesture Recognition, 2008.
  6. Jun Wang, Lijun Yin, Xiaozhou Wei, and Yi Sun, “3D Facial Expression Recognition Based on Primitive Surface Feature Distribution,” CVPR 2006.
  7. Borah, Sagarika, and Sharmila Konwar, "ANN based human facial expression recognition in color images," International Conference on High Performance Computing and Applications (ICHPCA), IEEE, 2014.
  8. Punitha, A., and M. Kalaiselvi Geetha, "HMM Based Real Time Facial Expression Recognition," International Journal of Emerging Technology and Advanced Engineering, vol. 3, no. 1, 2013, pp. 180-185.
  9. Retrieved Oct 19, 2016, from http://kahlan.eps.surrey.ac.uk/savee/
  10. McKeown, Gary, et al., "The SEMAINE database: Annotated multimodal records of emotionally colored conversations between a person and a limited agent," IEEE Transactions on Affective Computing, vol. 3, no. 1, 2012, pp. 5-17.
  11. Retrieved Oct 19, 2016, from http://www.kasrl.org/jaffe.html
  12. Cheng, Fei, Jiangsheng Yu, and Huilin Xiong, "Facial expression recognition in JAFFE dataset based on Gaussian process classification," IEEE Transactions on Neural Networks 21.10, 2010, pp. 1685-1690.
  13. Duncan, Dan, Gautam Shine, and Chris English, "Facial Emotion Recognition in Real Time," Stanford University.
  14. Kim, Jonathan C., and Mark A. Clements, "Multimodal affect classification at various temporal lengths," IEEE Transactions On Affective Computing 6.4, 2015, pp. 371-384.
  15. Y. Kim, H. Lee, and E. M. Provost, “Deep learning for robust feature generation in audiovisual emotion recognition,” in Proc. IEEE Int. Conf. Acoust., Speech Signal Process., 2013, pp. 3687–3691.
  16. O. AlZoubi, S. K. D’Mello, and R. A. Calvo, “Detecting naturalistic expressions of nonbasic affect using physiological signals,” IEEE Trans. Affective Comput., 2012, vol. 3, no. 3, pp. 298–310.
  17. S. Kim, P. G. Georgiou, S. Lee, and S. Narayanan, “Real-time emotion detection system using speech: Multi-modal fusion of different timescale features,” in Proc. IEEE 9th Workshop Multimedia Signal Process., 2007, pp. 48–51.
  18. J. Gonzalez-Sanchez et al., "Affect Recognition in Learning Scenarios: Matching Facial- and BCI-Based Values," 2013 IEEE 13th International Conference on Advanced Learning Technologies, Beijing, 2013, pp. 70-71.
  19. M. Soleymani, M. Pantic, and T. Pun, “Multimodal emotion recognition in response to videos,” IEEE Trans. Affective Comput., 2012, vol. 3, no. 2, pp. 211–223.
  20. Carlos Busso, et al., "IEMOCAP: Interactive Emotional Dyadic Motion Capture Database," Springer Science, Nov. 2008.
  21. Seng, Kah, Li-Minn Ang, and Chien Ooi. "A Combined Rule-Based and Machine Learning Audio-Visual Emotion Recognition Approach." IEEE Transactions on Affective Computing, TAFFC 2016.
  22. Hong-Bo Deng, Lian-Wen Jin, Li-Xin Zhen, Jian-Cheng Huang, "A New Facial Expression Recognition Method Based on Local Gabor Filter Bank and PCA Plus LDA," International Journal of Information Technology, vol. 11, no. 11, 2005, pp. 86-87.
  23. Jure Kovač, Peter Peer, and Franc Solina, "Human Skin Colour Clustering for Face Detection," Digital Image Processing, January 29, 2013, pp. 1-19.
  24. Borah, Sagarika, and Sharmila Konwar. "ANN based human facial expression recognition in color images." High Performance Computing and Applications (ICHPCA), 2014 International Conference on. IEEE, 2014.
  25. R. L. Hsu, M. Abdel-Mottaleb, A. K. Jain, "Face Detection in Color Images," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, no. 5, pp. 696-706, 2002.
  26. S. P. Khandait, R. C. Thool, P. D. Khandait, "Automatic Facial Feature Extraction and Expression Recognition Based on Neural Network," (IJACSA) International Journal of Advanced Computer Science and Applications, vol. 2, no. 1, January 2011.
  27. Akshat Garg, Vishakha Choudhary, "Facial Expression Recognition Using Principal Component Analysis," International Journal of Scientific Research Engineering & Technology (IJSRET), volume 1, issue 4, pp. 039-042, July 2012, ISSN 2278-0882.
  28. C. A. Bouman, "Connected Component Analysis," Digital Image Processing, January 29, 2013, pp. 1-19.
  29. Sagarika Borah, Nitumoni Hazarika, Dr. T. Tuithung, "Human Facial Feature Point Detection and Generation of Facial Feature Vector in Color Images," IEEE International Conference on Advances in Engineering and Technology, ICAET 2014, 978-1-4799-4949-6.
  30. Sagarika Borah, Sharmila Konwar, Dr. T. Tuithung, Rahul Rathi, "A Human Face Detection Method Based on Connected Component Analysis," IEEE International Conference on Communication and Signal Processing (ICCSP'14), April 3-5, 2014, India, 978-1-4799-3357-0, pp. 748-751.
  31. Wang, Liwei, Yan Zhang, and Jufu Feng. "On the Euclidean distance of images." IEEE transactions on pattern analysis and machine intelligence 27.8 (2005): 1334-1339.
  32. Novak, Joseph D., and Alberto J. Cañas. "The theory underlying concept maps and how to construct and use them." (2008).

Keywords

Affective computing, facial emotions, classification, image processing, machine learning.