Research Article

A Study of Emotion Recognition for Constructive Learning using Robots

by Veena Vijayan V
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 95 - Number 20
Year of Publication: 2014
Authors: Veena Vijayan V
DOI: 10.5120/16713-6871

Veena Vijayan V. A Study of Emotion Recognition for Constructive Learning using Robots. International Journal of Computer Applications 95, 20 (June 2014), 39-43. DOI=10.5120/16713-6871

@article{10.5120/16713-6871,
  author     = {Veena Vijayan V},
  title      = {A Study of Emotion Recognition for Constructive Learning using Robots},
  journal    = {International Journal of Computer Applications},
  issue_date = {June 2014},
  volume     = {95},
  number     = {20},
  month      = {June},
  year       = {2014},
  issn       = {0975-8887},
  pages      = {39-43},
  numpages   = {5},
  url        = {https://ijcaonline.org/archives/volume95/number20/16713-6871/},
  doi        = {10.5120/16713-6871},
  publisher  = {Foundation of Computer Science (FCS), NY, USA},
  address    = {New York, USA}
}
%0 Journal Article
%A Veena Vijayan V
%T A Study of Emotion Recognition for Constructive Learning using Robots
%J International Journal of Computer Applications
%@ 0975-8887
%V 95
%N 20
%P 39-43
%D 2014
%I Foundation of Computer Science (FCS), NY, USA
Abstract

Robots are beginning to play a prominent role in the modern world. They are used in many areas, such as household activities, markets, military applications, medical fields, and schools. The main aim of this paper is to focus on the use of robots in education. Better feedback is obtained through efficient human-robot interaction, which refers to the robot's ability to interact effectively with humans. The robot should monitor the student and analyze his or her emotions dynamically; the goal is to maintain a healthy learning rate once positive emotions are recognized. Real-time face tracking and feature extraction are applied to recognize the emotional state of the student.
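
The pipeline the abstract outlines (real-time face tracking, feature extraction, emotion classification) can be sketched roughly as below. This is only an illustrative sketch, not the paper's implementation: OpenCV's Haar-cascade detector stands in for the face tracker, and classify_emotion is a hypothetical placeholder for the action-unit / Bezier-volume classifier the paper describes.

# Minimal sketch of a real-time face tracking + emotion recognition loop.
# Assumptions: OpenCV is installed (pip install opencv-python) and a webcam
# is available; classify_emotion is a hypothetical stand-in for the paper's
# action-unit-based classifier.
import cv2

def classify_emotion(face_gray):
    """Placeholder: a real system would extract facial features
    (e.g. action units / Bezier volumes) and map them to an emotion label."""
    return "neutral"

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)  # default webcam

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Detect faces in the current frame
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        label = classify_emotion(gray[y:y + h, x:x + w])
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, label, (x, y - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    cv2.imshow("emotion", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to quit
        break

cap.release()
cv2.destroyAllWindows()

In a classroom setting, the recognized label would be fed back to the tutoring logic so the lesson pace can adapt while positive emotions are detected, as the abstract describes.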

References
  1. Amarjot Singh, Sri Krishna Karanam, and Devinder Kumar, 2013. "Constructive Learning for Human Robot Interaction," IEEE Journals and Magazines, vol. 32, issue 4, pp. 13-19.
  2. J. Han, M. Jo, S. Park, and S. Kim, 2005. "The educational use of home robots for children," in Proc. IEEE Int. Workshop Robot and Human Interactive Communication (ROMAN 2005), pp. 378-383.
  3. Fumio Hara, 2004. "Artificial Emotion of Face Robot through Learning in Communicative Interactions with Human," in Proc. IEEE Int. Workshop Robot and Human Interactive Communication, pp. 7-15.
  4. R. Murphy, T. Nomura, A. Billard, and J. Burke, 2010. "Human-robot interaction," IEEE Robot. Automat. Mag., vol. 17, pp. 85-89.
  5. B. Kort, R. Reilly, and R. W. Picard, 2001. "An affective model of interplay between emotions and learning: Reengineering educational pedagogy - building a learning companion," in Proc. IEEE Int. Conf. Advanced Learning Technologies, pp. 43-46.
  6. Y. Yamada, Y. Hirasawa, S. Huang, Y. Umetani, and K. Suita, 1997. "Human-Robot Contact in the Safeguarding Space," IEEE/ASME Transactions on Mechatronics, vol. 2, pp. 230-236.
  7. Y. Yamada, T. Yamamoto, T. Morizono, and Y. Umetani, 1999. "FTA-Based Issues on Securing Human Safety in a Human Robot Coexistence System," presented at IEEE Systems, Man, and Cybernetics (SMC'99).
  8. Kanda, T., Hirano, T., Eaton, D., and Ishiguro, H., 2004. "Interactive Robots as Social Partners and Peer Tutors for Children: A Field Trial," Human-Computer Interaction, vol. 19, pp. 61-84.
  9. Kiesler, S. and Hinds, P., 2004. "Introduction to This Special Issue on Human-Robot Interaction," Human-Computer Interaction, vol. 19, pp. 1-8.
  10. H. Tao and T. S. Huang, 1998. "Connected vibrations: A modal analysis approach to non-rigid motion tracking," in Proc. IEEE Conf. Computer Vision and Pattern Recognition, pp. 735-740.
  11. P. Ekman, 1992. "An argument for basic emotions," Cogn. Emot., vol. 6, no. 3-4, pp. 169-200.
  12. Noriaki Mitsunaga, Christian Smith, Takayuki Kanda, Hiroshi Ishiguro, and Norihiro Hagita, August 2008. "Adapting Robot Behavior for Human-Robot Interaction," IEEE Transactions on Robotics, vol. 24, no. 4.
  13. N. Najmaei and M. Kermani, "Superquadric obstacle modeling and a danger evaluation method."
  14. R. Bischoff and V. Graefe, 2004. "Hermes - A versatile personal robotic assistant," Proc. IEEE, vol. 92, no. 11, pp. 1759-1779.
  15. Scholtz, J., 2003. "Theory and Evaluation of Human Robot Interactions," in Proc. Hawaii International Conference on System Sciences, p. 36.
  16. Scholtz, J. and Bahrami, S., 2004. "Human-Robot Interaction: Development of an Evaluation Methodology for the Bystander Role of Interaction," in Proc. International Conference on Systems, Man, and Cybernetics.
  17. K. Ikuta and M. Nokata, 1999. "General evaluation method of safety for human care robots," in Proc. IEEE Int. Conf. Robot. Autom., vol. 3, pp. 2065-2072.
  18. http://www.intechopen.com/books/human-robot-interaction/robot-aided-learning-and-r-learning-services
  19. https://developer.valvesoftware.com/wiki/Character_Facial_Animation_Shapekey_Set
  20. http://www.femininebeauty.info/eyebrow-aesthetics
Index Terms

Computer Science
Information Sciences

Keywords

Face tracking, Action units, Bezier volume, Real-time recognition