Research Article

A Context based Gesture Interpretation System

Published in November 2011 by Prof. S.A. Chhabria and Mukta J. Bhatt
2nd National Conference on Information and Communication Technology
Foundation of Computer Science USA
NCICT - Number 8
November 2011
Authors: Prof. S.A. Chhabria, Mukta J. Bhatt

Prof. S.A. Chhabria and Mukta J. Bhatt. A Context based Gesture Interpretation System. 2nd National Conference on Information and Communication Technology. NCICT, 8 (November 2011), 32-36.

@article{chhabria2011context,
author = { Prof. S.A. Chhabria and Mukta J. Bhatt },
title = { A Context based Gesture Interpretation System },
journal = { 2nd National Conference on Information and Communication Technology },
issue_date = { November 2011 },
volume = { NCICT },
number = { 8 },
month = { November },
year = { 2011 },
issn = { 0975-8887 },
pages = { 32-36 },
numpages = { 5 },
url = { /proceedings/ncict/number8/4563-ncict064/ },
publisher = { Foundation of Computer Science (FCS), NY, USA },
address = { New York, USA }
}
%0 Proceeding Article
%1 2nd National Conference on Information and Communication Technology
%A Prof. S.A. Chhabria
%A Mukta J. Bhatt
%T A Context based Gesture Interpretation System
%J 2nd National Conference on Information and Communication Technology
%@ 0975-8887
%V NCICT
%N 8
%P 32-36
%D 2011
%I International Journal of Computer Applications
Abstract

Gesture interpretation can be seen as a way for computers to begin to understand human body language, building a richer bridge between humans and machines than primitive text user interfaces or even GUIs, which still restrict most input to the keyboard and mouse. It has also become increasingly evident that the difficulties encountered in analysing and interpreting individual sensing modalities may be overcome by integrating them into a multimodal human–computer interface, and that different computational approaches may be applied at the different levels of modality integration. A system is therefore needed for interpreting and fusing multiple sensing modalities in the context of a human–computer interface. This research can benefit from many disparate fields of study that increase our understanding of the different human communication modalities and their potential roles in human–computer interaction; applications include enabling handicapped persons to control a wheelchair, computer-assisted surgery, mining, and more.
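The abstract speaks of interpreting and fusing multiple sensing modalities within a human–computer interface. As a minimal illustrative sketch only, and not the paper's own algorithm, the Python fragment below shows one common way such fusion can be framed: each modality produces confidence scores over a shared set of candidate commands, and context-dependent weights combine them into a single decision. All names, weights, and scores here are hypothetical.

    from typing import Dict

    def fuse_modalities(scores: Dict[str, Dict[str, float]],
                        weights: Dict[str, float]) -> str:
        """Return the candidate command with the highest weighted score."""
        combined: Dict[str, float] = {}
        for modality, command_scores in scores.items():
            w = weights.get(modality, 0.0)          # context-dependent weight
            for command, confidence in command_scores.items():
                combined[command] = combined.get(command, 0.0) + w * confidence
        return max(combined, key=combined.get)

    # Hypothetical example: in a wheelchair-control context, gesture evidence is
    # weighted more heavily than speech (all values are assumed, for illustration).
    scores = {
        "gesture": {"move_forward": 0.7, "stop": 0.2},
        "speech":  {"move_forward": 0.4, "stop": 0.5},
    }
    weights = {"gesture": 0.6, "speech": 0.4}
    print(fuse_modalities(scores, weights))         # -> move_forward

The weights stand in for the "context" of the title: a real system would adjust them according to the task and environment, whereas here they are fixed constants chosen purely to make the example concrete.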

Index Terms

Computer Science
Information Sciences

Keywords

Human–computer interface
Multimodality