Research Article

Review of Literature for the Development of Indian Sign Language Recognition System

by Shweta Dour, M.M. Sharma
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 132 - Number 5
Year of Publication: 2015
Authors: Shweta Dour, M.M. Sharma
10.5120/ijca2015907360

Shweta Dour, M.M. Sharma. Review of Literature for the Development of Indian Sign Language Recognition System. International Journal of Computer Applications. 132, 5 (December 2015), 27-34. DOI=10.5120/ijca2015907360

@article{ 10.5120/ijca2015907360,
author = { Shweta Dour, M.M. Sharma },
title = { Review of Literature for the Development of Indian Sign Language Recognition System },
journal = { International Journal of Computer Applications },
issue_date = { December 2015 },
volume = { 132 },
number = { 5 },
month = { December },
year = { 2015 },
issn = { 0975-8887 },
pages = { 27-34 },
numpages = {9},
url = { https://ijcaonline.org/archives/volume132/number5/23592-2015907360/ },
doi = { 10.5120/ijca2015907360 },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Journal Article
%1 2024-02-06T23:28:22.010920+05:30
%A Shweta Dour
%A M.M. Sharma
%T Review of Literature for the Development of Indian Sign Language Recognition System
%J International Journal of Computer Applications
%@ 0975-8887
%V 132
%N 5
%P 27-34
%D 2015
%I Foundation of Computer Science (FCS), NY, USA
Abstract

In recent years, sign language recognition has attracted much attention in computer vision. A sign language conveys messages through movements of the hands, arms, body, and face to express thoughts and meanings. Like spoken languages, sign languages emerge and evolve naturally within hearing-impaired communities. However, sign languages are not universal: there is no internationally recognized and standardized sign language for all deaf people. As with spoken languages, every country has its own sign language with a high degree of grammatical variation. The sign language used in India is commonly known as Indian Sign Language (henceforth called ISL).

References
  1. Sigal Berman and Helman Stern, “Sensors for Gesture Recognition Systems” IEEE Transactions On Systems, Man, And Cybernetics—Part C: Applications And Reviews, Vol. 42, No. 3, May 2012, p. 277
  2. Laura Dipietro, Angelo M. Sabatini, and Paolo Dario, “A Survey of Glove-Based Systems and Their Applications” IEEE Transactions On Systems, Man, And Cybernetics—Part C: Applications And Reviews, VOL. 38, NO. 4, JULY 2008 pg 461
  3. Ruize Xu, Shengli Zhou, and Wen J. Li, “MEMS Accelerometer Based Nonspecific-User Hand Gesture Recognition” IEEE Sensors Journal, Vol. 12, No. 5, MAY 2012
  4. Yun Li, Xiang Chen, Xu Zhang, Kongqiao Wang, and Z. Jane Wang, “A Sign-Component-Based Framework for Chinese Sign Language Recognition Using Accelerometer and sEMG Data” IEEE Transactions On Biomedical Engineering, Vol. 59, No. 10, October 2012, pp. 2695-2704
  5. Vasiliki E. Kosmidou, Panagiotis C. Petrantonakis, and Leontios J. Hadjileontiadis, “Enhanced Sign Language Recognition Using Weighted Intrinsic-Mode Entropy and Signer’s Level of Deafness” IEEE Transactions On Systems, Man, And Cybernetics—Part B: Cybernetics, Vol. 41, No. 6, December 2011, pp. 1531-1542
  6. Xu Zhang, Xiang Chen, Yun Li, Vuokko Lantz, Kongqiao Wang, and Jihai Yang, “A Framework for Hand Gesture Recognition Based on Accelerometer and EMG Sensors” IEEE Transactions On Systems, Man, And Cybernetics—Part A: Systems And Humans, Vol. 41, No. 6, November 2011
  7. Ganesh R. Naik, Dinesh Kant Kumar, and Jayadeva, “Twin SVM for Gesture Classification Using the Surface Electromyogram” IEEE Transactions On Information Technology In Biomedicine, Vol. 14, No. 2, March 2010, p. 301
  8. Y. Han. “A low-cost visual motion data glove as an input device to interpret human hand gestures”. IEEE Transactions on Consumer Electronics, 56(2):501-509, 2010.
  9. Johannes Wagner, Jonghwa Kim, Matthias Rehm, and Elisabeth Andre. “Bi-channel sensor fusion for automatic sign language recognition.” 8th IEEE International Conference on Automatic Face & Gesture Recognition, pages 1–6, September 2008.
  10. Jonghwa Kim, Johannes Wagner, Matthias Rehm, and Elisabeth Andre. Bi-channel sensor fusion for automatic sign language recognition. 2008 8th IEEE International Conference on Automatic Face & Gesture Recognition, pages 1–6, September 2008.
  11. C. Oz and M. Leu. “Linguistic properties based on American Sign Language isolated word recognition with artificial neural networks using a sensory glove and motion tracker.” Neurocomputing, 70(16-18):2891–2901, 2007.
  12. Qi Wang, Xilin Chen, Liang-Guo Zhang, Chunli Wang, and Wen Gao. Viewpoint invariant sign language recognition. Computer Vision and Image Understanding, 108(1-2):87–97, 2007.
  13. Duy Bui and Long Thang Nguyen, “Recognizing Postures in Vietnamese Sign Language With MEMS Accelerometers” IEEE Sensors Journal, Vol. 7, No. 5, May 2007
  14. Holger Kenn, Friedrich Van Megen, and Robert Sugar. A glove-based gesture interface for wearable computing applications. (IFAWC), 2007 4th International Forum on Applied Wearable Computing, pages 1 -10, 2007.
  15. R.M. McGuire, J. Hernandez-Rebollar, T. Starner, V. Henderson, H. Brashear, and D.S. Ross. Towards a one-way American Sign Language translator. Sixth IEEE International Conference on Automatic Face and Gesture Recognition, 2004. Proceedings., pages 620–625, 2004.
  16. Wen Gao, Gaolin Fang, Debin Zhao, and Yiqiang Chen. “Transition movement models for large vocabulary continuous sign language recognition.” IEEE FG 2004, pages 553–558, May 2004.
  17. Gaolin Fang, Wen Gao, and Debin Zhao. “Large vocabulary sign language recognition based on hierarchical decision trees.” Proceedings of the 5th international conference on Multimodal interfaces - ICMI ’03, page 125, 2003.
  18. H. Brashear, T. Starner, P. Lukowicz, and H. Junker. “Using multiple sensors for mobile sign language recognition.” Seventh IEEE International Symposium on Wearable Computers, 2003. Proceedings., pages 45–52, 2003.
  19. Hossein Hajimirsadeghi, Majid Nili Ahmadabadi, and Babak Nadjar Araabi, “Conceptual Imitation Learning Based on Perceptual and Functional Characteristics of Action.” IEEE Transactions On Autonomous Mental Development, 2013
  20. Annamária R. Várkonyi-Kóczy and Balázs Tusor, “Human–Computer Interaction for Smart Environment Applications Using Fuzzy Hand Posture and Gesture Models”. IEEE Transactions On Instrumentation And Measurement, Vol. 60, No. 5, May 2011, p. 1505
  21. Philippe Dreuw, Jens Forster, and Hermann Ney. Tracking benchmark databases for video-based sign language recognition. In ECCV International Workshop on Sign, Gesture and Activity (SGA), Crete, Greece, September 2010.
  22. Jonathan Alon, Quan Yuan, and Stan Sclaroff, “A Unified Framework for Gesture Recognition and Spatiotemporal Gesture Segmentation” IEEE Transactions On Pattern Analysis And Machine Intelligence, VOL. 31, NO. 9, SEPTEMBER 2009
  23. P. Buehler, M. Everingham, D. P. Huttenlocher, and A. Zisserman. “Long term arm and hand tracking for continuous sign language TV broadcasts.” In British Machine Vision Conference, 2008.
  24. T. Shanableh, K. Assaleh, and M. Al-Rousan. “Spatio-temporal feature-extraction techniques for isolated gesture recognition in Arabic Sign language.” Systems, Man, and Cybernetics, Part B: Cybernetics, IEEE Transactions on, 37(3):641–650, June 2007
  25. Qi Wang, Xilin Chen, Liang-Guo Zhang, Chunli Wang, and Wen Gao. “Viewpoint invariant sign language recognition.” Computer Vision and Image Understanding, 108(1-2):87–97, 2007.
  26. Ignazio Infantino, Riccardo Rizzo, and Salvatore Gaglio, “A Framework for Sign Language Sentence Recognition by Commonsense Context” IEEE Transactions On Systems, Man, And Cybernetics—Part C: Applications And Reviews, Vol. 37, No. 5, September 2007
  27. Helen Cooper and Richard Bowden. “Large lexicon detection of sign language.” In ICCV, Workshop Human Comp. Inter, 2007.
  28. T. Deselaers, P. Dreuw, D. Rybach, D. Keysers, and H. Ney. “Tracking using dynamic programming for appearance based sign language recognition.” 7th International Conference on Automatic Face and Gesture Recognition FGR06, 62(1):293-298, 2006.
  29. Ulrich von Agris, Daniel Schneider, Jorg Zieren, and Karl-Friedrich Kraiss. “Rapid signer adaptation for isolated sign language recognition.” In CVPRW ’06: Proceedings of the 2006 Conference on Computer Vision and Pattern Recognition Workshop, page 159, IEEE Computer Society, Washington, DC, USA, 2006.
  30. C.-F. Juang and Kuan-Chun Ku. “A recurrent fuzzy network for fuzzy temporal sequence processing and gesture recognition.” Systems, Man, and Cybernetics, Part B: Cybernetics, IEEE Transactions on, 35(4):646–658, Aug. 2005.
  31. Eng-Jon Ong and Richard Bowden. “A boosted classifier tree for hand shape detection.” In Proceedings of the Sixth IEEE International Conference on Automatic Face and Gesture Recognition, FGR ’04, pages 889-894, IEEE Computer Society, Washington, DC, USA, 2004.
  32. Gaolin Fang, Wen Gao, and Debin Zhao. “Large vocabulary sign language recognition based on hierarchical decision trees.” Proceedings of the 5th international conference on Multimodal interfaces - ICMI ’03, page 125, 2003.
  33. Ming Hsuan Yang, Narendra Ahuja, and Mark Tabb. “Extraction of 2d motion trajectories and its application to hand gesture recognition.” IEEE PAMI, 24(8):1061–1074, 2002.
  34. T. Starner, J. Weaver, and A. Pentland. “Real-time American sign language recognition using desk and wearable computer based video.” IEEE Transactions on Pattern Analysis and Machine Intelligence, 20(12):1371-1375, 1998.
  35. Marco Maisto, Massimo Panella, Luca Liparulo, and Andrea Proietti, “An Accurate Algorithm for the Identification of Fingertips Using an RGB-D Camera” IEEE Journal On Emerging And Selected Topics In Circuits And Systems, Vol. 3, No. 2, June 2013
  36. M. Van den Bergh and L. Van Gool. “Combining RGB and ToF cameras for realtime 3D hand gesture interaction.” In Applications of Computer Vision (WACV), 2011 IEEE Workshop on, pages 66-72, 2011.
  37. Antonis A. Argyros and Manolis I. A. Lourakis. “Binocular hand tracking and reconstruction based on 2d shape matching”, ICPR, pages 207-210, 2006.
  38. H. Nanda and K. Fujimura, “Visual tracking using depth data”. Conference on Computer Vision and Pattern Recognition Workshop, pages 37-37, 2004.
  39. Zhou Ren, Junsong Yuan, Jingjing Meng, and Zhengyou Zhang,” Robust Part-Based Hand Gesture Recognition Using Kinect Sensor”, IEEE Transactions On Multimedia, VOL. 15, NO. 5, AUGUST 2013
  40. Chao Sun, Tianzhu Zhang, Bing-Kun Bao, Changsheng Xu, and Tao Mei, “Discriminative Exemplar Coding for Sign Language Recognition with Kinect” IEEE Transactions On Cybernetics 2013
  41. Cheoljong Yang, Yujeong Jang, Jounghoon Beh, David Han, and Hanseok Ko. “Gesture recognition using depth-based hand tracking for contactless controller application.” In Consumer Electronics (ICCE), 2012 IEEE International Conference, pages 297-298, 2012.
  42. Jagdish L. Raheja, Ankit Chaudhary, and Kunal Singal. “Tracking of fingertips and centers of palm using KINECT”. 2011 Third International Conference on Computational Intelligence Modelling Simulation, pages 248-252, 2011.
  43. José L. Hernández-Rebollar, Robert V. Lindeman, and Nicholas Kyriakopoulos. “A multi-class pattern recognition system for practical finger spelling translation” In Proceedings of the 4th IEEE International Conference on Multimodal Interfaces, pages 185-190, 2002.
Index Terms

Computer Science
Information Sciences

Keywords

Sign language recognition, isolated sign recognition, continuous sign language recognition, 3D tracking system