Research Article

Sign Energy Images for Recognition of Sign Language at Sentence Level

by Chethana Kumara B.M., Nagendraswamy H.S.
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 139 - Number 2
Year of Publication: 2016
Authors: Chethana Kumara B.M., Nagendraswamy H.S.
DOI: 10.5120/ijca2016909118

Chethana Kumara B.M., Nagendraswamy H.S. Sign Energy Images for Recognition of Sign Language at Sentence Level. International Journal of Computer Applications 139, 2 (April 2016), 44-51. DOI=10.5120/ijca2016909118

@article{10.5120/ijca2016909118,
  author = {Chethana Kumara B.M. and Nagendraswamy H.S.},
  title = {Sign Energy Images for Recognition of Sign Language at Sentence Level},
  journal = {International Journal of Computer Applications},
  issue_date = {April 2016},
  volume = {139},
  number = {2},
  month = {April},
  year = {2016},
  issn = {0975-8887},
  pages = {44-51},
  numpages = {9},
  url = {https://ijcaonline.org/archives/volume139/number2/24466-2016909118/},
  doi = {10.5120/ijca2016909118},
  publisher = {Foundation of Computer Science (FCS), NY, USA},
  address = {New York, USA}
}
%0 Journal Article
%A Chethana Kumara B.M.
%A Nagendraswamy H.S.
%T Sign Energy Images for Recognition of Sign Language at Sentence Level
%J International Journal of Computer Applications
%@ 0975-8887
%V 139
%N 2
%P 44-51
%D 2016
%I Foundation of Computer Science (FCS), NY, USA
Abstract

In this paper, the task of sign language recognition at the sentence level is addressed. The idea of a Sign Energy Image (SEI) and a method of extracting Fuzzy-Gaussian Local Binary Pattern (FzGLBP) features from the SEI to characterize a sign are explored. The suitability of interval-valued symbolic data for efficient representation of signs in the knowledge base is studied. A Chi-square proximity measure is used to establish matching between reference and test signs, and a simple nearest neighbor classification technique is used to recognize signs. Extensive experiments are conducted to study the efficacy of the proposed system, and a database of signs called UoM-ISL is created for experimental analysis.
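The abstract outlines a pipeline of SEI construction, feature extraction, Chi-square matching and nearest neighbor classification. The following is a minimal Python sketch of that pipeline, not the authors' implementation: it assumes sign videos are already available as binarized silhouette frames, it omits the FzGLBP feature extraction and the interval-valued symbolic representation, and all function names and parameters are illustrative assumptions.

```python
import numpy as np

def sign_energy_image(frames, threshold=0.5):
    # Average the binarized silhouette frames of one sign video clip.
    # `frames` is assumed to have shape (T, H, W) with values in [0, 1];
    # the result is an H x W map whose intensity reflects how often each
    # pixel is active over the sequence (the SEI idea).
    frames = np.asarray(frames, dtype=np.float64)
    return (frames >= threshold).astype(np.float64).mean(axis=0)

def chi_square_distance(x, y, eps=1e-10):
    # Chi-square proximity between two non-negative feature vectors.
    x = np.asarray(x, dtype=np.float64)
    y = np.asarray(y, dtype=np.float64)
    return 0.5 * np.sum((x - y) ** 2 / (x + y + eps))

def nearest_neighbor_label(test_feature, reference_features, labels):
    # Assign the test sign the label of its closest reference sign.
    distances = [chi_square_distance(test_feature, r) for r in reference_features]
    return labels[int(np.argmin(distances))]
```

As a usage sketch, `nearest_neighbor_label(sign_energy_image(test_frames).ravel(), [s.ravel() for s in reference_seis], sentence_labels)` would label a test sequence with the sentence of its nearest reference SEI; in the paper the comparison is made on FzGLBP features and interval-valued symbolic representations rather than on raw SEI pixels.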

References
  1. O. Al-Jarrah, A. Halawani, Recognition of gestures in Arabic sign language using neuro-fuzzy systems, Artif. Intell. 133 (1-2) 117–138, 2001.
  2. M. Al-Roussan, M. Hussain, Automatic recognition of Arabic sign language finger spelling, Int. J. Comput. Appl. (IJCA) 8 (2) 80–88 (Special issue on Fuzzy Systems), 2001.
  3. M. Al-Roussan, K. Assaleh, A. Talaa, Video-based signer-independent Arabic sign language recognition using hidden Markov models, Appl. Soft Comput. 9, 990–999, 2009.
  4. O. Aran, T. Burger, A. Caplier, L. Akarun, A belief-based sequential fusion approach for fusing manual signs and non-manual signals, Pattern Recognit. 42 812–822, 2009.
  5. Aryanie, D. and Heryadi, Y. American sign language-based finger-spelling recognition using k-Nearest Neighbors classifier. 3rd International Conference on Information and Communication Technology (ICoICT ), 2015.
  6. K. Assaleh, T. Shanableh, M. Fanaswala, F. Amin and H. Bajaj. Continuous Arabic Sign Language Recognition in User Dependent Mode. Journal of Intelligent Learning Systems and Applications, Vol. 2 No. 1, pp. 19-27, 2010.
  7. B. Bauer, H. Hienz, Relevant features for video-based continuous sign language recognition, in: FG00 Proceedings of the Fourth IEEE International Conference on Automatic Face and Gesture Recognition, pp. 440-445, 2000.
  8. B. Bauer, K.F. Kraiss, Video-based sign recognition using self-organizing subunits. Proceedings of the 16th International Conference on Pattern Recognition, pp. 434–437, 2002.
  9. R. Bowden, D. Windridge, T. Kadir, A. Zisserman, M. Brady, A linguistic feature vector for the visual interpretation of sign language, in: Proceedings of the Eighth European Conference on Computer Vision, pp. 391–401,2004.
  10. Brashear, H., Starner, T., Lukowicz, P., & Junker, H. Using multiple sensors for mobile sign language recognition. Georgia Institute of Technology,2003.
  11. C-C. Chang, J.J. Chen, W.K. Tai, C.C. Han, New approach for static gesture recognition, J. Inf. Sci. Eng. 22,1047–1057,2006.
  12. H. Cooper, B. Holt, R. Bowden, Sign language recognition, Chapter in Visual Analysis of Humans: Looking at People, pp. 539–562, 2011.
  13. Daniel Kelly, John McDonald, Charles Markham. A person independent system for recognition of hand postures used in sign language. Pattern Recognition Letters 31, 1359–1368, 2010.
  14. César Roberto de Souza and Ednaldo Brigante Pizzolato. Sign language recognition with support vector machines and hidden conditional random fields: going from fingerspelling to natural articulated words. MLDM'13 Proceedings of the 9th international conference on Machine Learning and Data Mining in Pattern Recognition. Pages 84-98,2013.
  15. Djamila Dahmani and Slimane Larabi. User-independent system for sign language finger spelling recognition. Journal of Visual Communication and Image Representation 25(5), 1240–1250, July 2014.
  16. P. Dreuw, D. Stein, T. Deselaers, D. Rybach, M. Zahedi, H. Ney, Spoken language processing techniques for sign language recognition and translation, Technol. Disability 20, 2008.
  17. Ebling, S., Wolfe, R., Schnepp, J., Baowidan, S., McDonald, J., Moncrief, R., ... & Tissi, K. Synthesizing the finger alphabet of Swiss German Sign Language and evaluating the comprehensibility of the resulting animations. In 6th Workshop on Speech and Language Processing for Assistive Technologies (SLPAT) (p. 10), September, 2015.
  18. The American sign language hand shape dictionary, Gallaudet University,1998.
  19. W. Gao, G.L. Fang, D.B. Zhao, Y.Q.A. Chen, A Chinese sign language recognition system based on SOFM/SRN/HMM, Pattern Recognit. 37,2389–2402, 2004
  20. Gasparini, Francesca, and Raimondo Schettini. "Skin segmentation using multiple thresholding." Electronic Imaging 2006. International Society for Optics and Photonics, 2006.
  21. Gowda, K. Chidananda, and Edwin Diday.: Symbolic clustering using a new dissimilarity measure. Pattern Recognition, 24.6: 567-578, 1991.
  22. L. Gu, J. Su, Natural hand posture classification based on Zernike moments and hierarchical classifier, IEEE Int. Conf. Robotics Autom. 3088–3093, 2008.
  23. Zhenhua Guo, Lei Zhang, David Zhang. Rotation invariant texture classification using LBP variance (LBPV) with global matching. Pattern Recognition 43, pp. 706–719, 2010.
  24. Guru, D. S., and Suraj, M. G. Recognition of postal codes from fingerspelling video sequence. International Journal of Image and Graphics. 2009.
  25. Guru, D. S., and H. S. Nagendraswamy.: Clustering of interval-valued symbolic patterns based on mutual similarity value and the concept of k-mutual nearest neighborhood. Computer Vision–ACCV 2006. Springer Berlin Heidelberg,. 234-243, 2006.
  26. Handouyahia, M., Ziou, D., Wang, S., Sign language recognition using moment-based size functions. In: Proc. Intl. Conf. on Vision Interface, pp.210–216, 1999.
  27. Eun-Jung Holden, Gareth Lee, and Robyn Owens. Australian sign language recognition. Mach. Vision Appl. 16, 5, 312-320, December 2005.
  28. Nagendraswamy, H. S., Chethana Kumara B M., Guru, D. S., & Naresh, Y. G. Symbolic Representation of Sign Language at Sentence Level. IJIGSP, 9, 49-60. DOI: 10.5815/ijigsp.2015.09.07, 2015.
  29. Ignazio Infantino, Riccardo Rizzo, and Salvatore Gaglio. A framework for sign language sentence recognition by commonsense context. IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews 37(5), 1034-1039, 2007.
  30. A. Just, Y. Rodriguez, S. Marcel, Hand posture classification and recognition using the modified census transform, in: 7th Internat. Conf. on Automatic Face and Gesture Recognition, FGR. Pp. 351–356, 2006.
  31. M.W. Kadous, Machine recognition of Australian signs using Powergloves: Towards large-lexicon recognition of sign languages, in: Workshop on the Integration of Gestures in Language and Speech, Wilmington Delaware, 1996.
  32. Kong W W., Surendra Ranganath. Towards subject independent continuous sign language recognition: A segment and merge approach. Pattern Recognition 47 (2014) 1294–1308, 2014.
  33. D. Kelly, J. McDonald, C. Markham, A person independent system for recognition of hand postures used in sign language, Pattern Recognit. Lett.31, 1359–1368, 2010.
  34. J.S. Kim, W. Jang, Z. Bien, A dynamic gesture recognition system for the Korean sign language (KSL), IEEE Trans. Syst. Man Cybern. 26 (2) 354–359, 1996.
  35. Kindiroglu, A. A., Yalcin, H., Aran, O., Hrúz, M., Campr, P., Akarun, L., & Karpov, A. Automatic recognition fingerspelling gestures in multiple languages for a communication interface for the disabled. Pattern Recognition and Image Analysis, 22(4), 527-536, 2012.
  36. Wun-Guang Liou and Chung-Yang Hsieh and Wei-Yang Lin. Trajectory-based sign language recognition using Discriminant Analysis in higher-dimensional feature space. IEEE International Conference on Multimedia and Expo (ICME), pp1-4,2011.
  37. Mohan kumar H P and H S Nagendraswamy. Change Energy Image for Gait Recognition: An approach based on symbolic representation. Int. J. Image Graphics Signal Proc. (IJIGSP) 6 (4) 1-8, 2014.
  38. Sylvie C.W. Ong and Surendra Ranganath. Automatic Sign Language Analysis: A Survey and the Future beyond Lexical Meaning. IEEE transactions on pattern analysis and machine intelligence, vol. 27, no. 6, june 2005.
  39. S.P. Priyal, P.K. Bora, A study on static hand gesture recognition using moments, in: IEEE International Conference on Signal Processing and Communications (SPCOM), pp. 1–5, 2010.
  40. Ross, T. J.: Fuzzy logic with engineering applications. John Wiley & Sons. 2009.
  41. T. Starner, A. Pentland, Real-time American sign language recognition from video using hidden Markov models, in: AAAI Fall Symposium on Disabilities, Cambridge, MA, 1996.
  42. Suraj and Guru D S. Secondary diagonal FLD for fingerspelling recognition. International Conference on Computing: Theory and Applications, ICCTA'07, 2007.
  43. Tolba, M. F., Ahmed Samir, and Magdy Aboul-Ela. Arabic sign language continuous sentences recognition using PCNN and graph matching. Neural Computing and Applications 23(3-4), 999-1010, 2013.
  44. J. Triesch, C. von der Malsburg. Classification of hand postures against complex backgrounds using elastic graph matching, Image Vision Comput. 20 (13-14), 937–943, 2002.
  45. Nagendraswamy, H. S., BM Chethana Kumara, and R. Lekha Chinmayi. "GIST Descriptors for Sign Language Recognition: An Approach Based on Symbolic Representation." Mining Intelligence and Knowledge Exploration. Springer International Publishing, 2015. 103-114.
Index Terms

Computer Science
Information Sciences

Keywords

Fuzzy Gaussian LBP, Interval valued features, Sign Energy Image, Sign language, Video sequence