Research Article

Control Robots using Red Hands: A Human-Robot Interaction System using Human Hand Motions

by Mostafa Korashy, Mahmoud Afifi
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 115 - Number 14
Year of Publication: 2015
DOI: 10.5120/20217-2491

Mostafa Korashy, Mahmoud Afifi. Control Robots using Red Hands: A Human-Robot Interaction System using Human Hand Motions. International Journal of Computer Applications 115, 14 (April 2015), 7-11. DOI=10.5120/20217-2491

@article{10.5120/20217-2491,
  author     = {Mostafa Korashy and Mahmoud Afifi},
  title      = {Control Robots using Red Hands: A Human-Robot Interaction System using Human Hand Motions},
  journal    = {International Journal of Computer Applications},
  issue_date = {April 2015},
  volume     = {115},
  number     = {14},
  month      = {April},
  year       = {2015},
  issn       = {0975-8887},
  pages      = {7-11},
  numpages   = {5},
  url        = {https://ijcaonline.org/archives/volume115/number14/20217-2491/},
  doi        = {10.5120/20217-2491},
  publisher  = {Foundation of Computer Science (FCS), NY, USA},
  address    = {New York, USA}
}
Abstract

Human-robot interaction has been an evolving area of research over the past few years. It deals with how humans can interact with, send data to, or receive data from robots. One of the major obstacles in this field is how the robot can obtain depth information about the surrounding objects. A few years ago, Microsoft released a depth sensor that computes depth information using IR rays. Much research has been conducted on controlling robots using depth sensors such as the Microsoft Kinect and Asus Xtion. Although depth sensors are considered low cost, they may be unavailable to many users. In this work, we develop a low-cost system for controlling robots (iRobot) using only a webcam and red markers on the user's hands. Our system requires no extra devices, hardware, or other complex technologies. Experimental results demonstrate that the proposed system compares well with systems based on depth sensors.
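The marker-based control the abstract describes — detect red markers on the hands in a webcam frame, then derive a drive command from their position — can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the function names, RGB thresholds, and the three-zone command mapping are all assumptions.

```python
import numpy as np

def detect_red_marker(frame_rgb, r_min=150, dominance=60):
    """Return the (row, col) centroid of red pixels in an RGB frame, or None.

    A pixel counts as 'red' if its R channel is at least r_min and exceeds
    both G and B by `dominance`. Both thresholds are illustrative values,
    not ones reported in the paper.
    """
    r = frame_rgb[..., 0].astype(int)
    g = frame_rgb[..., 1].astype(int)
    b = frame_rgb[..., 2].astype(int)
    mask = (r >= r_min) & (r - g >= dominance) & (r - b >= dominance)
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None  # no marker visible
    return ys.mean(), xs.mean()

def command_from_position(centroid, frame_width):
    """Map the marker's horizontal position to a coarse drive command
    (hypothetical mapping: left third turns left, right third turns right)."""
    if centroid is None:
        return "stop"
    _, x = centroid
    if x < frame_width / 3:
        return "turn_left"
    if x > 2 * frame_width / 3:
        return "turn_right"
    return "forward"
```

In a live system the frame would come from a webcam capture loop and the command string would be translated into iRobot Create serial opcodes; both are omitted here to keep the sketch self-contained.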

References
  1. Spong M. W. and Fujita M. Control in robotics. The Impact of Control Technology: Overview, Success Stories, and Research Challenges (T. Samad and A. Annaswamy, eds.), IEEE Control Systems Society, 2011.
  2. Monika Jain, Ashwani Lohiya, Aditi, Mohammad Fahad Khan, and Abhishek Maurya. Wireless gesture control robot: An analysis. International Journal of Research in Computer and Communication Engineering, 1(10), 2012.
  3. J Norberto Pires. Robot-by-voice: experiments on commanding an industrial robot using the human voice. Industrial Robot: An International Journal, 32(6):505–511, 2005.
  4. J-B Gomez, Alexànder Ceballos, Flavio Prieto, and Tanneguy Redarce. Mouth gesture and voice command based robot command interface. In Robotics and Automation, 2009. ICRA'09. IEEE International Conference on, pages 333–338. IEEE, 2009.
  5. Liu Zheng, Liu Yuliang, and Bing Zhigang. Vision-based human-computer interaction utilized in the robot control. In Intelligent Systems, 2009. GCIS'09. WRI Global Congress on, volume 2, pages 155–158. IEEE, 2009.
  6. Min-Chi Kao and TS Li. Design and implementation of interaction system between humanoid robot and human hand gesture. In SICE Annual Conference 2010, Proceedings of, pages 1616–1621. IEEE, 2010.
  7. Pradeep Shenoy, Kai J Miller, Beau Crawford, and Rajesh PN Rao. Online electromyographic control of a robotic prosthesis. Biomedical Engineering, IEEE Transactions on, 55(3):1128–1135, 2008.
  8. Zhengyou Zhang. Microsoft Kinect sensor and its effect. MultiMedia, IEEE, 19(2):4–10, 2012.
  9. Michal Tölgyessy and Peter Hubinský. The Kinect sensor in robotics education. In Proceedings of 2nd International Conference on Robotics in Education, pages 143–146, 2011.
  10. Qingyu Li and Panlong Yang. Keep up with me: A gesture guided moving robot with Microsoft Kinect. In Mobile Ad-Hoc and Sensor Systems (MASS), 2013 IEEE 10th International Conference on, pages 435–436. IEEE, 2013.
  11. Guanglong Du, Ping Zhang, Jianhua Mai, and Zeling Li. Markerless Kinect-based hand tracking for robot teleoperation. Int J Adv Robotic Sy, 9(36), 2012.
  12. Wei-chao Chen. Real-Time Palm Tracking and Hand Gesture Estimation Based on Fore-Arm Contour. Master's thesis, Department of Computer Science and Information Engineering, National Taiwan University of Science and Technology, Taipei, Taiwan, 2011.
  13. Hui-Shyong Yeo, Byung-Gook Lee, and Hyotaek Lim. Hand tracking and gesture recognition system for human-computer interaction using low-cost hardware. Multimedia Tools and Applications, pages 1–29, 2013.
  14. Michał Choraś. Ear biometrics based on geometrical feature extraction. Electronic Letters on Computer Vision and Image Analysis, 5(3):84–95, 2005.
  15. Son Lam Phung, Abdesselam Bouzerdoum, and Douglas Chai. A novel skin color model in YCbCr color space and its application to human face detection. In Image Processing, 2002. Proceedings. 2002 International Conference on, volume 1, pages I–289. IEEE, 2002.
Index Terms

Computer Science
Information Sciences

Keywords

HRI, iRobot Create, color detection