Research Article

Survey Paper on Hadoop- using a Biometric Technique “Iris Recognition”

by Umesh S. Soni, Yogesh Wagh, Silkesha Thigale
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 114 - Number 9
Year of Publication: 2015
DOI: 10.5120/20005-1944

Umesh S. Soni, Yogesh Wagh, Silkesha Thigale. Survey Paper on Hadoop- using a Biometric Technique “Iris Recognition”. International Journal of Computer Applications. 114, 9 (March 2015), 11-13. DOI=10.5120/20005-1944

@article{10.5120/20005-1944,
author = {Umesh S. Soni and Yogesh Wagh and Silkesha Thigale},
title = {Survey Paper on Hadoop- using a Biometric Technique “Iris Recognition”},
journal = {International Journal of Computer Applications},
issue_date = {March 2015},
volume = {114},
number = {9},
month = {March},
year = {2015},
issn = {0975-8887},
pages = {11-13},
numpages = {3},
url = {https://ijcaonline.org/archives/volume114/number9/20005-1944/},
doi = {10.5120/20005-1944},
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Journal Article
%A Umesh S. Soni
%A Yogesh Wagh
%A Silkesha Thigale
%T Survey Paper on Hadoop- using a Biometric Technique “Iris Recognition”
%J International Journal of Computer Applications
%@ 0975-8887
%V 114
%N 9
%P 11-13
%D 2015
%I Foundation of Computer Science (FCS), NY, USA
Abstract

Iris recognition is an automated method of biometric identification. Most current iris-comparison systems use sequential or simple parallel execution, but when the iris dataset is large, i.e. Big Data, they exhibit deficiencies in speed, in the complexity of dividing the data, in handling large volumes of data, and in robustness. We therefore implement the iris recognition process using an open-source technology known as Hadoop. Hadoop is based on MapReduce, the most popular programming model for handling big data. Hadoop also provides a dedicated library for handling large numbers of images, the Hadoop Image Processing Interface (HIPI), which can be used to implement the proposed system. The Hadoop Distributed File System (HDFS) handles large datasets by breaking them into blocks and replicating the blocks across the machines in a cluster. Template comparison is then carried out independently on different blocks of data by the various machines in parallel. The Map/Reduce programming model is used for processing these large datasets: Map/Reduce processes the data in <key, value> format, and the iris database is stored in text format. HIPI is a library for Hadoop's MapReduce framework that provides an API for performing image-processing tasks in a distributed computing environment.
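The map/reduce-style template comparison described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the iris codes, the Hamming-distance matcher, and the function names are all assumptions chosen to show how a "map" step scores each stored template against a probe as (id, distance) key/value pairs, and a "reduce" step folds those pairs down to the best match.

```python
# Hypothetical sketch of parallel iris-template comparison in MapReduce style.
# Iris templates are modeled here as fixed-length bit strings stored as text,
# matching the abstract's note that the iris database is kept in text format.
from functools import reduce

def hamming_distance(a: str, b: str) -> int:
    """Number of differing bits between two equal-length iris codes."""
    return sum(x != y for x, y in zip(a, b))

def map_phase(probe: str, database: dict) -> list:
    """'Map' step: emit one (template_id, distance) pair per stored template.
    On a real cluster each block of the database would be mapped on a
    different machine in parallel."""
    return [(tid, hamming_distance(probe, code)) for tid, code in database.items()]

def reduce_phase(pairs: list) -> tuple:
    """'Reduce' step: fold the scored pairs down to the closest match."""
    return reduce(lambda best, cur: cur if cur[1] < best[1] else best, pairs)

if __name__ == "__main__":
    database = {                 # toy text-format iris "database" (invented data)
        "user_a": "1010110010",
        "user_b": "1110000011",
        "user_c": "1010110110",
    }
    probe = "1010110111"         # freshly captured iris code (invented)
    best_id, dist = reduce_phase(map_phase(probe, database))
    print(best_id, dist)         # closest enrolled template and its bit distance
```

In an actual Hadoop deployment the map and reduce functions would run as MapReduce tasks over HDFS blocks, with HIPI supplying the image I/O; the single-process version above only mirrors the data flow.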

References
  1. Apache Hadoop. http://hadoop.apache.org/
  2. Earnst, J. (n.d.). Iris Recognition Homepage. Retrieved February 15, 2005, from http://www.iris-recognition.org/
  3. Weikuan Yu, Yandong Wang, and Xinyu Que, "Design and Evaluation of Network-Levitated Merge for Hadoop Acceleration," IEEE Transactions on Parallel and Distributed Systems, Vol. 25, No. 3, March 2014.
  4. I. Tomasic, A. Rashkovska, and M. Depolli (Jozef Stefan Institute, Department of Communications Systems, Ljubljana, Slovenia), "Using Hadoop MapReduce in a Multicluster Environment," MIPRO 2013, Opatija, Croatia, May 20-24, 2013.
  5. Chris Sweeney, Liu Liu, Sean Arietta, and Jason Lawrence (University of Virginia), "HIPI: A Hadoop Image Processing Interface for Image-based MapReduce Tasks."
  6. Konstantin Shvachko, Hairong Kuang, Sanjay Radia, and Robert Chansler (Yahoo!, Sunnyvale, California, USA), "The Hadoop Distributed File System," 2010.
Index Terms

Computer Science
Information Sciences

Keywords

Hadoop, HDFS, MapReduce, Iris Recognition, HIPI, parallel image search