Research Article

Enhancing the Traditional File System to HDFS: A Big Data Solution

by Himani Saraswat, Neeta Sharma, Abhishek Rai
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 167 - Number 9
Year of Publication: 2017
10.5120/ijca2017914367

Himani Saraswat, Neeta Sharma, Abhishek Rai. Enhancing the Traditional File System to HDFS: A Big Data Solution. International Journal of Computer Applications 167, 9 (Jun 2017), 12-14. DOI=10.5120/ijca2017914367

@article{ 10.5120/ijca2017914367,
author = { Himani Saraswat, Neeta Sharma, Abhishek Rai },
title = { Enhancing the Traditional File System to HDFS: A Big Data Solution },
journal = { International Journal of Computer Applications },
issue_date = { Jun 2017 },
volume = { 167 },
number = { 9 },
month = { Jun },
year = { 2017 },
issn = { 0975-8887 },
pages = { 12-14 },
numpages = { 3 },
url = { https://ijcaonline.org/archives/volume167/number9/27799-2017914367/ },
doi = { 10.5120/ijca2017914367 },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Journal Article
%A Himani Saraswat
%A Neeta Sharma
%A Abhishek Rai
%T Enhancing the Traditional File System to HDFS: A Big Data Solution
%J International Journal of Computer Applications
%@ 0975-8887
%V 167
%N 9
%P 12-14
%D 2017
%I Foundation of Computer Science (FCS), NY, USA
Abstract

We are in the twenty-first century, also known as the digital era, in which nearly everything generates data, whether from mobile phones, signals, day-to-day purchases, or many other sources. With this rapid growth in the amount of data, Big Data has become a current and future frontier for researchers. In Big Data analysis, computation is performed on massive data sets to extract intelligent, knowledgeable, and meaningful information, while storage must be readily available to support the concurrent computation process. Hadoop is designed to meet this complex but meaningful workload. HDFS (the Hadoop Distributed File System) is highly fault-tolerant and is designed to be deployed on low-cost hardware. This paper sets out the benefits HDFS offers for large data sets, the HDFS architecture, and its role in Hadoop.
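The fault tolerance mentioned above comes from HDFS splitting each file into fixed-size blocks and replicating every block across several DataNodes, while the NameNode keeps only the block-to-node metadata. The following Python sketch is purely conceptual (it is not HDFS code; the node names and the round-robin placement are illustrative stand-ins for HDFS's rack-aware policy), using the common HDFS defaults of a 128 MB block size and a replication factor of 3:

```python
import math
from itertools import cycle

BLOCK_SIZE = 128 * 1024 * 1024   # common HDFS default block size: 128 MB
REPLICATION = 3                  # common HDFS default replication factor

def place_blocks(file_size, datanodes):
    """Return a NameNode-style mapping: block index -> DataNodes holding a replica."""
    num_blocks = math.ceil(file_size / BLOCK_SIZE)
    nodes = cycle(datanodes)  # round-robin stand-in for HDFS's rack-aware placement
    return {block: [next(nodes) for _ in range(REPLICATION)]
            for block in range(num_blocks)}

# A 1 GB file on a five-node cluster: 8 blocks, each stored on 3 distinct nodes,
# so losing any single low-cost machine still leaves every block recoverable.
layout = place_blocks(1024 * 1024 * 1024, ["dn1", "dn2", "dn3", "dn4", "dn5"])
print(len(layout))    # 8
print(layout[0])      # ['dn1', 'dn2', 'dn3']
```

Because every block exists on multiple machines, the cluster tolerates individual hardware failures without data loss, which is why HDFS can target inexpensive commodity hardware rather than specialized storage.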

References
  1. Tom White, "Hadoop: The Definitive Guide", 4th Edition, O'Reilly, 2015.
  2. Alexandros Labrinidis, H. V. Jagadish, "Challenges and Opportunities with Big Data", Proceedings of the VLDB Endowment, Vol. 5, No. 12, Mar 2012. http://cra.org/ccc/docs/init/bigdatawhitepaper.pdf
  3. Pelle Jakovits, "Reducing Scientific Computing", Master's Thesis, University of Tartu, 2010.
  4. Girish Prasad Patro, "A Novel Approach for Data Encryption in Hadoop", Department of Computer Science and Engineering, National Institute of Technology Rourkela. www.nitrkl.ac
  5. Puneet Singh Duggal, Sanchita Paul, "Big Data Analysis: Challenges and Solutions", International Conference on Cloud, Big Data and Trust, Nov 13-15, 2013, RGPV, Bhopal, India.
  6. Gloria Phillips-Wren, "Business Analytics in the Context of Big Data: A Roadmap for Research", Loyola University Maryland.
  7. Hadoop 2 vs. Hadoop 1 [Image]. http://www.tomsitpro.com/articles/hadoop-2-vs-1,2-718.html
  8. HDFS Architecture Guide [Image]. https://hadoop.apache.org/docs/r1.2.1/hdfs_design.html
Index Terms

Computer Science
Information Sciences

Keywords

Big Data, HDFS, Clusters, Nodes, Hadoop Architecture