Dealing with Small Files Problem in Hadoop Distributed File System

Bibliographic Details
Published in: Procedia Computer Science, Vol. 79, pp. 1001-1012
Main Authors: Bende, Sachin; Shedge, Rajashree
Format: Journal Article
Language: English
Published: Elsevier B.V., 2016
ISSN: 1877-0509
Description
Summary: The usage of Hadoop has increased greatly in recent years, and its adoption is widespread. Notable large users such as Yahoo, Facebook, Netflix, and Amazon use Hadoop mainly for unstructured data analysis, as the Hadoop framework works well with both structured and unstructured data. The Hadoop Distributed File System (HDFS) is designed for storing large files, but when a large number of small files must be stored, HDFS runs into problems because the metadata for every file is managed by a single server, the NameNode. Various methods have been proposed to deal with the small files problem in HDFS. This paper gives a comparative analysis of the methods that address the small files problem in HDFS.
DOI:10.1016/j.procs.2016.03.127
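
As a rough illustration of the kind of methods the paper compares, the sketch below packs many small local files into a single Hadoop SequenceFile, so the NameNode tracks one large file instead of thousands of small entries. This is a minimal sketch, not the paper's own method; the HDFS target path and the local directory name are hypothetical, and it assumes the Hadoop client libraries are on the classpath.

    import java.io.File;
    import java.nio.file.Files;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.BytesWritable;
    import org.apache.hadoop.io.SequenceFile;
    import org.apache.hadoop.io.Text;

    public class SmallFilePacker {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Hypothetical HDFS destination; one SequenceFile holds many small records.
            Path target = new Path("/data/packed/small-files.seq");

            try (SequenceFile.Writer writer = SequenceFile.createWriter(conf,
                    SequenceFile.Writer.file(target),
                    SequenceFile.Writer.keyClass(Text.class),
                    SequenceFile.Writer.valueClass(BytesWritable.class))) {

                // Hypothetical local directory of small files to pack.
                File[] smallFiles = new File("local-small-files").listFiles();
                if (smallFiles == null) {
                    return; // directory missing or unreadable
                }
                for (File f : smallFiles) {
                    // key = original file name, value = raw file contents
                    byte[] content = Files.readAllBytes(f.toPath());
                    writer.append(new Text(f.getName()), new BytesWritable(content));
                }
            }
        }
    }

The key/value layout keeps the original file names recoverable, and downstream MapReduce jobs can read the packed file with SequenceFileInputFormat instead of opening each small file individually.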