Hadoop storage big data layer: Meta-modeling of key concepts and features

12 citations · 15 readers (Mendeley)

Abstract

In the information era, humanity produces huge quantities of data, measured in terabytes or petabytes, that continue to grow exponentially over time. This situation has led to the emergence of a large number of big data systems and technologies that share similar architectures but differ in implementation. The common architecture is composed of the Data Sources, Ingestion, Visualization, Hadoop Platform Management, Hadoop Storage, Hadoop Infrastructure, Security, and Monitoring layers. On the way toward a unified abstract implementation, we proposed in previous work a meta-model for the data sources and ingestion layers. We relied on our earlier comparative studies to define the key concepts of storage in big data and to propose a meta-model for the storage layer. Thus, in this paper, we present our meta-model for the storage layer. The main goal of this universal meta-modeling is to enable big data distribution providers to offer standard, unified solutions for a big data system.
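The layered architecture named in the abstract can be sketched as a simple object model. This is a minimal illustration only: the class names, attributes, and the sample "concerns" for the storage layer are assumptions for the sake of the sketch, not the authors' actual meta-model.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Layer:
    """One layer of the common big data architecture (illustrative sketch)."""
    name: str
    # Sub-concerns are hypothetical examples, not concepts from the paper.
    concerns: List[str] = field(default_factory=list)

# The eight layers of the common architecture, as listed in the abstract.
ARCHITECTURE = [
    Layer("Data Sources"),
    Layer("Ingestion"),
    Layer("Visualization"),
    Layer("Hadoop Platform Management"),
    Layer("Hadoop Storage", concerns=["distributed file systems", "NoSQL stores"]),
    Layer("Hadoop Infrastructure"),
    Layer("Security"),
    Layer("Monitoring"),
]

def layer_names(arch: List[Layer]) -> List[str]:
    """Return the layer names in architectural order."""
    return [layer.name for layer in arch]
```

A meta-model in the paper's sense would abstract over such concrete structures so that different Hadoop distributions instantiate the same layer concepts uniformly.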

Citation (APA)

Erraissi, A., & Belangour, A. (2019). Hadoop storage big data layer: Meta-modeling of key concepts and features. International Journal of Advanced Trends in Computer Science and Engineering, 8(3), 646–653. https://doi.org/10.30534/ijatcse/2019/49832019
