Hadoop on a low-budget general purpose HPC cluster in academia


Abstract

In the last decade, we have witnessed increasing interest in High Performance Computing (HPC) infrastructures, which play an important role in both academic and industrial research projects. At the same time, the growing amount of available data has driven the introduction of new frameworks and applications based on the MapReduce paradigm (e.g., Hadoop). Traditional HPC systems are usually designed for CPU- and memory-intensive applications. However, reusing already available HPC infrastructures for data-intensive applications is an interesting topic, particularly in academia, where the budget is usually limited and the same cluster is shared by many researchers with different requirements. In this paper, we investigate the integration of Hadoop, and its performance, into an existing low-budget general purpose HPC cluster characterized by heterogeneous nodes and limited secondary storage per node.
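To illustrate the MapReduce paradigm mentioned above, the following is a minimal, self-contained sketch of its three phases (map, shuffle, reduce) applied to word counting, the canonical MapReduce example. This is a plain-Python illustration of the paradigm, not code from the paper and not the Hadoop API itself; in Hadoop these phases run distributed across cluster nodes, with the shuffle handled by the framework.

```python
from collections import defaultdict

def map_phase(documents):
    """Map step: emit (word, 1) pairs from each input document."""
    for doc in documents:
        for word in doc.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    """Shuffle step: group values by key, as the framework does between map and reduce."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce step: sum the counts emitted for each word."""
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["hadoop on hpc", "hpc in academia"]
counts = reduce_phase(shuffle(map_phase(docs)))
# counts == {"hadoop": 1, "on": 1, "hpc": 2, "in": 1, "academia": 1}
```

Because each map call and each reduce call is independent, the framework can spread them across many nodes, which is what makes the paradigm attractive for data-intensive workloads even on heterogeneous hardware.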

Citation (APA)

Garza, P., Margara, P., Nepote, N., Grimaudo, L., & Piccolo, E. (2014). Hadoop on a low-budget general purpose HPC cluster in academia. In Advances in Intelligent Systems and Computing (Vol. 241, pp. 187–192). Springer Verlag. https://doi.org/10.1007/978-3-319-01863-8_21
