Massive Semantic Web data compression with MapReduce


Abstract

The Semantic Web consists of many billions of statements made of terms that are either URIs or literals. Since these terms usually consist of long sequences of characters, an effective compression technique must be used to reduce the data size and increase application performance. One of the best-known techniques for data compression is dictionary encoding. In this paper we propose a MapReduce algorithm that efficiently compresses and decompresses a large amount of Semantic Web data. We have implemented a prototype using the Hadoop framework and report an evaluation of its performance. The evaluation shows that our approach efficiently compresses a large amount of data and that it scales linearly with both the input size and the number of nodes. Copyright 2010 ACM.
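The core idea behind dictionary encoding is simple: each unique term (URI or literal) is assigned a compact numeric ID, and statements are stored as tuples of IDs rather than long strings. The sketch below illustrates this on a single machine; it is only a minimal, sequential illustration of the technique, not the paper's distributed MapReduce algorithm, and the example URIs are invented for demonstration.

```python
# Minimal single-machine sketch of dictionary encoding for RDF-like triples.
# The paper's contribution is a MapReduce version of this idea that builds
# the term dictionary and rewrites statements in a distributed fashion.

def compress(statements):
    """Replace each term with a numeric ID; return encoded triples and the dictionary."""
    dictionary = {}  # term -> integer ID
    encoded = []
    for triple in statements:
        ids = []
        for term in triple:
            if term not in dictionary:
                dictionary[term] = len(dictionary)  # assign next free ID
            ids.append(dictionary[term])
        encoded.append(tuple(ids))
    return encoded, dictionary

def decompress(encoded, dictionary):
    """Invert the dictionary and restore the original terms."""
    inverse = {v: k for k, v in dictionary.items()}
    return [tuple(inverse[i] for i in triple) for triple in encoded]

# Hypothetical example data (not from the paper):
triples = [
    ("<http://example.org/alice>", "<http://xmlns.com/foaf/0.1/knows>", "<http://example.org/bob>"),
    ("<http://example.org/bob>", "<http://xmlns.com/foaf/0.1/knows>", "<http://example.org/alice>"),
]
encoded, dictionary = compress(triples)
assert decompress(encoded, dictionary) == triples
```

Because each long term string is stored once in the dictionary and every occurrence elsewhere is a small integer, the savings grow with term repetition, which is pervasive in Semantic Web data.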


Citation (APA)

Urbani, J., Maassen, J., & Bal, H. (2010). Massive Semantic Web data compression with MapReduce. In HPDC 2010 - Proceedings of the 19th ACM International Symposium on High Performance Distributed Computing (pp. 795–802). https://doi.org/10.1145/1851476.1851591
