Cost analysis for big geospatial data processing in public cloud providers


Abstract

Cloud computing is a suitable platform for running applications that process large volumes of data. With the ongoing growth of geographic and spatial data volumes, a trend known as Big Geospatial Data, several tools have been developed to process these data efficiently. This work presents a cost-efficient method for processing geospatial data that optimizes the number of data nodes in a SpatialHadoop cluster according to dataset size. With this method, it is possible to analyse and compare the costs of this type of application across public cloud providers.
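The abstract describes sizing a cluster by dataset volume to reduce cost. As a minimal illustrative sketch (not the paper's actual model), one can pick the smallest node count whose estimated runtime meets a deadline, since per-hour billing makes the fewest sufficient nodes the cheapest option; the throughput, setup time, and price figures below are assumptions, not values from the paper.

```python
# Hypothetical cost model for choosing the number of data nodes in a
# pay-per-use cluster. All constants are illustrative assumptions.

def estimated_runtime_hours(dataset_gb: float, nodes: int,
                            gb_per_node_hour: float = 50.0,
                            setup_hours: float = 0.1) -> float:
    """Assume throughput scales linearly with nodes, plus a fixed setup time."""
    return setup_hours + dataset_gb / (gb_per_node_hour * nodes)

def cluster_cost(dataset_gb: float, nodes: int,
                 price_per_node_hour: float = 0.20) -> float:
    """Total cost = nodes * hourly price per node * estimated runtime."""
    return nodes * price_per_node_hour * estimated_runtime_hours(dataset_gb, nodes)

def nodes_for_deadline(dataset_gb: float, deadline_hours: float,
                       max_nodes: int = 64) -> int:
    """Smallest node count whose estimated runtime fits the deadline."""
    for n in range(1, max_nodes + 1):
        if estimated_runtime_hours(dataset_gb, n) <= deadline_hours:
            return n
    return max_nodes

# Example: a 1000 GB dataset with a 2-hour deadline.
n = nodes_for_deadline(1000.0, 2.0)
print(n, round(cluster_cost(1000.0, n), 2))
```

Under these assumed parameters, an analogous calculation could be repeated with each provider's instance prices to compare costs, which is the kind of analysis the paper performs with its own measured model.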

Citation (APA)
Bachiega, J., Reis, M. S., Araújo, A. P. F., & Holanda, M. (2018). Cost analysis for big geospatial data processing in public cloud providers. In Communications in Computer and Information Science (Vol. 864, pp. 223–236). Springer Verlag. https://doi.org/10.1007/978-3-319-94959-8_12
