A conceptual framework for big data implementation to handle large volume of complex data


Abstract

Industries, businesses, individuals, and governments around the world produce and consume vast amounts of data daily. It has become challenging for the IT world to deal with the variety and velocity of such large volumes of data. To overcome these bottlenecks, Big Data plays a major role in capturing, organizing, and analyzing data in innovative and faster ways. Big Data software and services foster organizational growth by generating value and ideas from voluminous, fast-moving, and heterogeneous data, and by enabling an entirely new Information Technology (IT) ecosystem that was not possible before. These ideas and values are derived from advanced data analysis on top of IT servers, system architecture, networks, and the virtualization of physical objects. In this research paper, the authors present a conceptual framework for solving problems that require processing huge volumes of data using different Big Data technology stacks. The proposed model provides a solution through capturing data, organizing data, analyzing data, and finally creating value and decisions for the concerned stakeholders.
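The four-stage flow the abstract describes (capture, organize, analyze, then derive value/decisions) can be sketched as a simple pipeline. This is a minimal illustrative sketch only; the function names, toy sensor data, and threshold rule are assumptions for demonstration and do not come from the paper's framework or technology stack.

```python
# Hypothetical sketch of a capture -> organize -> analyze -> decide pipeline.
# All names and the toy data below are illustrative assumptions.

def capture(sources):
    """Gather raw records from heterogeneous sources (here, in-memory lists)."""
    return [record for source in sources for record in source]

def organize(records):
    """Normalize (sensor_id, value) records into a structure keyed by sensor."""
    organized = {}
    for sensor_id, value in records:
        organized.setdefault(sensor_id, []).append(value)
    return organized

def analyze(organized):
    """Compute a simple per-sensor average as a stand-in for real analytics."""
    return {sid: sum(vals) / len(vals) for sid, vals in organized.items()}

def decide(analytics, threshold):
    """Flag sensors whose average exceeds a threshold, for stakeholders."""
    return {sid: avg > threshold for sid, avg in analytics.items()}

if __name__ == "__main__":
    sources = [[("s1", 10), ("s2", 50)], [("s1", 30)]]
    result = decide(analyze(organize(capture(sources))), threshold=25)
    print(result)  # s1 average is 20 -> False, s2 average is 50 -> True
```

In a real Big Data deployment, each stage would be backed by distributed components (e.g. ingestion, storage, and analytics layers) rather than in-memory functions, but the staged shape of the framework stays the same.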

Citation (APA)

Sanyal, M. K., Bhadra, S. K., & Das, S. (2016). A conceptual framework for big data implementation to handle large volume of complex data. In Advances in Intelligent Systems and Computing (Vol. 433, pp. 455–465). Springer Verlag. https://doi.org/10.1007/978-81-322-2755-7_47
