From photons to big-data applications: Terminating terabits


Abstract

Computer architectures have entered a watershed as the quantity of network data generated by user applications exceeds the data-processing capacity of any individual computer end-system. It will become impossible to scale existing computer systems while a gap grows between the quantity of networked data and the capacity for per-system data processing. Despite this, the growth in demand, in both task variety and task complexity, continues unabated. Networked computer systems provide a fertile environment in which new applications develop. As networked computer systems become akin to infrastructure, any limitation upon the growth in capacity and capabilities becomes an important constraint of concern to all computer users. Taking a networked computer system capable of processing terabits per second as a benchmark for scalability, we critique the state of the art in commodity computing, and propose a wholesale reconsideration of the design of computer architectures and their attendant ecosystem. Our proposal seeks to reduce costs, save power and increase performance in a multi-scale approach that has potential application from nanoscale to data-centre-scale computers.

Citation (APA)

Zilberman, N., Moore, A. W., & Crowcroft, J. A. (2016). From photons to big-data applications: Terminating terabits. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 374(2062). https://doi.org/10.1098/rsta.2014.0445
