ATLAS Data Challenges in Grid environment on cyfronet cluster

Abstract

The LHC ATLAS experiment at CERN will produce 1.6 PB of data per year. High Energy Physics analysis techniques additionally require corresponding samples of at least 2 PB of Monte Carlo simulated data. Currently, Monte Carlo test production is performed in steps called Data Challenges. Such production and analysis can be carried out at distributed sites. The computing model should allow for central brokering of jobs and management of huge amounts of data; the Grid environment is a possible solution. The Data Challenges have to prove the reliability and usability of the Grid. The main effort is to use the Grid as 'yet another job submission system'. Some tentative solutions are presented and some weaknesses of the existing software are pointed out. Additionally, perspectives for further development and improvements are indicated. © Springer-Verlag 2004.
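The abstract only names the approach of treating the Grid as a job submission system; as a rough illustration (not taken from the paper), a single Monte Carlo production step could be described in EDG-style JDL and handed to the resource broker with the edg-job-submit command of the EDG/LCG middleware of that period. The script and file names below (run_atlsim.sh, dc1.*.log) are hypothetical placeholders.

```python
#!/usr/bin/env python
# Minimal sketch, assuming EDG/LCG middleware: wrap one simulation step as a
# Grid job described in JDL and submit it via edg-job-submit.
# All script and file names are illustrative, not from the paper.

import subprocess
import tempfile

JDL_TEMPLATE = """\
Executable    = "run_atlsim.sh";
Arguments     = "{run_number}";
StdOutput     = "dc1.{run_number}.log";
StdError      = "dc1.{run_number}.err";
InputSandbox  = {{"run_atlsim.sh"}};
OutputSandbox = {{"dc1.{run_number}.log", "dc1.{run_number}.err"}};
"""

def submit(run_number: str) -> None:
    """Write a JDL description for one production job and pass it to the broker."""
    with tempfile.NamedTemporaryFile("w", suffix=".jdl", delete=False) as jdl:
        jdl.write(JDL_TEMPLATE.format(run_number=run_number))
        jdl_path = jdl.name
    # The resource broker selects a computing element; the job itself is only described.
    subprocess.run(["edg-job-submit", jdl_path], check=True)

if __name__ == "__main__":
    submit("0042")
```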

Citation (APA)

Bołd, T., Kaczmarska, A., & Szymocha, T. (2004). ATLAS Data Challenges in Grid environment on cyfronet cluster. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2970, 103–110. https://doi.org/10.1007/978-3-540-24689-3_13
