The basic goal of data analysis is to establish a link between a set of measurements, in the form of electronically stored data in some format, and a theoretical model, which is intended to describe the phenomena at the origin of these measurements and is usually summarized by a set of equations with some parameters. The key elements of data analysis are data abstraction and data reduction. Abstraction means that the original set of raw measurements, e.g., a collection of electronic pulses induced by a particle passing through a detector, is converted (“reconstructed”) into physical quantities and properties that can be assigned to the particle, such as its momentum or its energy. Typically, this process is accompanied by data reduction, i.e., the overall data volume is reduced when going from the original set of measurements to a compilation of reconstructed physical quantities. In this chapter, I will describe the steps involved in achieving the abovementioned data abstraction and reduction in the case of Particle Physics experiments. Examples will be given for measurements carried out at e+e- as well as hadron colliders. The basic concepts behind reconstruction algorithms, such as track finding in tracking detectors and energy measurements in calorimeters, will be discussed, along with higher-level algorithms such as particle-jet reconstruction. Finally, the typical software and computing environment of large collider experiments, which is necessary to achieve the outlined goals of data analysis, will be described.
Dissertori, G. (2012). Data analysis. In Handbook of Particle Detection and Imaging (pp. 84–101). Springer Berlin Heidelberg. https://doi.org/10.1007/978-3-642-13271-1_4