Bell’s Theorem, Bell Inequalities, and the “Probability Normalization Loophole”

Abstract

Fifty years ago, in 1964, John Bell [6] showed that deterministic local hidden-variables theories are incompatible with quantum mechanics for idealized systems. Inspired by his paper, Clauser, Horne, Shimony and Holt (CHSH) [12] in 1969 provided the first experimentally testable Bell Inequality and proposed an experiment to test it. That experiment was first performed in 1972 by Freedman and Clauser [20]. In 1974 Clauser and Horne (CH) [13] first showed that all physical theories consistent with “Local Realism” are constrained by an experimentally testable loophole-free Bell Inequality—the CH inequality. These theories were further clarified in 1976–1977 in “An Exchange on Local Beables”, a series of papers by Bell, Shimony, Horne, and Clauser [8], and by Clauser and Shimony (CS) [15] in their 1978 review article. In 2013, nearly fifty years after Bell’s original 1964 paper [6], two groups, Giustina et al. [24] and Christensen et al. [11], finally tested the loophole-free CH inequality. Clauser and Shimony (CS) [15] also showed that the CHSH inequality is testable in a loophole-free manner by using a “heralded” source. It was first tested this way by Rowe et al. [35] in 2001, and more convincingly in 2008 by Matsukevich et al. [33]. To violate a Bell Inequality and thereby disprove Local Realism, one must experimentally examine a two-component entangled-state system, in a configuration analogous to a Gedankenexperiment first proposed by Bohm [9] in 1951. To be usable, the configuration must generate a normalized coincidence rate with a large-amplitude sinusoidal dependence upon adjustable apparatus settings. Proper normalization of this amplitude is critical for the avoidance of counterexamples and loopholes that can possibly invalidate the test. The earliest tests used the CHSH inequality without source heralding. The first method for normalizing coincidence rates without heralding was proposed by CHSH [12] in 1969.
It consists of an experimental protocol in which coincidence rates measured with polarizers removed are used to normalize coincidence rates measured with polarizers inserted. Very high transmission polarizers are required when using this method. Highly reasonable and very weak supplementary assumptions by CHSH and by CH allow this protocol to work in a nearly loophole-free manner. A second method for normalizing coincidence rates was offered by Garuccio and Rapisarda [22] in 1981. As will be discussed below, it allows experiments to be done more easily, but at a significant cost to the generality of their results. It was first used in the experiment by Aspect, Grangier, and Roger [3] in 1982. It uses “ternary-result” apparatuses and allows the use of highly absorbing polarizers, which would not work with other normalization methods. It normalizes using a sum of coincidence rates. Gerhardt et al. [23] in 2011 theoretically and experimentally demonstrated counterexamples for tests that use this normalization method. Their experiments thus confirm the validity of these counterexamples, and further indicate that very high transmission polarizers are necessary for convincing tests to be performed.
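For reference, the two inequalities named in the abstract take the following standard forms (the notation is supplied here for illustration, not taken from the abstract): the CHSH inequality constrains correlation functions E measured at analyzer settings a, a′ and b, b′, while the loophole-free CH inequality constrains detection probabilities directly.

\[
\left| E(a,b) - E(a,b') \right| + \left| E(a',b) + E(a',b') \right| \le 2
\]

\[
-1 \;\le\; p_{12}(a,b) - p_{12}(a,b') + p_{12}(a',b) + p_{12}(a',b') - p_{1}(a') - p_{2}(b) \;\le\; 0
\]

Here \(p_{12}\) denotes a coincidence-detection probability and \(p_1\), \(p_2\) single-detector probabilities; because the CH form involves only directly measured probabilities, it requires no separate normalization protocol, which is why its 2013 tests are described above as loophole-free.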

Citation (APA)
Clauser, J. F. (2017). Bell’s Theorem, Bell Inequalities, and the “Probability Normalization Loophole.” In Frontiers Collection (Vol. Part F919, pp. 451–484). Springer VS. https://doi.org/10.1007/978-3-319-38987-5_28
