Von Neumann normalisation of a quantum random number generator

Abstract

In this paper we study von Neumann un-biasing normalisation for ideal and real quantum random number generators, operating on finite strings or infinite bit sequences. In the ideal cases one can obtain the desired un-biasing; this relies critically on the independence of the source, a notion we rigorously define for our model. In real cases, affected by imperfections in measurement and hardware, one cannot achieve true un-biasing, but if the bias 'drifts sufficiently slowly' the result can be made arbitrarily close to un-biased. For infinite sequences, normalisation can either increase or decrease the (algorithmic) randomness of the generated sequences. A successful application of von Neumann normalisation (in fact, of any un-biasing transformation) does exactly what it promises: it removes bias, which is only one among infinitely many symptoms of randomness; it will not produce 'true' randomness.
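
For concreteness, the pair-discarding procedure known as von Neumann normalisation can be sketched in a few lines of Python. The sketch below assumes the ideal case the abstract describes, an independent source with a fixed (unknown) bias; the function name and the simulated biased source are illustrative, not taken from the paper.

```python
from typing import Iterable, Iterator

def von_neumann_normalise(bits: Iterable[int]) -> Iterator[int]:
    """Yield un-biased output bits from pairs of input bits.

    For an independent source with fixed bias p, the pairs 01 and 10
    occur with equal probability p(1-p), so emitting 0 for 01 and 1
    for 10 gives un-biased output; the pairs 00 and 11 are discarded.
    """
    it = iter(bits)
    for a in it:
        b = next(it, None)
        if b is None:
            return  # odd trailing bit: nothing to pair it with
        if a != b:
            yield a  # 01 -> 0, 10 -> 1

# Illustration: a heavily biased source still yields roughly un-biased output.
import random
biased = (1 if random.random() < 0.8 else 0 for _ in range(100_000))
out = list(von_neumann_normalise(biased))
print(len(out), sum(out) / len(out))  # output rate ~ 2p(1-p); mean close to 0.5
```

Note that this un-biasing argument breaks down exactly where the abstract says it does: if the bias drifts between the two bits of a pair, as in a real, imperfect device, then 01 and 10 are no longer equally likely and the output is only approximately un-biased.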

Cite

APA

Abbott, A. A., & Calude, C. S. (2012). Von Neumann normalisation of a quantum random number generator. Computability, 1(1), 59–83. https://doi.org/10.3233/COM-2012-001
