What Is a Qubit?

  • Hughes C
  • Isaacson J
  • Perry A
  • et al.

Abstract

In classical computers, information is represented as the binary digits 0 and 1, called bits. For example, the number 1 in an 8-bit binary representation is written as 00000001, and the number 2 as 00000010. We place extra zeros in front so that every number is written with 8 bits total, which is called one byte. Every classical computer translates these bits into the human-readable information on your electronic device: the document you read or the video you watch is encoded in the computer's binary language in terms of these 1's and 0's. Computer hardware understands the 1-bit as an electrical current flowing through a wire (in a transistor), while the 0-bit is the absence of an electrical current in a wire. These electrical signals can be thought of as "on" (the 1-bit) or "off" (the 0-bit). Your computer then decodes the classical 1 or 0 bits into words, videos, etc.
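The 8-bit encodings mentioned in the abstract can be reproduced in a few lines of Python; this is an illustrative sketch (not part of the chapter) using Python's built-in binary formatting:

```python
# Format small integers as 8-bit (one-byte) binary strings,
# padding with leading zeros as described in the abstract.
for n in (1, 2):
    print(f"{n:08b}")  # 1 -> 00000001, 2 -> 00000010
```

The format specifier `08b` asks for base-2 output zero-padded to a width of 8, matching the one-byte convention in the text.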

Citation (APA)
Hughes, C., Isaacson, J., Perry, A., Sun, R. F., & Turner, J. (2021). What Is a Qubit? In Quantum Computing for the Quantum Curious (pp. 7–16). Springer International Publishing. https://doi.org/10.1007/978-3-030-61601-4_2
