We now know that quantum mechanics has been a fundamental structure of our world since the universe came into being. However, only a century has passed since the experimental and theoretical discoveries of quantum mechanics were made, and we are becoming increasingly aware of its many implications and applications. In particular, there are implications across many disciplines that will most likely affect education, health, and security. Examples are given of the need to start quantum education as early as possible in schools, the use of nano-robots to deliver drugs to specific molecular sites, and the development of new cryptographic systems to safeguard our privacy.
1. Introduction
As seen by the differences between two reviews of this chapter, in the space permitted and without introducing equations, it is likely not possible to make this presentation both easily readable and detailed for most readers. The choice of examples is the author's, and care has been taken to offer readability and depth at a level that conveys the importance of the Quantum World to all readers.
While some attribute the birth of computers to Charles Babbage circa 1837, since he designed his Analytical Engine with a rather full set of instructions, that machine was never actually constructed (see https://en.wikipedia.org/wiki/Analytical_Engine).
The history of computers is relatively new, less than 90 years old, within the lifetime of some readers of this book! Clearly, humans have been quite busy during this time, developing powerful computers to replace pen/pencil and paper.
An outline of the short (in time only) history of computers includes:
- The mathematical foundations of computing are generally attributed to Turing in 1936 (Turing, 1937).
- The first computer is usually considered to be the ENIAC, completed in 1946.
- The physical-mathematical foundations of quantum computing are generally attributed to Benioff in 1980 (Benioff, 1980).
- The initial history of quantum computers per se is usually attributed to Feynman in 1982 (R. Feynman, 1982).
- The first quantum algorithm with far-reaching implications is mostly attributed to Shor in 1994 (Shor, 1994).
- The first commercially available quantum computer is generally considered to be made by D-WAVE in 2017, building on a small 2-qubit quantum computer built circa 1997–1998.
The basic unit of a quantum computer is a qubit (quantum bit), the quantum analog of a classical-computer bit. A classical bit is in state 0 or 1, whereas a qubit can be in a linear combination, called a superposition, of 0 and 1. A classical computer with 2 bits has 4 possible states, but only one of them can be realized at any given time. A quantum computer with 2 qubits can hold a superposition of all 4 basis states simultaneously; in general, n qubits span 2^n basis states. There are multiple theories under test, in some cases offering radically different interpretations of reality based on quantum mechanics, e.g., https://www.scientificamerican.com/article/this-twist-on-schroedingers-cat-paradox-has-major-implications-for-quantum-theory/ .
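The two-qubit superposition described above can be illustrated with a small numerical sketch. The following Python/NumPy fragment (an illustrative simulation, not from the original text) represents the two-qubit state as a 4-component vector of amplitudes and applies a Hadamard gate to each qubit, producing an equal superposition of all four basis states:

```python
import numpy as np

# Single-qubit basis state |0> as a column vector.
ket0 = np.array([1.0, 0.0])

# The Hadamard gate puts one qubit into an equal superposition of |0> and |1>.
H = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2)

# Start in |00> (tensor product of two |0> states), a 4-dimensional vector.
state = np.kron(ket0, ket0)

# Apply H to each qubit: the two-qubit operator is the tensor product H (x) H.
superposed = np.kron(H, H) @ state

print(superposed)              # four equal amplitudes of 0.5
print(np.abs(superposed)**2)   # each basis state measured with probability 0.25
```

Measuring this state yields each of 00, 01, 10, 11 with probability 1/4, which is the sense in which the 4 states coexist before decoherence, discussed next.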
When such a quantum state is subject to decoherence, e.g., through a classical measurement process, it reduces to just one classical state. This decoherence is at the heart of why it is so difficult to build a quantum computer: any classical perturbation causes quantum states to collapse into classical states. Decoherence is still referred to by many people as the "collapse" of the wave-function; however, some of the issues reported here are not well explained by the term collapse.
1.1. Factorization of Large Numbers Using Shor's Algorithm
Most people who read about quantum computers associate the technology with new methods of computation that will be game changers for security. That is because many articles have been written on how this new technology could break widely used public-key encryption systems, such as RSA, whose security rests on the difficulty of factoring large numbers.