The technology of quantum computing extends classical information theory by combining it with quantum physics, thereby enhancing a computer science that has so far been confined to a binary architecture built on transistor systems. Quantum information theory thus opens up the gap between zero and one, embracing the scale of all possibilities in between.
This idea is rooted in quantum optics, dating back to the end of the 19th century, when Max Planck modelled “blackbody radiation” (Planck 1899). From the assumption that the electromagnetic modes in a cavity are quantised in energy, with the quantum of energy equal to Planck’s constant times the frequency, Planck derived a radiation formula in which the average energy per “mode” or “quantum” is the energy of the quantum times the probability that it will be occupied. In contrast to the classical Rayleigh-Jeans law, which diverges at high frequencies (the “ultraviolet catastrophe”), Planck’s formula correctly describes the observed spectrum of thermal radiation.
This quantisation assumption led Albert Einstein, during his “annus mirabilis” of 1905, to the hypothesis that light consists of particles, the so-called photons, and later, in 1935, to the paradoxical thought experiment published by Einstein, Podolsky and Rosen (EPR 1935). Werner Heisenberg’s uncertainty principle, illustrated by the double-slit experiment in which photons produce an interference pattern only when no measurement is made, marks the beginning of a controversy in which Einstein asked whether “the Moon is not there when nobody looks” and whether “God plays dice”. The ensuing debate between Niels Bohr and Einstein over how to interpret such experiments centred on what became known as the Copenhagen interpretation. The phenomenon at its heart, in which measuring one of two entangled photons instantaneously fixes the state of the other, was dismissed by Einstein as “spooky action at a distance”. This correlation between entangled objects is defined as quantum entanglement, or quantum non-local connection. It forms the core of the EPR paradox and is one reason why quantum theory is regarded as the most fundamental theory, explaining, among other things, the stability of matter.
The deep insight into quantum information theory came with John Bell’s analysis of the EPR publication, which built a link between quantum mechanics and information theory. Bell noticed the importance of correlations between separated quantum systems that have interacted (directly or indirectly) in the past but no longer influence one another. His argument showed that the degree of correlation that can be present in such systems exceeds anything that could be predicted by a law of physics describing particles in terms of classical variables rather than quantum states. Bell’s argument was later clarified by others, and experimental tests were carried out in the 1970s (Steane 1998).
Qubits and Superposition
One of the pioneers of quantum computing is Richard Feynman, a Nobel Prize winner in physics, who observed in the early 1980s that certain quantum mechanical effects cannot be simulated efficiently on a binary computer. While Moore’s law describes the exponential growth of processing power as transistors shrink, that shrinking may one day reach the scale of a single qubit, where quantum effects dominate.
The qubit, or quantum bit, is the unit of quantum information; in contrast to the classical bit, which is either one or zero, a qubit can be zero, one, or a superposition of both (Schumacher 1995). Schrödinger illustrated the superposition phenomenon, in response to the EPR paradox, with his thought experiment about a cat in a box that can exist in a superposition of being alive and dead (Schrödinger 1935).
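The superposition idea can be illustrated numerically. The sketch below (plain Python, no quantum hardware; the function names are illustrative) represents a qubit as a pair of amplitudes and applies the Born rule, under which measurement collapses the superposition to 0 or 1 with probabilities given by the squared amplitudes.

```python
import math
import random

# A qubit's state is a pair of amplitudes (a, b) with |a|^2 + |b|^2 = 1.
# |0> is (1, 0), |1> is (0, 1); an equal superposition has a = b = 1/sqrt(2).
def make_superposition():
    amp = 1 / math.sqrt(2)
    return (amp, amp)

def measure(state, rng):
    """Collapse the state: return 0 or 1 with Born-rule probabilities."""
    a, _b = state
    p0 = abs(a) ** 2
    return 0 if rng.random() < p0 else 1

rng = random.Random(0)
state = make_superposition()
counts = [0, 0]
for _ in range(10000):
    counts[measure(state, rng)] += 1
# Roughly half the measurements yield 0 and half yield 1.
```

Each individual measurement is random, but the statistics over many runs reveal the underlying amplitudes.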
From experiments to practice
Feynman’s observation led to the speculation that computation in general might be done more efficiently by exploiting these quantum effects (Bone and Castro 1997; Rieffel and Polak 2000). However, quantum effects had only been demonstrated in experiments on tiny objects requiring sophisticated equipment, and building a quantum computer proved far from easy. As Rieffel and Polak (2000, p.300) observe: “Computational machines that use such quantum effects, proved tricky, and as no one was sure how to use the quantum effects to speed up computation, the field developed slowly. It was not until 1994, when Peter Shor surprised the world by describing a polynomial time quantum algorithm for factoring integers that the field of quantum computing came into its own. This discovery prompted a flurry of activity among experimentalists, trying to build quantum computers and theoreticians trying to find other quantum algorithms”.
Spooky action, teleportation and entanglement in practice
Nevertheless, in 1985 the British quantum physicist David Deutsch wrote a paper on quantum Turing machines and the first quantum algorithm (Deutsch 1985). For this work Deutsch won a computation science prize in 2005 and has therefore been seen as the founder of quantum computing. Since 2004 the concept of quantum teleportation through entanglement has been demonstrated in experiments at the Universities of Vienna and Innsbruck in Austria and on the island of Tenerife, where a team of British and Austrian physicists transmitted entangled photons over 144 kilometres (Zeilinger 2004 & 2007).
With these experiments, scientists aim to lay the foundation for practical quantum computation tasks such as quantum information storage, quantum calculation and quantum teleportation. In the future this might even offer methods to decode the code of life, DNA, where entanglement is seen in bioinformatics as a possible explanation for what holds life’s blueprint together (Ananthaswamy 2010).
Quantum computers are expected to be built not as general-purpose devices but for specific applications. Likewise, quantum teleportation will not be realised as an artefact that teleports large objects to offer a new form of mobility. Nonetheless, in principle any kind of application is assumed to be possible on a quantum computer at some point in the future. The most promising application areas for the near future, however, are understood to “be purpose-built tools that exploit quantum rules to improve on existing technologies such as atomic clocks and photonic technology….” (Ball 2006).
On the other hand, it has been stated that in the long run quantum computing could have a revolutionary effect, especially in areas such as optimisation, code breaking, DNA and other forms of molecular modelling, large-database access, encryption, stress analysis for mechanical systems, pattern matching and data forecasting. One basic feature of quantum computers is that they can draw on absolute randomness, giving greater veracity to programs that require it. Randomness plays a significant part in applications that rely heavily on statistical approaches, in simulations, in code making, and in randomised algorithms for problem solving. In comparison to absolute randomness, measured for example from the localisation of a photon, the common randomness generated by a transistor-based processor is much weaker: it is pseudorandom and can be recalculated. The absolute randomness of a quantum computer cannot be recalculated and would surprise even its creator.
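The claim that classical randomness “can be re-calculated” is easy to demonstrate: a conventional pseudorandom generator is fully determined by its seed, so anyone who knows the seed reproduces the whole sequence. A minimal sketch:

```python
import random

# Pseudorandomness from a transistor-based computer: the entire
# sequence is determined by the seed and can be recomputed exactly.
rng1 = random.Random(42)
rng2 = random.Random(42)
seq1 = [rng1.randint(0, 1) for _ in range(20)]
seq2 = [rng2.randint(0, 1) for _ in range(20)]

assert seq1 == seq2  # identical: knowing the seed "re-calculates" the randomness
# Quantum randomness (e.g. from photon detection) has no seed to recover.
```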
As a scientific research tool the quantum computer could have revolutionary impact because of its ability to simulate other quantum systems (Ball 2006). For example, a quantum computer can simulate physical processes of quantum effects in real time. Molecular simulations of chemical interactions will allow scientists to learn more about how their products interact with each other, and with biological processes (e.g. how a drug may interact with a person’s metabolism or disease).
Until now, encryption algorithms have been based on keys generated from large prime numbers. These keys, split into a private and a public key-pair, are used to encrypt and later decrypt data. When two parties, say Alice and Bob, exchange sensitive messages, Alice combines her private key with Bob’s public key to derive a large shared string with which she enciphers the message. Bob then combines Alice’s public key with his own private key to regain the same string and decipher the message. This method is called public-key cryptography; its security relies on the difficulty of certain computations, such as factoring large numbers, and on the (relatively weak) randomness available to a conventional computer.
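The exchange described above, in which each party combines its own private key with the other’s public key to reach the same shared string, resembles Diffie-Hellman key agreement. The toy sketch below uses deliberately tiny, insecure parameters (p = 23, g = 5 are illustrative demo values, not a real configuration) to show the mechanism.

```python
# Toy Diffie-Hellman key agreement with deliberately tiny, insecure parameters.
p, g = 23, 5            # public prime modulus and generator

alice_private = 6       # secret values; in practice random and very large
bob_private = 15

alice_public = pow(g, alice_private, p)   # sent openly to Bob
bob_public = pow(g, bob_private, p)       # sent openly to Alice

# Each side combines its own private key with the other's public key.
alice_shared = pow(bob_public, alice_private, p)
bob_shared = pow(alice_public, bob_private, p)

assert alice_shared == bob_shared  # the same shared secret on both sides
# An eavesdropper sees only p, g and the public keys; recovering the secret
# requires solving the discrete logarithm problem.
```

With realistic key sizes the discrete logarithm is intractable for classical computers, which is exactly the assumption that quantum algorithms threaten.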
Quantum key distribution (QKD) offers the possibility of establishing encryption keys whose secrecy does not rest on computational difficulty: in entanglement-based QKD, Alice and Bob derive a shared key from measurements on entangled photon pairs, so the key is based on absolute quantum randomness, and any eavesdropping disturbs the quantum correlations and can be detected.
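The best-known QKD protocol, BB84, is a prepare-and-measure scheme closely related to the entanglement-based approach described here. The classical simulation below sketches only its basis-sifting step, under idealised assumptions (no eavesdropper, no channel noise; all names are illustrative): when Bob happens to measure in the same basis Alice used, he recovers her bit, and comparing bases publicly lets both keep exactly those positions.

```python
import random

# Classical sketch of BB84 basis sifting (ideal channel, no eavesdropper).
rng = random.Random(7)
n = 64

alice_bits = [rng.randint(0, 1) for _ in range(n)]
alice_bases = [rng.choice("+x") for _ in range(n)]   # '+' rectilinear, 'x' diagonal
bob_bases = [rng.choice("+x") for _ in range(n)]

# If Bob measures in Alice's basis he gets her bit; otherwise a random bit.
bob_bits = [
    bit if ab == bb else rng.randint(0, 1)
    for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)
]

# Sifting: publicly compare bases (never the bits) and keep matching positions.
key_alice = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
key_bob = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]

assert key_alice == key_bob   # identical sifted key on both sides
```

In a real implementation an eavesdropper measuring the photons would introduce detectable errors into the sifted key, which is what gives QKD its security.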
The computational power of quantum computing could therefore jeopardise public-key encryption, the scheme used today, for example, to secure bank transactions and to encrypt the keys on credit cards. As quantum cryptography has been demonstrated and tested in several experiments, this application could be introduced before the commonly used public-key schemes become insecure.
The task of factoring the large numbers behind such keys, practically impossible for a classical computer, is seen as an easy job for a quantum computer running Shor’s algorithm.
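Shor’s speed-up comes from reducing factoring to period finding. The sketch below shows only the classical half of that reduction on a tiny example: the period search itself (here brute-forced) is the step a quantum computer performs exponentially faster.

```python
from math import gcd

def period(a, N):
    """Brute-force the smallest r with a^r = 1 (mod N).
    This is the step Shor's quantum algorithm accelerates."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

# Tiny demo: factor N = 15 using the base a = 7.
N, a = 15, 7
r = period(a, N)                    # r = 4 here

# With an even period, gcd arithmetic extracts the factors.
f1 = gcd(pow(a, r // 2) - 1, N)     # gcd(48, 15) = 3
f2 = gcd(pow(a, r // 2) + 1, N)     # gcd(50, 15) = 5
```

Not every base yields a usable period, so the full algorithm retries with random bases; for suitable choices the gcd step delivers non-trivial factors as above.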
“In 1993 an international group of six scientists, including IBM Fellow Charles H. Bennett, confirmed the intuitions of the majority of science fiction writers by showing that perfect teleportation is indeed possible in principle, but only if the original is destroyed. In subsequent years, other scientists have demonstrated teleportation experimentally in a variety of systems, including single photons, coherent light fields, nuclear spins, and trapped ions. Teleportation promises to be quite useful as an information processing primitive, facilitating long range quantum communication (perhaps ultimately leading to a “quantum internet”), and making it much easier to build a working quantum computer. But science fiction fans will be disappointed to learn that no one expects to be able to teleport people or other macroscopic objects in the foreseeable future, for a variety of engineering reasons, even though it would not violate any fundamental law to do so”.
Experiments with entangled photons have shown that the correlation between entangled partners appears instantaneously, as if time played no role. Albert Einstein’s relativity theory says that as an object approaches the speed of light, time passes more slowly for it; the state of quantum entanglement seems, by analogy, to be a state in which time no longer passes at all. For a qubit, the present is the only existing moment, inheriting both future and past. With quantum teleportation experiments, physicists are showing that a quantum state can be transferred as if bypassing time.
Brain-Computer interfaces and Quantum robots
“The actual Brain-Computer Interface (Brain Gate) attempts to use brain signals to drive suitable actuators performing the actions corresponding to subject’s intention. However this goal is not fully reached, and when BCI works, it does only in particular situations. The reason of this unsatisfactory result is that intention cannot be conceived simply as a set of classical input-output relationships. It is therefore necessary to resort to quantum theory, allowing the occurrence of stable coherence phenomena, in turn underlying high-level mental processes such as intentions and strategies. More precisely, within the context of a dissipative Quantum Field Theory of brain operation it is possible to introduce generalised coherent states associated, within the framework of logic, to the assertions of a quantum language.” (Pessa & Zizzi 2009, p.1).
Quantum money and quantum memory
Around 1970 Stephen Wiesner came up with the concept of quantum money, an application in which some 20 photons would be trapped inside each banknote, serving as a unique serial number so that the bill could not be counterfeited. His idea, finally published in 1983, remains fiction: it works only in theory, because nobody has yet succeeded in building sufficiently robust photon traps.
But just lately scientists at the Australian National University have apparently managed to build a solid-state quantum memory. “The device allows the storage and recall of light more faithfully than is possible using a classical memory, for weak coherent states at the single-photon level through to bright states of up to 500 photons. For input coherent states containing on average 30 photons or fewer, the performance exceeded the no-cloning limit. This guaranteed that more information about the inputs was retrieved from the memory than was left behind or destroyed, a feature that will provide security in communications applications.”
Quantum features of consciousness
In 2000 Mensky suggested an approach called the Extended Everett Concept (EEC), claiming that it offers the shortest line of reasoning connecting quantum theory with consciousness. Mensky continues that: “consciousness, or rather complex consisting of explicit consciousness and super-consciousness (manifesting itself in the regime of unconscious), is a human’s ability providing the best possible orientation in the world. According to EEC, consciousness is not produced by brain, but is independent of it. The brain serves as an interface between conscious and the body” (Mensky 2009).
Mensky states that: “whole quantum world is a sort of quantum computer supporting the phenomenon of consciousness and super consciousness. Instead of being an origin of consciousness, real quantum computers can be used to construct models of quantum world demonstrating how the phenomena of life and consciousness may exist. Due to special features of human super consciousness, it cannot be replaced by the action of any technical device or even any material system. However, technical equipment may be used to make usage of super-consciousness more efficient.” (Mensky 2009).
Quantum Information Network
Looking at the history of the World Wide Web since its proposal by Tim Berners-Lee in 1989, the information pool has grown to such an extent that finding the right data with search engines has become a challenging task. Semantic web solutions have merely introduced a meta level between the data and the user, with descriptions and meta-descriptions generating an endless self-sustaining process: an Ouroboros circle, the snake that eats its own tail. Governments and authorities struggle to introduce barriers and to make information perishable, as many begin to understand that “the internet never forgets”. A quantum information network would introduce a new possibility of information handling, in which information would no longer have to be searched for, as it would appear at the moment it is demanded (Antener 2010).
Definition and Defining Features
To convey some of the basic ideas behind quantum computing and the differences between so-called traditional computing and quantum computing, a definition is presented below:
Quantum computer basics: “In the classical model of a computer, the most fundamental building block, the bit, can only exist in one of two distinct states, a 0 or a 1. In a quantum computer the rules are changed. Not only can a ’quantum bit’, usually referred to as a ’qubit’, exist in the classical 0 and 1 states, it can also be in a coherent superposition of both. When a qubit is in this state it can be thought of as existing in two universes, as a 0 in one universe and as a 1 in the other. An operation on such a qubit effectively acts on both values at the same time. The significant point being that by performing the single operation on the qubit, we have performed the operation on two different values. Likewise, a two-qubit system would perform the operation on 4 values, and a three-qubit system on eight. Increasing the number of qubits therefore exponentially increases the ’quantum parallelism’ we can obtain with the system. With the correct type of algorithm it is possible to use this parallelism to solve certain problems in a fraction of the time taken by a classical computer” (Bone and Castro 1997).
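The “quantum parallelism” described in the quotation can be mimicked on a classical simulator, at exponential cost: an n-qubit register is a vector of 2^n amplitudes, and a single gate application transforms all of them at once. A minimal sketch (the helper function is illustrative, not a standard API):

```python
import numpy as np

n = 3
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                                  # start in |000>

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard gate

def apply_single_qubit_gate(state, gate, target, n):
    """Apply a 1-qubit gate to the target qubit of an n-qubit state vector."""
    op = np.array([[1.0]])
    for q in range(n):
        op = np.kron(op, gate if q == target else np.eye(2))
    return op @ state

for q in range(n):                              # Hadamard on every qubit
    state = apply_single_qubit_gate(state, H, q, n)

# All 2^3 = 8 basis states now carry equal probability 1/8; a subsequent
# operation on this state acts on all eight values simultaneously.
probs = np.abs(state) ** 2
```

The classical simulation needs memory and time exponential in n, which is precisely why Feynman suspected that only a quantum machine can do this efficiently.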
“Imagine a macroscopic physical object breaking apart and multiple pieces flying off in different directions. The state of this system can be described completely by describing the state of each of its component pieces separately. A surprising and unintuitive aspect of the state space of an n-particle quantum system is that the state of the system cannot always be described in terms of the state of its component pieces. It is when examining systems of more than one qubit that one first gets a glimpse of where the computational power of quantum computers could come from.” (Rieffel and Polak 2000).
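The point of the quotation, that an entangled state cannot be described piece by piece, can be checked numerically. For two qubits, reshaping the four amplitudes into a 2x2 matrix and counting its non-zero singular values (the Schmidt rank) distinguishes product states (rank 1) from entangled ones (rank 2). The function name below is illustrative:

```python
import numpy as np

def schmidt_rank(two_qubit_state):
    """Schmidt rank of a 2-qubit state: 1 for product states, 2 if entangled."""
    matrix = np.asarray(two_qubit_state, dtype=complex).reshape(2, 2)
    singular_values = np.linalg.svd(matrix, compute_uv=False)
    return int(np.sum(singular_values > 1e-12))

# A product state: |0> on the first qubit, equal superposition on the second.
product = np.kron([1, 0], [1 / np.sqrt(2), 1 / np.sqrt(2)])

# A Bell state (|00> + |11>)/sqrt(2): maximally entangled.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
```

Applying `schmidt_rank` gives 1 for `product` but 2 for `bell`: no assignment of separate single-qubit states reproduces the Bell state, which is the extra resource quantum computers exploit.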
Quantum computation is widely expected to solve efficiently some of the most difficult problems in computational science, such as integer factorisation, discrete logarithms, and quantum simulation and modelling, which are intractable on any present or future conventional computer, and thereby to change dramatically the development and implementation of the information and communication systems of the future.
-Much faster computing for special purposes: quantum computing may offer new tools for different fields of science and research.
-Simulation of various phenomena, which equals understanding of those phenomena: quantum computing is expected to support new kinds of simulations (e.g. at nanoscale or macroscale) and may thereby also increase understanding of phenomena not yet understood.
-A totally new way of computing: a new paradigm in computing may require totally new skills.
-Revealing the mysteries of the world in general: quantum computing and quantum physics are expected to increase our overall understanding of how the world, or the human mind, is constructed.
Quantum computing has been actively researched for around 20 years, although many of its theoretical concepts and ideas are 100 years old. While elementary applications of quantum computing exist, it will take around 5-10 years for quantum computers to come into widespread use, even as some commercial actors in the field of semantic-system applications and search engines start claiming to offer such solutions.
Relation to other Technologies
Artificial intelligence and Robotics
The theories of quantum computation have implications for the world of artificial intelligence. The debate about whether a computer will ever be truly artificially intelligent has been going on for years and has largely remained philosophical. A strong claim holds that the human mind does things which are not, even in principle, possible to perform on a Turing machine. The theory of quantum computation, however, looks at the question of consciousness from a different perspective. Some developers of the theory state that every physical object, from a rock to the universe as a whole, can be regarded as a quantum computer, and that any detectable physical process can be considered a computation. Under these criteria it has been argued that the brain can be regarded as a computer and consciousness as a computation. Bone and Castro (1997) state that the “next stage of the argument is based in the Church-Turing principle”, whereby every computer is functionally equivalent and any given computer can simulate any other; it should therefore ultimately be possible to simulate conscious rational thought using a quantum computer. For some, quantum computing offers the solution to the problem of artificial intelligence, but many disagree, stating that an even more exotic and as yet unknown physics may be required to understand human consciousness (Bone & Castro 1997; Mensky 2009; Pessa & Zizzi 2009).
While quantum computing is a newly emerging technological field, it has the potential to change dramatically the way we think about computation, programming and complexity. A new computing paradigm may raise many critical issues, but quantum computing is still in an early, exciting research phase in which few such issues have been reported. On the other hand, quantum computing may be used as an enabling technology for the development of, for example, artificial intelligence or robotics, so the same critical issues may then be confronted at the application level (e.g. autonomy and rights).
Ananthaswamy, A. (2010). Quantum entanglement holds together life’s blueprint. New Scientist, 15 July 2010.
Zeilinger, A., Aspelmeyer, M. & Brukner, C. (2004). Entangled photons and quantum communication.
Ball, P. (2006) Champing at the bits. Nature, Vol 440, 23 March 2006
Bone, S and Castro, M. (1997) A Brief History of Quantum Computing. http://www.doc.ic.ac.uk/~nd/surprise_97/journal/vol4/spb3/
Bouwmeester, D., Mattle, K., Pan, J., Weinfurter, H., Zeilinger, A., Eibl, M., et al. (1998). Experimental quantum teleportation of arbitrary quantum states. Applied Physics B: Lasers and Optics, 67(6), 749-752. doi: 10.1007/s003400050575.
Deutsch, D. (1985). Quantum Theory, the Church-Turing Principle and the Universal Quantum Computer. Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences, 400(1818), 97-117. doi: 10.1098/rspa.1985.0070.
Einstein, A., Podolsky, B., & Rosen, N. (1935). Can Quantum-Mechanical Description of Physical Reality Be Considered Complete? Physical Review, 47(10), 777-780. doi: 10.1103/PhysRev.47.777.
Haug, F., Freyberger, M., Vogel, K., & Schleich, W. P. Quantum optics.
Mensky, M.B. (2009). Quantum features of consciousness, computers and brain. The 9th WSEAS International Conference on Applied Computer Science (ACS’09), Genova, Italy, October 17-19, 2009
Mukunda, N. (2008). Max Planck – Founder of quantum theory. Resonance, 13(2), 103-105. doi: 10.1007/s12045-008-0026-9.
Pessa, E. and Zizzi, P. (2009) Brain-Computer Interfaces and Quantum Robots. Physical and Cognitive Mutations in Humans and Machines”, Laval (France), 24-25 April 2009
Planck, M. (1899). Über irreversible Strahlungsvorgänge: fünfte Mitteilung (Schluss). Berlin: Reimer.
Poppe, A., Fedrizzi, A., Ursin, R., Böhm, H., Lörunser, T., Maurhardt, O., et al. (2004). Practical quantum key distribution with polarization entangled photons. Optics express, 12(16), 3865-71.
Quantum Information Science and Technology Experts Panel (2004). A Quantum Information Science and Technology Roadmap, Part 1: Quantum Computation.
Rieffel, E. and Polak, W. (2000). An Introduction to Quantum Computing for Non-Physicists. ACM Computing Surveys, Vol. 32, No. 3, September 2000, pp. 300–335.
Schumacher, B. (1995). Quantum coding. Physical Review A, 51(4), 2738 LP – 2747. American Physical Society.
Schrödinger, E. (1935). Die gegenwärtige Situation in der Quantenmechanik. Naturwissenschaften, 23, 807-812; 823-828; 844-849.
Steane, A. (1998). Quantum computing. Rep. Prog. Phys. 61 (1998) 117–173.
Zeilinger, A. (2006). Quantum Computation and Quantum Communication with Entangled Photons. doi: 10.1038/nphys629.
Institute for theoretical physics at Vienna University, Austria (http://quanten.at)
Max Planck Institute for physics, Munich, Germany (http://www.mpp.mpg.de/)
Theoretische Quantenphysik Technische Universität Darmstadt, Germany (http://www.iap.tu-darmstadt.de/tqp)
Physics of Information / Quantum Information Group at IBM Research Yorktown (http://www.research.ibm.com/physicsofinfo)
Oxford Center for Quantum Computation, UK. (http://www.qubit.org)
Oxford Foundations of Computer Science Research Group, UK (http://se10.comlab.ox.ac.uk:8080/FOCS/PhysicsandCS_en.html)
Oxford Mathematical Physics Group, UK (http://www.maths.ox.ac.uk/groups/mathematical-physics)
MIT Quantum Information Science, USA (http://qis.mit.edu)
Institute for Quantum Computation, Canada (http://www.iqc.ca/)
University of Leeds Quantum Information group, UK. (http://www.qi.leeds.ac.uk)
The Cambridge Center for Quantum Computation, UK (http://cam.qubit.org)