
Tuesday, November 13, 2018

What a Quantum Computer Is, and Why It Needs to Be More

It would be the harbinger of an entirely new medium of calculation, harnessing the inexplicable powers of subatomic particles to obliterate the barriers of time in solving incalculable problems. Your part in making it happen may simply be to convince yourself that black is white and up is down.


A quantum computer is -- or, perhaps more accurately phrased, would be -- a wholly different order of mechanism than anything the human species has ever constructed. Today, there are working machines that perform some small part of what a full quantum computer may eventually do. Depending upon whom you ask, these are either quantum computing prototypes or "prologues" -- stepping stones toward the real thing.

The goal of quantum computing research is to discover a means of expediting the execution of long waves of instructions. Such a means would exploit an observed phenomenon of quantum mechanics that, when you write it down on paper, doesn't appear to make sense.

If this goal is achieved -- if everything that physicists are certain works theoretically, ends up working in the real world -- then mathematical problems that require days' worth of calculation even on today's supercomputers, and some that are not solvable even now, may be solved instantaneously. Climate change models, estimates of the likelihood of Earth-type planets in the observable galaxy, models of the immune system's capability to destroy cancer cells -- the most difficult and challenging problems we face today may suddenly yield results within no longer than an hour after launching the program.

Granted, these results may not come in the form of a complete solution, but instead, a probability table pointing to the most likely solutions. But even such probabilities, up to now, have been unattainable even with the highest-performing supercomputers on the planet.

What quantum computing would accomplish

If you've ever programmed an Excel macro, you've personally experienced the following: You add input rows to the bottom of a worksheet whose columns serve as inputs for a long formula. Each time the formula recalculates, it takes longer and longer. If you're on a slow enough computer, you can witness this phenomenon for yourself: For a sufficiently complex calculation, the number of input rows grows linearly while the time the macro consumes grows exponentially.

If you've ever written a program for a supercomputer, you've witnessed exactly the same phenomenon. The scale may be different, but the effect is the same, and if you read through the supercomputer's logs, you can verify the observation personally. There comes a point at which every such algorithm, no matter how simple, becomes unworkable under the overwhelming weight of its input data.
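
To make that scaling concrete, here is a minimal Python sketch (an illustration of the general point, not anything drawn from a real supercomputer log): a brute-force subset-sum search examines every combination of its inputs, so each added input roughly doubles the running time.

```python
# Hypothetical illustration: brute-force subset-sum over n inputs
# examines 2**n combinations, so each added input roughly doubles the work.
import itertools
import time


def has_subset_with_sum(values, target):
    """Check every subset of `values` for one summing to `target`."""
    for r in range(len(values) + 1):
        for combo in itertools.combinations(values, r):
            if sum(combo) == target:
                return True
    return False


if __name__ == "__main__":
    for n in range(16, 23):                   # add one "input row" at a time
        values = list(range(1, n + 1))
        start = time.perf_counter()
        has_subset_with_sum(values, -1)       # impossible target forces a full scan
        elapsed = time.perf_counter() - start
        print(f"{n} inputs: {elapsed:.3f} s")
```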

This is the phenomenon that quantum computing would eliminate. A fully functional quantum computer would gain capability exponentially while its computing capacity scales only linearly. As a result, for certain classes of problems, a workload that is exponentially larger would consume only modestly more execution time, until eventually the time gap between exponentially different workloads becomes so small as to be nearly immeasurable.


Prof. John Preskill, Caltech

"What it means is that the difference between hard and easy problems," explained John Preskill, the Feynman Professor of Theoretical Physics at Caltech, during a 2017 speech, "the difference between problems we'll be able to solve someday with advanced technologies, and the problems that we'll never be able to solve because they're just too hard -- that boundary between 'hard' and 'easy' is different than it otherwise would be, because this is a quantum world, not a classical world."

THE QUANTUM TRADEOFFS

To be very clear: It would be inaccurate to say that a quantum computer runs programs faster than a PC or an x86 server. A "program" for a quantum computer is a very different order of beast than anything ever produced for a binary processor. Translating a mathematical problem intelligible to college professors into a binary program, and translating the same problem into a quantum program, are tasks as different from one another as "20 Questions" is from billiards.

There are several fundamental compromises when you move into the realm of quantum computing. Here's one that's daunting just by itself: Solutions will rarely be exact or definitive. A quantum computer is not a deterministic machine; in other words, there is no singular solution for which any other result would be an error. Instead, a quantum computer will tend to render sets of answers with their respective probabilities.
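
What that output looks like in practice can be sketched classically: given tallies from many repeated runs (the counts below are invented purely for illustration), the "answer" is a ranked probability table rather than a single value.

```python
# Hypothetical measurement counts from 1,000 runs of some quantum program.
# The machine does not hand back "the" answer; it hands back frequencies.
from collections import Counter

counts = Counter({"0110": 512, "1001": 448, "0000": 25, "1111": 15})
shots = sum(counts.values())

# Rank the observed outcomes from most likely to least likely.
probability_table = sorted(
    ((outcome, n / shots) for outcome, n in counts.items()),
    key=lambda item: item[1],
    reverse=True,
)

for outcome, probability in probability_table:
    print(f"{outcome}: {probability:.1%}")
```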

If that doesn't discourage you, get prepared for this: The atom-level device that actually performs the quantum calculations will, as a result of its work and as is its nature, self-destruct when it's done. A quantum computing mechanism would actually be a machine that automatically builds the computing device out of atoms (calcium atoms are good candidates), sustains the operating conditions of that device for the duration of its program, applies the program, allows it to execute, looks the other way (because quantum logic gates are shy and will explode if anyone sees them), interprets the final state of its registers as the final probability table of results, then resets itself to rebuild another mechanism all over again.

Imagine if the Bombe, the incredible machine Alan Turing helped design to crack the Nazi "Enigma" code, had been guaranteed to explode after every run. (Quantum computing engineers prefer the term "collapse," but let's call it what it is: Explode.) And imagine that Turing, an ingenious engineer, had devised an automated manufacturing operation that rebuilt the machine out of new parts, each and every day. Every quantum computer engineer has done more than imagine such a scheme: Each has drawn up a plan for such a device on the quantum scale, worked out first on paper and in simulation, in the spirit of the hypothetical "machines" Turing himself described before any were ever built. Quantum engineers believe their computers can and will work because those paper experiments give them cause for faith.

What a quantum computer may be good for

Are there real-world applications of quantum computing technology, or some derivative of it, that people are putting to good use right now? Put another way, what does quantum actually do, and whom does it directly serve?


  • Navigation: A GPS system cannot work everywhere on the planet, particularly underwater. A quantum computer requires its atoms to be supercooled and suspended in a state that renders them particularly sensitive. In an effort to capitalize on this, competing teams of scientists are racing to develop a kind of quantum accelerometer that could yield very precise movement data. One promising effort to that end comes from France's Laboratoire de Photonique Numérique et Nanosciences: a hybrid component that pairs a quantum accelerometer with a classical one, then uses a high-pass filter to subtract the classical data from the quantum data. The result, if realized, would be an extremely precise quantum compass that would eliminate the bias and scale factor drifts commonly associated with gyroscopic components.
  • Seismology: That same extreme sensitivity may also be exploited to detect the presence of oil and gas deposits, as well as potential seismic activity, in places where conventional sensors have to date been unable to explore. This according to QuantIC, the quantum imaging technology hub led by the University of Glasgow. In July 2017, working with commercial photonics tools provider M Squared, QuantIC demonstrated how a quantum gravimeter detects the presence of deeply hidden objects by measuring disturbances in the gravitational field. If such a device becomes not only practical but portable, the team believes it could become invaluable in an early warning system for predicting seismic events and tsunamis.
  • Pharmaceuticals: At the leading edge of research into tackling diseases such as Alzheimer's and multiple sclerosis, scientists have been utilizing software that models the behavior of artificial antibodies at the molecular level. Last year, neuroscience firm Biogen began partnering with IT consultancy Accenture and quantum computing research firm 1QBit to frame a new molecular simulation model in such a way that it can be executed on classical platforms, as well as present and future quantum platforms. One methodology developed by 1QBit's researchers involves translating traditional molecular diagrams into graphs full of dots, lines, and curves that, while seemingly more confusing on the surface, map more directly to a quantum model of vectors and relationships.
Now to the more controversial question: Assume someone built a mechanism that successfully leaps over the hurdles imposed by quantum physics, producing a full quantum computer capable of performing all the tasks currently relegated to the realm of theory and simulation. What do experts in this field think a quantum computer should be able to do, assuming every phenomenon that physicists have theorized, and that scientists have observed and verified, is ultimately exploitable?



Prof. Richard Feynman, Caltech, approx. 1983

Physics: This one should be obvious enough. It's actually the reason for the concept's very existence. During a 1981 speech at Caltech, Prof. Richard Feynman, the father of quantum electrodynamics (QED), suggested that the only way to build a successful simulation of the physical world at the quantum level would be with a machine that obeyed the laws of quantum mechanics. It was during that speech that Prof. Feynman explained, and the rest of the world came to realize, that it would not be enough for a computer to generate a probability table and, as it were, roll dice. Rather, it would take a mechanism that behaved along the same lines as the behavior it purported to simulate, to produce results that physicists themselves wouldn't dismiss as apocryphal.

Machine learning: If and when quantum computers ever become stable enough to support thousands of qubits, algorithms for machine learning are standing by, having been thoroughly tested on paper and in simulators. The basic theory among proponents is that quantum systems may be geared to "learn" patterns of states in huge, concurrent waves rather than successive, sequential scans. If you were awake for the preceding paragraph, you already know what the problem is here: Paper, like electronic computers, is a classical system. Conventional mathematics can circumscribe a set of probable quantum outcomes, as vectors in a wildly multidimensional configuration space. Yet it cannot -- as Richard Feynman made clear from the very beginning -- simulate how those outcomes may be attained. The first signs of general doubt among experts that quantum machine learning may even be possible were gently seeded into a report from MIT last October on a panel convened with IBM, where experts admitted that even after quantum computing becomes reality, several more years may pass before enough stable qubits exist to make quantum machine learning feasible.

Decryption: Here, at last, is the breakthrough that cast the first bright spotlight on quantum computing. What makes encryption codes so difficult even for modern classical computers to break is that they're based on the factors of extremely large numbers, which require inordinate amounts of time to isolate by "brute force." An operational quantum computer should isolate and identify such factors in mere moments, rendering the RSA system of encoding effectively obsolete. In 1994, MIT Professor Peter Shor devised a quantum algorithm for factoring values, which experimenters building low-qubit quantum systems have already tested successfully, albeit with rather small numbers. When large-qubit quantum computers are successfully built, few doubt the power of Shor's algorithm to knock down all current public key cryptography.
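
For a sense of why this works, here is a purely classical Python sketch of the number-theoretic reduction Shor's algorithm relies on, with the period-finding step brute-forced; that one step is the only part a quantum computer would accelerate. Factoring 15 this way is trivial, but the brute-force step is exactly what becomes hopeless for RSA-sized numbers.

```python
# Classical sketch of the reduction behind Shor's algorithm.
# Only find_period() would run on quantum hardware; here it is brute-forced.
from math import gcd
from random import randrange


def find_period(a, n):
    """Smallest r > 0 with a**r % n == 1 (the step a quantum computer speeds up)."""
    r, value = 1, a % n
    while value != 1:
        value = (value * a) % n
        r += 1
    return r


def shor_factor(n):
    while True:
        a = randrange(2, n)
        common = gcd(a, n)
        if common > 1:                       # lucky guess: a shares a factor with n
            return common, n // common
        r = find_period(a, n)
        if r % 2 == 0 and pow(a, r // 2, n) != n - 1:
            p = gcd(pow(a, r // 2) - 1, n)   # either gcd yields a nontrivial factor
            q = gcd(pow(a, r // 2) + 1, n)
            if 1 < p < n:
                return p, n // p
            if 1 < q < n:
                return q, n // q


if __name__ == "__main__":
    print(shor_factor(15))   # typically (3, 5) or (5, 3)
```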

Encryption: But herein, some say, lies an opportunity: A concept called quantum key distribution (QKD) holds out the theoretical hope that the kinds of public and private keys we use today to encrypt communications may be replaced with quantum keys that are subject to the effects of entanglement. Theoretically, any third party breaking the key and attempting to read the message would immediately destroy the message for everyone. Granted, that may be enough mischief right there. But the theory of QKD is based on a huge assumption which has yet to be tested in the real world: That values produced with entangled qubits are themselves entangled and subject to quantum effects wherever they go.
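
The entanglement-based distribution described above can't be reproduced by a classical program, but the bookkeeping behind a simpler prepare-and-measure cousin (a BB84-style protocol) can be, and it shows why an eavesdropper gives herself away. A toy simulation, not a security implementation:

```python
# Toy BB84-style sketch: mismatched measurement bases randomize results,
# so an eavesdropper who guesses bases wrong leaves detectable errors.
import random

N = 2000
alice_bits  = [random.randint(0, 1) for _ in range(N)]
alice_bases = [random.choice("+x") for _ in range(N)]
eve_bases   = [random.choice("+x") for _ in range(N)]
bob_bases   = [random.choice("+x") for _ in range(N)]

def measure(bit, prep_basis, meas_basis):
    """Same basis: the bit comes through intact. Different basis: a coin flip."""
    return bit if prep_basis == meas_basis else random.randint(0, 1)

# Eve intercepts, measures, and resends in her own basis.
eve_bits = [measure(b, ab, eb) for b, ab, eb in zip(alice_bits, alice_bases, eve_bases)]
bob_bits = [measure(b, eb, bb) for b, eb, bb in zip(eve_bits, eve_bases, bob_bases)]

# Alice and Bob keep only the positions where their bases matched...
kept = [i for i in range(N) if alice_bases[i] == bob_bases[i]]
# ...then compare a sample; without Eve the error rate would be ~0%.
errors = sum(alice_bits[i] != bob_bits[i] for i in kept)
print(f"error rate with eavesdropper: {errors / len(kept):.1%}")   # ~25%
```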

Who is in the race to build quantum computers?

Despite certain people's best efforts, the modern economy remains global. The laboratories, universities, and manufacturers with an interest in quantum computing have stakes all across the globe. So there is no genuine country-versus-country "arms race" to build the first complete quantum computer.

One private firm with real contracts, including with US government agencies, producing devices that perform one form of quantum computing, called quantum annealing, is D-Wave Systems Inc. Today, D-Wave produces a commercial system which it claims is capable of sustaining 2,048 qubits -- substantially more than other researchers have claimed thus far. While some continue to openly dispute this claim (specifically, they cast doubt on the "quantum-ness" of its results), it's worth noting that D-Wave's partners in the Quantum Artificial Intelligence Laboratory (QuAIL) are NASA and Google, and its partners in the Quantum Computation Center (QCC) are Lockheed Martin and the University of Southern California.

Microsoft participates in quantum research laboratories worldwide, including areas in which you wouldn't think Microsoft would have an interest, such as materials for quantum computing substrates. The company funds and actively supports quantum computing research through its Quantum Architectures and Computation (QuArC) group. To promote the concepts of quantum algorithms, in December 2017 Microsoft released a quantum simulator and development kit, complete with a domain-specific programming language called Q#, all of which are freely downloadable and may be integrated with Visual Studio or VS Code.



IBM's Think Q team working at its Watson Research Lab in New York. (Image: IBM)
IBM lays a valid claim to having built several functional quantum processing devices, though limited at present to a 20-qubit array at best. Like Microsoft, IBM offers an open source developers' kit, called Qiskit, and invites individuals to experiment with producing quantum algorithms using its 32-qubit simulator. Its plan for 2019 is to conduct experiments in constructing quantum computers at its Thomas J. Watson Laboratory [shown above], using experimental materials recently synthesized by researchers at Princeton University and the University of Wisconsin.


The socket end of an Intel prototype quantum computing chip. (Image: Intel)

Intel has been working to fabricate quantum computing devices like the 17-qubit prototype at left, using processes that would not be significantly different from fabricating conventional semiconductors. The catch is that Intel would seek to replace the conventional model of the qubit, which is superconductive and thus requires supercooling, with a more temperature-tolerant alternative it calls a spin qubit. Last June, at the company's D1D fabrication facility just outside of Portland, Oregon, it produced a test chip it claims is capable of sustaining qubits at a much milder temperature of about one kelvin (roughly -458 degrees Fahrenheit), far warmer than superconducting qubits require. Such a chip cannot yet, however, be considered a full quantum processor.

In April 2016, the European Union launched a project it calls the Quantum Technologies Flagship, with the aim of boosting quantum computing research and development throughout Europe. Last October, as part of this effort, the Flagship announced the start of some 20 related projects, including one called the Quantum Internet Alliance (QIA). Its goal is no less than the conceptualization of a fully entangled global network, theoretically enabling the teleportation of qubits between repeater stations.

What a quantum computer probably is

The word "computer" here has a very basic context -- not a handheld device or a cooled server with a processor and memory. Think of a computer the way Charles Babbage or John von Neumann considered it: As a mechanism guaranteed to deliver a certain output given a specific set of inputs and a defined configuration. At the deepest microscopic levels of a modern microprocessor, one logic unit is what these fellows would have called a computer.

BITS AND QUBITS

Every classical electronic computer exploits the natural behavior of electrons to produce results in accordance with Boolean logic (for any two specific input states, one certain output state). Here, the basic unit of transaction is the binary digit ("bit"), whose state is either 0 or 1. In a conventional semiconductor, these two states are represented by low and high voltage levels within transistors.

In a quantum computer, the structure is radically different. Its basic unit of registering state is the qubit, which at one level also stores a 0 or 1 state (actually 0 and/or 1, which I'll confuse you with in a moment). Instead of transistors, a quantum computer obtains its qubits by bombarding atoms with electrical fields at perpendicular angles to one another; the result is to line up the ions while keeping them conveniently and equivalently separated. When these ions are separated by just enough space, their orbiting electrons become the home addresses, if you will, for qubits.

SPIN, ONE WAY OR THE OTHER

While a conventional computer focuses on voltage, a quantum system is (passively) concerned with one aspect of electrons at the quantum level, called spin. Yes, this has to do with the electron's angular momentum. The reason we use the term "quantum" at the subatomic level of physics is the indivisibility of what we may observe, such as the amount of energy in a photon (a particle of light). Spin is one of these delightfully indivisible components, representing the angular momentum of an electron as it orbits the nucleus of an atom. The spin of an electron is always, as physicists calculate it, 1/2; the only difference here is polarity, which very simply may be either "up" or "down."

It's the "up" or "down" state of electron spin that corresponds to the "1" and "0" of the typical binary digit. Yet it's here where quantum computing makes a sharp turn into a logical black hole, through a tunnel of white noise, and jettisons us helplessly into a whimsically devious universe whose laws and principles seem concocted by the University of Toontown.

SUPERPOSITION AND WHY YOU CAN'T SEE IT

A qubit maintains the quantum state for one electron. When no one is looking at it, it can attain the "1" and "0" states simultaneously. If you look at it, you won't see this happen, and if it was happening before, it immediately stops. (This is literally true.) Yet the fact that the qubit's electron was spinning both directions at once is verifiable after the fact. Quantum mechanics calls this simultaneous state of both here and there superposition. It is impossible to witness an electron in a state of superposition, because witnessing requires the very exchange of photons that causes such a superposition to collapse.

As one Fordham University lecturer put it, "We don't understand this, but get used to it."
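
While no classical program can exhibit superposition, it can at least bookkeep the arithmetic. A minimal numpy sketch (an illustration of the math only, not of real hardware) of placing a single qubit into an equal superposition and then "looking" at it, which collapses it to a plain 0 or 1:

```python
# Bookkeeping the math of superposition on a classical machine (numpy only).
import numpy as np

ket0 = np.array([1, 0], dtype=complex)            # "spin up"  -> classical 0
hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

state = hadamard @ ket0                           # equal superposition of 0 and 1
probabilities = np.abs(state) ** 2                # [0.5, 0.5]

# "Looking" at the qubit: sample one outcome, and the superposition is gone.
rng = np.random.default_rng()
outcome = rng.choice([0, 1], p=probabilities)
print("amplitudes:", state, "measured:", outcome)
```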

There are multiple possible states of superposition. Here is why each additional qubit in a quantum system is more influential than the last: In a system with n qubits, the number of basis states that can be superposed is 2^n. If you remember the history of binary computers, when 16-bit processors were first replaced with 32-bit processors, suddenly a register's maximum unsigned value was no longer 65,535 but 4,294,967,295. In a quantum system, a 32-unit rack of atoms would likewise give its qubits 4,294,967,296 possible basis states to superpose at once.
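
The same doubling can be read as a storage bill: merely writing down the amplitudes of n qubits on a classical machine takes 2^n complex numbers. A small sketch, assuming 16 bytes per amplitude:

```python
# Describing n qubits classically takes 2**n complex amplitudes.
# At 16 bytes per amplitude, the bookkeeping alone outgrows any machine.
for n in (16, 32, 48, 64):
    amplitudes = 2 ** n
    gigabytes = amplitudes * 16 / 2 ** 30
    print(f"{n} qubits -> {amplitudes:,} basis states -> {gigabytes:,.0f} GiB")
```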

Why does this matter, if the final state only collapses to 0 or 1 anyway when someone or something takes the bold step of just looking at the thing? Because before that collapse takes place, each of these states is a valid, possible value. (This is why you don't hear a lot about quantum computers needing much memory.) During that strange, black-box period when it can work unobserved and undisturbed, a quantum processor is capable of performing real algorithmic functions on units that are much less like binary digits than they are like the wheels in one of Charles Babbage's difference engines -- except with billions of settings rather than just 10.


A Bloch sphere representing the state of a qubit. (Image licensed under Creative Commons)

Instead of giant wheels, quantum engineers have chosen a better way of representing qubits' spin states. More specifically, they borrowed it from Felix Bloch, a Swiss-born physicist who emigrated to the US and shared the 1952 Nobel Prize in Physics for discovering the principle of nuclear magnetic resonance. If you can imagine a billiard ball with one dot, and an imaginary line from the core of the ball through the center of the dot and outward as a vector, then you can picture a Bloch sphere like the one shown at right. Each superposition state a qubit may take can be represented by a vector in a Bloch sphere, described by two angles: the angle the vector makes with the sphere's z-axis, and the angle of its rotation around that axis. Using ordinary geometry, the state may be expressed as a weighted combination of the sphere's two poles, with the cosine of half the first angle weighting the "0" pole and the sine of half that angle weighting the "1" pole.
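
In the usual convention (an assumption here, since the article doesn't spell the formula out), a Bloch vector at angle theta from the z-axis and angle phi around it corresponds to the state cos(theta/2)|0> + e^(i*phi) sin(theta/2)|1>, where |0> and |1> are the "0" and "1" poles. A short numpy sketch of that conversion:

```python
# Convert Bloch-sphere angles to a qubit state vector (standard convention assumed).
import numpy as np

def bloch_to_state(theta, phi):
    """theta: angle from the z-axis; phi: angle of rotation around it."""
    return np.array([np.cos(theta / 2),
                     np.exp(1j * phi) * np.sin(theta / 2)])

north_pole = bloch_to_state(0, 0)            # |0>, i.e. [1, 0]
equator    = bloch_to_state(np.pi / 2, 0)    # equal superposition, ~[0.707, 0.707]
print(north_pole, equator)
```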

WHAT THE FIRST QUANTUM PROGRAMS WILL LOOK LIKE

The trick in writing a quantum algorithm is to imagine that you could actually see, or measure, qubits in their superposition states, so that you can instruct them as to what happens next and cause adjustments to those states. In reality, the very act of attempting to witness superposition results in decoherence -- the reversion of qubits to their classical 0 or 1 states. Decoherence always happens to a quantum system eventually, often after only a few minutes or, if you're lucky, not until just under an hour.

The whole point of a quantum program is to take full advantage of the ability to manipulate which way all these billiard balls are pointing while no one is looking, prior to their decoherence. There are two types of quantum programs, and they function very differently from one another:


  • A program using quantum gates follows Richard Feynman's original suggestion: that there are other forms of logic within the quantum space. With a binary computer, an AND or an OR gate would take two discrete voltage inputs as bits and yield one certain output. With gates in a quantum circuit -- the quantum counterpart of a classical electrical circuit -- several qubits may be used as inputs, and the result may be some form of superposition state, which the Bloch sphere representation breaks down into mathematical values -- including, quite likely, complex numbers. (A minimal simulated example of this gate style appears after this list.)
  • A quantum annealing system, such as the kind D-Wave currently produces, takes a very different route. Instead of establishing a quantum circuit, an annealer translates formulas (called "Hamiltonians") that describe the physical state of the quantum system into actual physical states. While any quantum computer may use one Hamiltonian to describe the initial state, an annealer uses successive Hamiltonians to represent minute changes in the desired state of the system, in very incremental steps along the way to the final desired state. Each step knocks the qubits around, in such a way that their state at the final step represents the set of probabilities that form the final solution. (One researcher likened this to shaking marbles around in an egg crate, with each shake perfectly programmed.) Skeptics of this process are wont to point out that this is not the system Feynman first proposed, and thus either directly assert that an annealing system is not a true quantum computer, or indirectly suggest that no real quantum computer presently exists. It's fair to assume such skeptics do not presently have contracts with NASA, Google, or Lockheed Martin.
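
As promised above, here is a minimal simulated taste of the gate-model style from the first bullet: a Hadamard gate followed by a controlled-NOT, applied to two qubits with plain numpy, which leaves them entangled in a Bell state. This is a sketch of the math only; real hardware would apply these gates to trapped ions or superconducting circuits.

```python
# Gate-model sketch: build a two-qubit entangled (Bell) state with numpy.
import numpy as np

ket00 = np.zeros(4, dtype=complex)
ket00[0] = 1                                   # start in |00>

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard on one qubit
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

state = CNOT @ np.kron(H, I) @ ket00           # H on the first qubit, then CNOT
print(np.round(state, 3))                      # ~0.707 on |00> and |11>: entangled

# Measurement probabilities for the four outcomes 00, 01, 10, 11:
print(np.round(np.abs(state) ** 2, 3))         # [0.5, 0, 0, 0.5]
```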


The real prospects for a quantum ecosystem

If every "revolutionary" technology were guaranteed financial or market success, you'd be holding in your hand today a voice-controlled edge processor with 3D transistors powered by a dime-sized superconductor with a half-century lifespan, rather than whatever it is you're using now.

Quantum computing will not truly succeed, even when it does completely exist, unless there's a viable business model for it. It's often presented with enough bombast that you might think customers would form queues around city blocks waiting for it to arrive. But because a full quantum computer is not, nor ever will be, portable (unlike a quantum compass, where detecting disturbances is the actual goal), the only way to make it commercial is by offering it as a service, similar to how laboratories and universities offer supercomputer services today.

It would not be a "quantum cloud." Cloud computing implies some kind of tenancy: a leasing of virtual computing capacity or, in the case of so-called serverless technology, the use of a solution. There is no division of tenancy in the quantum space; the machine sets up the Hamiltonian, runs the algorithm or the annealing pattern, lets the system blow up, and renders the results as likelihoods. Time is not a factor; a harder problem may not take measurably longer than a simpler one -- so leasing on a per-minute basis is pointless.

Which leaves the per-solution option, similar to serverless. But since solutions are probabilities rather than certainties, and subject to variations, inevitably customers will question the value of the solutions they're getting. If they have to pay by the solution, they're not looking for a deal like "Bertie Bott's Every Flavor Beans" where one flavor may be blueberry and the next earwax. At some point, quality of service will inevitably enter the discussion.

Quantum computing's early adopters would probably include all those folks looking to demolish RSA-based cryptography at the first opportunity. But for the makers of quantum computers to start profiting from them, they'll want a more stable customer base than just wannabe hackers or former superpower countries with unresolved grief issues. They'll need to foster communities of scientific and educational developers willing to learn the rules and practices of a completely new universe, so they can contribute solutions to the previously unfathomable problems facing our own world.




