Tuesday, February 9, 2021

From cosmology to quantum computers

Over the past few decades, the concepts of quantum mechanics have been sprinkled into our common lingo and news headlines in a gradual crescendo. Concepts that were mere theories at the beginning of the last century are now demonstrable facts. However uneasily we sit with their implications, we carry their applied technologies around in our pockets every day. These concepts have been fascinating for me to study. Though they may seem a bit esoteric to most people, they are becoming ever more personally relevant to our common experiences.

Some of these concepts are astounding, yet increasingly ordinary in our daily lives:

  • We control the quantum jumps of electrons to emit photons of the exact spectrum we want to light our living rooms with. Vice versa, we use photons of specific spectra to control where we want electrons to go and how we want them to behave. (Electroluminescent and photoelectric effects, common in our technology as the light-emitting diode and the photodiode; a related trick senses the disturbance a finger causes in a screen's electric field and sends signals to our computer chips.)
  • We've mastered materials science to force clouds of atoms into super-chilled, collective wave states that allow us to research un-Earthly states of matter that exist in few places in our universe at its current temperature. (Bose-Einstein condensates, which have bearing on superconductivity and may find their way into our technology decades from now.)
  • We have been able to entangle the wave states of paired quantum particles, beam one of them across a distance in a vacuum, and then read the paired state information far from where the pair was split. (Quantum teleportation, which may someday lead to entangled-particle communication many decades from now.)
  • We've built computers that can write to and read from single quantum particles held in a superposed wave state of two spins. (Quantum computing, just entering into application and soon to be used at a greater scale.)

It's the last item I'd like to address in this blog post. In late 2019, a quantum computer demonstrated "supremacy" over classical computer circuits in calculation speed. While our ability to harness and control quantum particles is fantastic news for the advancement of our technology, it's a bit of bad news for our soon-to-be-legacy computer encryption approaches. Specifically, it has implications for the continuity of the software and internet industries as we've come to depend on them. So it's worthy of attention. Over the next two years you may see or hear a lot more about it.

It takes a bit of time to explain why one would need to be "quantum-safe" in the context of these advances. The US government standards body, NIST, is currently running an open call for proposals to determine the new encryption standards for this "post-quantum" era, the way we speak of post-modern art. The goal of our government and the contributing engineering teams is to prepare our future software industry for the proliferation of quantum computers and the greater risk of decryption of the secure data we use day to day. If you had awoken thinking we were just starting the quantum era now, you may wonder what the nature of this era we're moving beyond is, and what these computers are that use quantum effects in their operation. As is usual in my blog, I'd like to start the story with a digression about why I'm fascinated with this topic. If you haven't followed the emergence of this field, it may be interesting.

I remember as a teenager listening to a satirical BBC radio show called The Hitchhiker’s Guide to the Galaxy. In this show, Douglas Adams postulated a spaceship that could leap through spacetime to different locations simply by calculating the precise "improbability" of the spaceship being at any specific location. This concept was inspired by the idea of the “quantum wave function” from quantum field theory. The wave function is a probabilistic formula for modeling wave-particle trajectories, applied to interpreting the debris coming out of atom smashers. (See Feynman diagrams for more on this.) We know that we can’t pin down the precise location of a particle, due to the Heisenberg uncertainty principle, so the wave function is an approximating model that uses statistics to describe the locations where a particle is most likely to be found. In his narrative, Douglas Adams reversed the concept of the quantum wave function such that a probabilistic calculation caused the effect of making the spaceship appear in random places in the universe. (A description of his hypothetical “improbability drive” is here.) The idea that particles could hop through spacetime outside of a linear trajectory was my first introduction to what people call the “weirdness” of quantum mechanics. These hops are referred to as quantum jumps, or in some specific cases quantum tunneling.

(Side note: I tend to say “particle-waves” because the term particle is loaded as a concept implying that matter is tangible. Our current approach to describing sub-atomic components is to acknowledge that they behave as both particles and waves depending on how they are measured. We could say, “waves that were formerly known as particles,” because we've learned over the last 100 years that our world is non-tangible in the particulate sense. What we generally see behave as object-like particles around us are actually energetic waves that appear as point particles when disturbed or obliterated. That’s the most peculiar thing about quantum mechanics, and it has fascinated me: what we think of as tangible is actually an emergent property of various energetic fields that react strongly against each other in spacetime. As atoms are mostly empty beyond these energetic wave interactions, it is an odd conception to think that everything around us that we interact with in tangible ways is mostly empty vacuum in a field of tightly-bound, highly-energetic ripples of force. Another term that took getting used to is the idea of fused time+space, which we now just call spacetime because Einstein's theories of relativity demonstrated the inter-connectedness of the dimensions of space with time.)

When I was studying physics in high school, I remember the day my professor was discussing the Bohr model of atomic structure. After lab was done, I went to ask Professor Rolfe what the other diagrams were on the back pages of the charts as I paged through them. They were peculiar-looking globular shapes that depicted the electron configurations of various elements in 3D. These maps showed the probabilistic locations of electrons in the outer shells of atoms, where their negative charges bulged away from the positively-charged nucleus. Bohr diagrams depict the atom as a two-dimensional disk, like the model of planets in our solar system, which makes it very easy to envision their chemical combinations with other atoms. Yet in experimental demonstrations we find that atoms are like bumpy balls of positive and negative charge with multiple poles. Professor Rolfe explained, "We refer to electron positions as 'orbitals', but they don't actually orbit the nucleus. Rather they buzz around in a field of space at specific areas on the outermost edges of the nucleus's positive charge field." "Wow! Can we study that next class?" I asked. "No. We have to focus on the curriculum, which is specific to chemical bonds. You'll get to quantum mechanics when you get to college," he explained.

One thing parents and teachers learn about kids is that the best way to challenge and inspire them is to tell them they can't do something! Professor Rolfe saw the twinkle in my eye and knew he was sending me off to the races. I talked with my father about it. He in turn suggested I delve into General Relativity first, and started me on the path toward my fascination with cosmology with a book called "Relativity Visualized." Einstein had been an early contributor to quantum mechanics. But to understand why Einstein was provoked into his study of particle entanglement, it was important to understand the ideas behind his general and special theories of relativity. From there, I branched out to read more about quantum field theory and the new concepts of how relativity and quantum mechanics should eventually dovetail in cosmology as part of a unified theory of the four fundamental forces of nature: the strong nuclear force (which binds atomic nuclei together), the weak nuclear force (which causes atomic radiation), the electromagnetic force (governing electron and photon behaviors), and gravity (which is currently best described by the geometry of spacetime and has evaded an elegant tie into quantum field theory).

Relativity was particularly strange for me to grasp: the equivalence of mass and energy, the inter-connectedness of space and time, the limit of light speed, and the warping of the spacetime continuum by the presence of mass/energy. While it didn't sit well with my Newtonian-focused understanding of space and causality, I could grasp the ideas of relativity's predictions, which have been confirmed by astronomical observation again and again since. If you've watched Nova or any other science shows about relativity you may have seen demonstrations of the spacetime continuum as a 2D surface that creates indentations where stars are present. That's indeed one great way to visualize gravitational distortions of spacetime. But when you consider the idea of time dilation for objects moving at near light speed, a different aspect of relativity from the mere presence of matter, you get a slightly more peculiar view of the implications of the nature of our spacetime. You get a sense of our universe being made of a kind of taffy-like consistency.

The large-scale spatial-temporal view of our cosmos is best grasped by thinking of the behaviors of gravity and light acting against the background tapestry of spacetime. Yet light waves, when studied at the microscopic scale, behave much like the particles that we accept we're bodily made of. One way to bridge our thinking from the cosmic scale down to the scale of our substantive selves is to focus on the similarities between electrons and photons. These two wave-particles, the dual components of the photoelectric effect, are tightly bound to each other and yet seem so vastly different in their natures and observable behaviors. I call them dual components because photons (light waves) are emitted when an electron hops from a high-energy orbital down to a lower resting state closer to the protons in the atomic nucleus. Conversely, if an atom is struck by a photon and absorbs its energy, an electron hops up to a higher orbital.
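To put a number on that hop, here's a quick back-of-the-envelope calculation of my own (a sketch assuming the textbook Bohr/Rydberg energy levels for hydrogen, E_n = -13.6 eV / n², not anything from the sources above) for the photon released when hydrogen's electron drops from the third orbital to the second:

    # Wavelength of the photon emitted by a hydrogen electron hopping n=3 -> n=2,
    # using the textbook Bohr/Rydberg levels E_n = -13.6 eV / n^2 (my own sketch).
    RYDBERG_EV = 13.6        # hydrogen ground-state binding energy, in eV
    HC_EV_NM = 1240.0        # Planck's constant times the speed of light, in eV*nm

    def transition_wavelength_nm(n_high: int, n_low: int) -> float:
        """Photon wavelength for an electron dropping from n_high to n_low."""
        delta_e = RYDBERG_EV * (1.0 / n_low**2 - 1.0 / n_high**2)  # energy released, eV
        return HC_EV_NM / delta_e                                   # wavelength = hc / E

    print(round(transition_wavelength_nm(3, 2)))  # ~656 nm: the familiar red H-alpha line

That single red line in hydrogen's spectrum is the same trick our LEDs exploit, just engineered to whatever colors we choose.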

Photons have the distinction that they can move faster than all other known quantum wave-particles such as electrons and quarks. But the head-scratching really starts once you begin to delve into the specific behaviors of the non-photon quanta. Why should the photon be able to travel at the fastest allowable speed known in our universe while the other quanta cannot? The photon is not a charged particle like the electron or quark, and, crucially, it carries no rest mass. That's the real clue: massless quanta travel at the universal speed limit, while anything with mass cannot. Why the deeper rules of our universe are arranged this way continues to riddle the theorists who've been investigating explanations such as (currently theoretical) extra dimensions that we can't perceive directly, which bridge our thinking to a hidden mechanics of our cosmos. But I shouldn't get too far down the rabbit hole here. There are great publications to follow on quantum-gravity theories, string theory, black holes and the holographic principle that I encourage interested folk to look up. (Greene, Hawking, Susskind and Scharf are good authors for the enthusiastic.) Though I find the cosmology topics fascinating, I'd like to put those concepts aside for the moment and drill down into the aspects of quantum field theory that specifically apply to our emerging technology, brought about by harnessing these tiny waves in our machinery for practical purposes. So leaving the "spacey" aspects of this topic there, I'll move on now to the specifics of where we are applying these new capabilities to our machines in the imminent future.

Progress harnessing the weirdness of quantum wave/particles in computers

IQM Quantum Computer
Digital technology for the mass consumer market of the last century has been dependent on our exquisite mastery of photo-electric effects in conductive metals, vacuum tubes and silicon wafers. Moore’s law has predicted continual leaps in the processing power of binary computer chips over the past decades, based on our ability to pack logic gateways densely together on a chip. But the advancement we are making with quantum computers rests on an entirely different nature of the logic gateway itself. It’s not the mere smallness of our computer circuits that will produce the next leap in computing power; rather, it’s because we’re jumping beyond the concept of a binary circuit completely. Put most simply, instead of creating a logic gateway of 0 or 1, we can now create a logic gateway that consists of 0, 1 and 1⁄2. (Not literally 1⁄2, but a superposition representing both 0 and 1, which I'll get to later.) That’s what all the buzz is about. This may seem like a small incremental jump for a single circuit. But if every logic gateway of a computer were a three-state rather than a two-state circuit, the advancement of an array of such circuits in terms of computing power would be absolutely tremendous. The quantum computer is just that: an array of quantum bits that can coordinate calculations incredibly rapidly.
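To make that 1⁄2 a bit more concrete, here's a minimal sketch of a single quantum bit in plain Python. This is my own toy illustration of the math, not anything resembling real hardware: the qubit is just two amplitudes, one for 0 and one for 1, and reading it collapses it to a definite answer with odds set by those amplitudes.

    # A toy, single-qubit illustration of "0, 1, and both at once" (my own sketch).
    import math, random

    # A qubit is described by two amplitudes: alpha for the 0 state, beta for the
    # 1 state, with alpha^2 + beta^2 = 1 (keeping everything real for simplicity).
    alpha, beta = 1.0, 0.0   # start as a definite 0

    # A Hadamard-style operation blends the qubit into an equal mix of 0 and 1.
    alpha, beta = (alpha + beta) / math.sqrt(2), (alpha - beta) / math.sqrt(2)

    def measure() -> int:
        """Reading the qubit collapses it: 0 or 1, with squared-amplitude odds."""
        return 0 if random.random() < alpha**2 else 1

    print([measure() for _ in range(10)])  # roughly half 0s and half 1s

Until you measure it, the qubit genuinely carries both possibilities at once, and that in-between state is the extra resource all the buzz is about.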

In the United States, we had a moment of transition when the FCC and all TV manufacturers switched from analog to digital broadcast transmissions. All old TVs had to use a new digital converter to receive the new signals. Nothing like that is going to happen in this case. We’ll certainly keep using binary computers. They’re so useful, and we’re adept at producing them at low resource cost now. The new computer circuits we’re discussing don’t obviate the need for earlier technology; they’re just useful for entirely different purposes. The emergence of the calculator (Professor Rolfe used to call them "calcu-sooners") didn’t mean the abacus and slide rule were no longer useful. Calculators are preferred because they perform calculations faster. The same will be true of quantum computers as they become more common. Anyone with a really powerful calcu-sooner will beat anyone who tries to conduct the same calculation with a contemporary binary calculator. For instance, the claimed speed of the computer cited in the quantum supremacy demonstration above is such that a calculation which would take a traditional binary supercomputer 10,000 years to complete could be finished in 200 seconds on a quantum computer.
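Worked out as plain arithmetic, that claim amounts to a staggering speed-up (my own check of the quoted figures, not an independent benchmark):

    # The quoted supremacy figures, reduced to a single speed-up factor (my own check).
    classical_seconds = 10_000 * 365.25 * 24 * 3600   # 10,000 years, in seconds
    quantum_seconds = 200
    print(f"{classical_seconds / quantum_seconds:,.0f}x")   # roughly 1.6 billion times faster

Whether that particular benchmark task was a fair fight is still debated, but the scale of the gap is the point.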

"How are they creating these cool calcu-sooners?" you may ask. For this we have to go back into the somewhat spacey stuff again. This time turning from the telescope to the microscope, we adjust the focus of our lens to peer at the nature of space and energy within atoms. It may sound strange to say we need to look at space here. But remember Einstein's direct correlation between energy's presence in spacetime warping it's fabric. Energy can't exist in space without altering it. Theories emerging since relativity have explored and tested this concept of the bindings of wave-particle motion to the underlying spacetime. It is at this sub-atomic scale that we see the peculiar behaviors of matter that yield the opportunity of manipulating the quantum bits in a near-frozen state. 

At the smallest scale, during the “physical” interactions we cause in atom smashers, we observe peculiar states of fermions and force-carrying particles called bosons springing out of ordinary atoms. At some energy densities, we can observe exotic wave-particles popping into existence and nearly immediately disappearing, as if they emerged from a substratum underneath our visible cosmos that is filled with something else entirely, summoned only by the energy density of these small but intense particle collisions. These exotic states of matter can only have a fleeting existence at our world's temperatures and densities. (Our universe had much higher temperature and density 13.8 billion years ago, so these high-energy interactions give us a way to glimpse wave-particles in spacetime from a different perspective, as they more commonly behaved long ago.)

Going in the opposite temperature direction, when we chill particles in an area of vacuum as cold as is humanly possible, we see other exotic behaviors, such as quantum particles existing in superpositions of overlapping opposite states of spin in a single moment. We also observe particles harmonizing into ice-like "condensate" states, where a bunch of disparate particles behave as a uniform superfluid. Why should matter be able to harmonize and be in superpositions of two states at once? The cause of this phenomenon is the subject of fascinating exploration and debate; we don’t precisely know yet. But suffice it to say that we can now super-chill atoms to the point that they start behaving in this exotic way, in a manner that is useful in a machine. We can write and read using their hovering in this superposed state between two absolute values. As long as we keep them at this low temperature, we can harness their fluctuating state as a computer logic gateway. This is where the concept of spin-up/spin-down/both-at-once comes from in the 1/2 value mentioned above. By leveraging this "shimmer" between the spin states, we get the third state we need for the logic gateway of a quantum computer.

Were the temperature even slightly higher, altering and measuring the spin of a quantum bit would not be possible. The more bits we add into the array, the more chaotically the machine behaves, and the less useful the system becomes as a computer. Achieving a reasonably robust array of quantum bits is incredibly challenging because of the temperatures needed to keep the array stable. The Sycamore quantum computer used in the demonstration above had only 54 bits. A lot of work goes into making these sensitive machines, so we can infer that they are not going to proliferate particularly rapidly in the commercial world, simply due to economic constraints. Yet in a few years there will likely be thousands of them.
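Even so, 54 quantum bits is already beyond what a classical machine can comfortably imitate. A rough way to see why (my own back-of-the-envelope, assuming 16 bytes to store each complex amplitude):

    # Memory needed to hold the full state of a 54-qubit machine on a classical
    # computer: 2**54 complex amplitudes (my own back-of-the-envelope estimate).
    n_qubits = 54
    bytes_needed = (2 ** n_qubits) * 16              # two 64-bit floats per amplitude
    print(f"{bytes_needed / 1e15:,.0f} petabytes")   # ~288 petabytes

No single supercomputer holds that much memory, which is why such a small, fragile array can still leave classical simulation behind.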

I'll leave it to the scholars and inventors of our next generation to talk about the wonderful things we'll be able to do with this advanced computing power. It is good news that humans are now on the brink of this great new capability. My near-term focus, like that of many of the software industry's security engineers in the coming years, will be to ensure our legacy computer industry can insulate our legacy networks and software from what quantum computers can do to them. The way we build secure networks today is by leveraging mathematically hard problems to scramble the data transmitted over the internet. By sprinkling in "RSA"-level encryption keys (our legacy standard), you can feel reasonably secure in using your home internet connection to log into your bank account with the assurance that no mid-stream interference will make your account vulnerable. So long as the key you use can't be factored by a computer during the time that your login credential is in use, you're safe. That's where the concern arises for the hypothesized concept of a decryption attempt leveraging a quantum computer. Factoring today's keys would take an ordinary computer an absurdly long time. But if that factoring capability were greatly enhanced, even the currently most-secure keys could be vulnerable.
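Here's a toy sketch of why factoring is the whole ballgame for RSA-style keys. The numbers are comically small (my own illustration; real keys use primes hundreds of digits long), but the structure is the same:

    # Toy RSA with tiny primes (my own illustration; real keys are astronomically larger).
    p, q = 61, 53                    # the two secret primes
    n = p * q                        # 3233: the public modulus anyone can see
    e = 17                           # the public exponent
    phi = (p - 1) * (q - 1)          # only computable if you know p and q
    d = pow(e, -1, phi)              # the private exponent, derived from the factors

    message = 42
    ciphertext = pow(message, e, n)  # anyone can encrypt with the public (n, e)
    print(pow(ciphertext, d, n))     # 42 again: only the key holder can decrypt

    # An eavesdropper who can factor n back into 61 * 53 can rebuild d the same
    # way. A machine that factors enormous n quickly would undo the whole scheme.

With two-digit primes, any laptop factors n instantly; with the moduli used in practice, hundreds of digits long, no classical computer can, and that asymmetry is exactly what a sufficiently large quantum computer threatens.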

Most of us don't need to be too concerned about this. Only the companies we rely on to secure our communications and web services do. That's why our industry is transitioning to post-quantum cryptography over the next few years. I hope by now this all makes sense and you can understand why the US Department of Commerce is doing this. Next time you read of post-quantum (insert-noun-here) in the news, you'll be generally up to speed on what they are talking about. It is simply trying to save our past and present from vulnerabilities that could emerge in the future. Therefore it's referred to as "future-proofing" so that we can go on using our legacy know-how and open web protocols safely in the future. 

We have seen ordinary citizens regularly targeted by hackers no matter how obscure they may think they are. Still, no particular individual is likely to be the target of decryption attacks. If such attacks happen with a quantum computer, they will likely be focused on networks of information that are deemed valuable. The cost and complexity of quantum computers suggest that, in general, they will be pointed only at very interesting opportunities and problems. But some assert that state secrets and financial institutions are interesting enough that they will likely be the first vectors of exposure for the decryption experiments of state-sponsored actors who would gain from such access or information. Internet and software companies don't sit around thinking their customers are too obscure to be interesting targets, so many of the companies we rely on daily will be implementing these new encryption standards when they are finalized. At some point you'll be asked to upgrade your software with a new set of tools and protocols that provide this new standard of security.

Entangled-particle cryptography

The post-quantum encryption that the NIST team at the US Department of Commerce is researching doesn't involve quantum field theory specifically; it is a classical defense against quantum computers. One of the most exciting concepts I've enjoyed reading about is the quantum effects of nonlocality and entanglement, which lead up to what we've been hearing about in the media as "quantum teleportation." Brian Greene explained at length in his book, The Elegant Universe, how this idea of teleporting information through the manipulation of particles that were paired earlier in time could create the potential for a perfect cryptography.

The mechanism of using paired particles involves first creating an entangled wave state between two particles (or splitting a single quantum wave in two), which can then separate spatially over time and transmit information when those particles are read in remote locations. The concept stems from a prediction of Einstein, Podolsky and Rosen, known as the EPR paradox. This process of causing interaction between two particles that remain inherently connected while spatially separated was what Einstein famously dismissed as "spooky action at a distance." We are now relatively proficient at using beam splitters and other light-control means to create the entangled wave states needed for this kind of rendering, sending and capturing of particle signatures over short distances. But they can't be used for long-distance telephony at present, because the wave state involved is too fragile; you'd typically need a vacuum to carry the signals, and that is not easy to maintain on the surface of the Earth over long distances.
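As a toy picture of what "paired" means here, this little simulation (my own illustration in plain Python; real experiments use photons and beam splitters, not arrays of numbers) prepares two quantum bits in an entangled state and reads them repeatedly. The two readings always agree, however far apart we imagine the readings taking place:

    # Toy model of an entangled pair: the Bell state (|00> + |11>) / sqrt(2).
    import math, random

    state = [1 / math.sqrt(2), 0.0, 0.0, 1 / math.sqrt(2)]   # amplitudes for 00, 01, 10, 11

    def measure_pair(state):
        """Sample one joint outcome according to the squared amplitudes."""
        probs = [a * a for a in state]
        r, total = random.random(), 0.0
        for idx, p in enumerate(probs):
            total += p
            if r < total:
                return idx >> 1, idx & 1   # particle A's bit, particle B's bit
        return 1, 1                        # fallback for floating-point rounding

    print([measure_pair(state) for _ in range(5)])
    # only (0, 0) and (1, 1) ever appear: the two readings are perfectly correlated

The catch, and the reason this isn't a faster-than-light telegraph, is that each individual reading is still random; the correlation only shows up when the two sides compare notes afterward.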

The broad application of this kind of reliable paired-particle plumbing for the purpose of messaging is many orders of magnitude more complex in scale than even the work involved in creating a quantum computer, so the practical application of its benefits lies many decades further in the future than where we find ourselves today. For now, our best hope for reliable encryption for our internet and software industries is to rely on means of encryption designed to be too complex for a computer of any type to decrypt. The methods for creating this kind of incredibly strong encryption are themselves fascinating, so I'll get to those in a subsequent post.

While it will be hard to determine when quantum computers will be turned toward exploiting vulnerabilities in legacy encryption, there are many who would agree that it's time to start battening down the hatches now.