Introduction to Quantum


In this special guest feature, Elizabeth Ruetsch, General Manager for Quantum Engineering Solutions at Keysight Technologies, provides an introduction to quantum and a deep dive into its importance, benefits, security risks, and disruptive potential. Elizabeth has been active in the electronics industry for 28 years – 22 of them in management positions involving assignments at HP, Agilent and, for the past eight years, Keysight. Until 2020, each of those positions involved significant responsibilities in sales, marketing, and product planning, with bases in Boston, Colorado Springs, Beijing, and now Santa Rosa. In her current position, she works with customers in the aerospace, defense, university and commercial sectors on developments concerning quantum, a family of advanced technologies which, although exceptionally promising, are still largely pre-commercial. Elizabeth, who is known to her colleagues as Liz, holds a BSEE from Rutgers and an MBA from Boston University.

The word “quantum” is used as an umbrella term to refer to the emerging field of technologies that harness quantum mechanics to develop fundamentally new capabilities in established fields such as computing, communications, sensing, pharmaceutics, chemistry and materials research.

In its literal sense, the word “quantum” refers to the smallest unit or entity in a physical system that we describe using quantum mechanics. The reason we physicists have a separate formulation of mechanics for the quantum world is that on the scale of really, really small particles, the rules of classical physics don’t necessarily apply, and we observe strange, new behavior that we cannot explain with classical physics. Such phenomena include quantum interference and entanglement, which allows particles that are very far apart to remain linked to one another.
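For readers who want to see what entanglement looks like in the math, here is a minimal sketch using plain numpy state vectors (no quantum hardware or quantum SDK involved; this is purely illustrative). It builds the textbook Bell state and shows that the two qubits’ measurement outcomes are perfectly correlated:

```python
import numpy as np

# Single-qubit basis states |0> and |1>
zero = np.array([1, 0], dtype=complex)

# Hadamard gate: puts a qubit into an equal superposition of |0> and |1>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# CNOT gate: flips the second qubit when the first qubit is |1>
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Start in |00>, apply H to qubit 0, then CNOT -> (|00> + |11>)/sqrt(2)
state = np.kron(zero, zero)
state = np.kron(H, np.eye(2)) @ state
state = CNOT @ state

# Measurement probabilities: only |00> and |11> ever occur, so
# measuring one qubit instantly tells you the other's outcome,
# no matter how far apart they are.
probs = np.abs(state) ** 2
print(np.round(probs, 3))  # [0.5 0.  0.  0.5]
```

The key point is that the final state cannot be written as two independent single-qubit states – the correlation itself is the resource.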

What is the promise of quantum? Why is it important?

The promise of quantum is to push beyond the boundaries of classical physics by harnessing these quantum mechanical properties of matter. Depending on the context, this can offer entirely new ways of processing information that have the potential to be faster and more resource-efficient, enabling us, for example, to calculate things we’ve never been able to calculate before, such as how proteins form, or to predict the complex behavior of financial systems.

Where is quantum’s disruptive potential?

There are quite a few areas in which quantum can potentially be disruptive. To just name a few, consider:

  • Optimization: quantum computers might be able to solve hard optimization problems much faster and might even enable us to solve problems that are completely out of reach today (in terms of the classical computing resources needed).
  • Pharmaceutical/Chemistry research and modeling: quantum simulation can help us understand how molecules and proteins form and lead to breakthroughs in chemistry & biology, drug discovery and healthcare.
  • Cybersecurity: a powerful quantum computer could potentially break existing encryption protocols that rely on factoring large numbers, such as RSA-based encryption protocols. Right now, there is no classical computer or algorithm that can do this within a reasonable amount of time – and so we have the opportunity to develop completely new types of encryption to keep information safe.
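To make the factoring point concrete, here is a toy RSA sketch with deliberately tiny primes (real keys use primes hundreds of digits long; the specific numbers below are for illustration only). Everything an attacker needs – the private exponent – falls out immediately once the public modulus is factored:

```python
# Toy RSA with tiny primes -- illustration only.
p, q = 61, 53                 # secret primes
n = p * q                     # public modulus (3233)
phi = (p - 1) * (q - 1)       # Euler's totient; computing it requires p and q
e = 17                        # public exponent
d = pow(e, -1, phi)           # private exponent, the modular inverse of e

msg = 42
cipher = pow(msg, e, n)       # encrypt with the public key (e, n)
plain = pow(cipher, d, n)     # decrypt with the private key d
assert plain == msg

# An attacker who can factor n recovers p and q, and hence d.
# No known classical algorithm factors large n in reasonable time,
# but Shor's quantum algorithm would do it in polynomial time --
# which is why quantum-resistant encryption is being developed now.
```

The `pow(e, -1, phi)` modular-inverse form requires Python 3.8 or later.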

What are the benefits of quantum computing? What are the risks associated with this technology?

Quantum computing promises efficiency in processing power. The ability to process information faster opens up the possibility of driving fields such as fundamental research, optimization, information technology and pharmaceutics beyond what we ever imagined possible with classical computers alone.

There are anticipated risks to security. It is theoretically known that a large-scale quantum computer could break the public-key encryption schemes in widespread use today. A large outstanding challenge at present is creating security protocols that are secure against both classical and quantum computers.

The unanticipated risks stem from the many yet-unimagined applications of a substantially more powerful computer. A quantum computer’s strength in processing Big Data could have implications for personal privacy.

For example, roughly 1% of US energy consumption goes into the production of fertilizer. This process is inefficient in part because simulating the underlying chemical reaction at a quantum mechanical level is beyond classical computers. A quantum computer could be used to simulate biological and chemical processes such as nitrogen fixation in nitrogenase, increasing production efficiency and leading to a greener approach.

What technical advancements are needed to take quantum computing from a niche existence to the mainstream? 

Right now, we are fundamentally limited by the stability of quantum systems over time and our ability to control them accurately. The unique sensitivity of quantum systems to their environment is what makes them so powerful for computing, but it is also what makes them difficult to control with a great degree of accuracy. Because of this, current quantum computers are very small (consisting of only tens of quantum bits, or qubits, where classical computers have hundreds of millions of bits), and the computations we can perform with these small systems are often inaccurate.
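The gap between tens of qubits and classical hardware can be made concrete with a back-of-the-envelope calculation: classically simulating n qubits means tracking 2^n complex amplitudes, which is one way to see both why quantum systems are hard to emulate and why even modest qubit counts are interesting. A minimal sketch:

```python
def statevector_size(n_qubits: int) -> int:
    # A classical simulation of n qubits must track 2**n complex amplitudes.
    return 2 ** n_qubits

for n in (10, 30, 50):
    amps = statevector_size(n)
    # 16 bytes per double-precision complex amplitude
    print(f"{n} qubits -> {amps} amplitudes, ~{amps * 16 / 1e9:.3g} GB")
```

At around 50 qubits the state vector alone would need tens of petabytes of memory, far beyond any single classical machine.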

To take quantum computing from its niche existence to the mainstream, we need to learn how to better isolate quantum systems from their environment while at the same time controlling them to a much greater degree of accuracy. We need to reduce the errors that we observe in quantum computations and then scale the systems up to hundreds of millions of qubits.

How can we overcome these challenges?

We need to overcome the error problem in quantum computations through innovations in both quantum computing hardware and software. More research is needed to understand the error processes that occur in quantum systems and how to build hardware that is more resilient to those errors. At the same time, advancements in software and in how we implement certain algorithms are needed as we hit the physical limits of chip manufacturing capabilities.
