The Four Shifts Needed to Make Quantum Computing Part of Your Analytics Strategy

In this special guest feature, Christopher Savoie, Ph.D., J.D., founder and CEO of Zapata Computing, provides four necessary shifts organizations need to make to turn quantum computing from an exploration of an edge technology to production-ready technology that creates real value. Christopher is a published scholar in medicine, biochemistry, and computer science. His research and business interests over the years have focused on the intersection of machine learning, biology, and chemistry. Christopher is the original inventor of AAOSA, the A.I.-based natural language interface technology used to develop Apple’s Siri. He currently sits on the steering committee of the US Quantum Economic Development Consortium (QED-C). Christopher is also a licensed attorney and has served as the Vice-Chairman of the Big Data Committee of the American Bar Association.

It doesn’t take long in any quest for quantum computing insight to encounter the hype cycle we are currently in. The popular press waxes poetic about intractable problems solved in minutes by a quantum computer that would take a classical computer thousands or even millions of years. While there’s a case to be made that such claims will eventually become reality for certain use cases (in fact, some reportedly already have), the vast majority of real, near-term business value from quantum computing will be far less headline-grabbing — but still extremely important from a bottom-line perspective. Most of that value involves a new edge for corporate analytics and decision-making. Boring? Yes. But valuable? Absolutely.

I’ll go one step further and state that even those of us who think we work in the “quantum computing industry” aren’t really in the quantum computing business. The reality is that we’re in the business of helping companies make better data-driven decisions. This is particularly true in areas where classical computing bumps up against its limits, for instance in molecular simulation for drug and materials discovery, delivery-route optimization, and other incredibly complex computational problems.

Analytics generated by quantum devices, working in tandem with classical high-performance computing (HPC) systems, can help companies push past existing limitations to create better models, find better optima, and uncover correlations that either are not possible with classical computing alone or are so difficult that they would take many years to compute.

It’s important to highlight the “working in tandem” element here. Quantum solutions will always be quantum-classical hybrids because of the need to pre- and post-process data. Plus, today’s quantum computers have relatively few qubits (the analogue of bits in classical computers). This means that production applications of quantum computers will, ironically, include only a few, albeit extremely powerful, quantum steps.
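To make that hybrid structure concrete, here is a minimal sketch of what such a pipeline can look like. The run_quantum_step function is a hypothetical placeholder standing in for a call to real quantum hardware or a simulator; everything around it is ordinary classical code.

```python
import random

def preprocess(raw_records):
    """Classical pre-processing: clean and encode the raw data."""
    return [float(r) for r in raw_records if r is not None]

def run_quantum_step(encoded_data, shots=1000):
    """Hypothetical placeholder for the one quantum step in the workflow.

    In a real pipeline this would submit a small parameterized circuit to
    quantum hardware or a simulator and return measurement statistics.
    Here we simply mimic a distribution over bitstrings.
    """
    outcomes = {}
    for _ in range(shots):
        bitstring = "".join(random.choice("01") for _ in range(4))
        outcomes[bitstring] = outcomes.get(bitstring, 0) + 1
    return outcomes

def postprocess(counts):
    """Classical post-processing: turn measurement counts into a decision metric."""
    total = sum(counts.values())
    return {bits: n / total for bits, n in counts.items()}

if __name__ == "__main__":
    raw = [1.2, None, 3.4, 0.7]
    encoded = preprocess(raw)           # classical
    counts = run_quantum_step(encoded)  # the (placeholder) quantum step
    result = postprocess(counts)        # classical
    print(max(result, key=result.get))
```

The shape of the pipeline, not the placeholder internals, is the point: most of the code is classical, and the quantum call is one small, swappable step.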

The stage of “quantum-readiness”, wherein companies begin building quantum-classical hybrid applications, is the next step in our journey to create more value from the data that we have. From my perspective, as someone who leads a company and has invented novel quantum AI technologies, that is extremely exciting!

Speaking of perspective, it’s important to keep it in mind as we enter the next 3-5 years of quantum computing development. It behooves us all to humbly understand our position in the universe and where quantum fits — particularly in the enterprise, where these computers are first going to be used in production.

This raises the question: how do we actually make quantum computing useful in the near term in a production-level environment?

Let’s Make This An Analytics Conversation

When we make the near-term value of quantum computing an analytics conversation and NOT a story about the magical qubits we possess, it becomes less about the technology “hype” (e.g., which hardware vendor has the highest number of qubits or fidelity levels this month) and more about how we’re going to use this new compute method to solve seemingly intractable business problems.

Today, we are finding these problem-solving use cases in the data science and machine learning (DSML) field. The way quantum computing can approach DSML challenges, especially the sheer quantity of data it can process and the randomness it can govern, gives it an advantage that is not just about processing speed per se, but also about the fit and accuracy of models. The net result is a real production advantage, even in the near term.

The Four Shifts to Make Quantum Computing Viable

We’ve learned in our customer work that there are four necessary shifts organizations need to make to turn quantum computing from an exploration of an edge technology to production-ready technology that creates real value.

Shift 1: From an Insurance Policy to Built Capability

Many of the corporate and government organizations looking into quantum computing see it as a “technology insurance policy” that will prevent them from being late to the party, as opposed to proactively building production-ready assets. There’s a mindset of “We’re going to be disrupted by quantum, so we have to get ahead of it.” This mindset is helpful, but in practice finding quantum talent and building capabilities can’t be done overnight. Specifically, organizations can’t dabble in quantum with a few POCs, do some education, and call it readiness. It’s like playing a game not to lose instead of trying to win — and this will be a difficult game, much like AI adoption was. 

The truth is that one needs to start building infrastructure, partnerships and workforce development well ahead of the deployment phase, or one will already be behind the curve. We have seen this time and time again in technology adoption: with e-commerce, with cloud, with big data, and more recently with AI and machine learning.

The better way forward is to consider quantum computing as an integral part of current classical AI/ML strategies and workstreams — including DSML — while building that infrastructure now. In other words, build to win. This is the first on-ramp to long-term quantum computing success.

Shift 2: From Algorithms to Workflows

This shift requires moving away from using quantum algorithms as an innovative way to solve mathematical problems and toward figuring out how quantum computers can help power an entire computational workflow that incorporates the real data, outputs, and dashboards that decision-makers need. It’s important to keep in mind that, while a small, specific part of this next-generation infrastructure will be quantum, most of it will still be classical. Quantum computing as it exists today is relatively powerful, but it does not operate in a vacuum. And quantum hardware will not live next to your data for the foreseeable future.

Successful shifts from algorithms to workflows require collaboration between many stakeholders and operators. The most effective workflow architectures help individuals focus on their areas of expertise as part of a larger effort. For example, an algorithm expert, an ETL expert for supply chain management, and a domain expert in delivery logistics may all contribute to the creation of one solution.

To give an example, the newest quantum algorithm alone is not going to provide a computational speed-up on a system that is slowed down by data ingestion because the DSML infrastructure and ETL workflow are poorly constructed. A holistic, collaborative method of integrating quantum computing into workflows is going to be increasingly important to organizations looking to win — otherwise the classical overhead could undo any advantage the quantum steps have produced.
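As a simple illustration of why the classical overhead matters, the sketch below times each stage of a hypothetical workflow. The stage functions are placeholders with artificial delays; the point is that if ingestion and ETL dominate the wall clock, a faster quantum step changes very little.

```python
import time

def timed(stage_name, fn, *args, **kwargs):
    """Run one workflow stage and report how long it took."""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    elapsed = time.perf_counter() - start
    print(f"{stage_name}: {elapsed:.3f}s")
    return result

# Hypothetical stage functions: stand-ins for real ETL, a quantum call,
# and reporting code in an actual workflow.
def ingest_and_transform():
    time.sleep(2.0)   # slow, poorly constructed ETL dominates the run
    return list(range(1000))

def quantum_subroutine(data):
    time.sleep(0.1)   # the quantum step itself is a small fraction of the time
    return sum(data)

def build_dashboard(value):
    time.sleep(0.5)
    return f"KPI = {value}"

if __name__ == "__main__":
    data = timed("ingest/ETL", ingest_and_transform)
    answer = timed("quantum step", quantum_subroutine, data)
    report = timed("reporting", build_dashboard, answer)
    print(report)
```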

Shift 3: From Fragmented to Unified

Organizations, especially at the enterprise and government levels, operate in a universe of highly fragmented data architecture and compute. Quantum computing is going to exacerbate this fragmentation. There could be HPC clusters, quantum hardware and software, ETL processes, databases, S3 buckets and Java frameworks — just to name a few of the key components.

While this data and compute universe might work to a certain extent, it probably isn’t efficient. What’s needed to unify all the components is a workflow that organizes the mess of data and compute swamps. The workflow’s job is to orchestrate and unify all of the components (quantum and classical) so they can be used efficiently in production. That unification is how DSML can improve now, with a boost from quantum-driven requirements.
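One way to picture that unification is a single declarative workflow definition that names every step and the kind of resource it runs on. The sketch below is a generic, hypothetical structure (not any particular vendor’s orchestration format), just to show classical and quantum steps living in one place under one scheduler.

```python
# A hypothetical, vendor-neutral workflow definition: each step names the
# resource class it needs, so one orchestrator can schedule all of them.
WORKFLOW = [
    {"name": "extract_orders",  "backend": "database",       "depends_on": []},
    {"name": "feature_etl",     "backend": "spark_cluster",  "depends_on": ["extract_orders"]},
    {"name": "classical_model", "backend": "hpc_cluster",    "depends_on": ["feature_etl"]},
    {"name": "quantum_refine",  "backend": "quantum_device", "depends_on": ["classical_model"]},
    {"name": "publish_results", "backend": "object_storage", "depends_on": ["quantum_refine"]},
]

def run(workflow):
    """Toy orchestrator: run steps in dependency order and log the backend used."""
    done = set()
    while len(done) < len(workflow):
        for step in workflow:
            if step["name"] in done:
                continue
            if all(dep in done for dep in step["depends_on"]):
                print(f"running {step['name']} on {step['backend']}")
                done.add(step["name"])

if __name__ == "__main__":
    run(WORKFLOW)
```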

Shift 4: From Exploration to Operationalization

If there’s anything I’ve learned from our work with enterprise and government, it’s this: you can’t wait for the shiny, fault-tolerant quantum computer to show up years from now and just flip the switch to include quantum in your data and analytics processes. We learned this from classical DSML as well — it just doesn’t work that way. It’s not going to be one moment; it’s a journey, and organizations that haven’t done the work to create infrastructure and capability won’t have the option to include the power of quantum.

The way it does work is not only through model creation, but also by building QuantumOps and AIOps infrastructure, including pre- and post-processing. One example would be putting ETL jobs and multiple types of compute in the same workflow. Further, consider where the data and compute sit — public cloud, private cloud, on premises, HPC resources, and so on.
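As a small, hypothetical illustration of the “where do the data and compute sit” question, the helper below encodes one common placement rule: keep the heavy pre-processing next to the data and ship only the reduced problem to a remote quantum backend. The dataset names and backend labels are made up for the example.

```python
# Hypothetical placement rules: keep heavy pre-processing next to the data,
# and send only the small, reduced problem to the (remote) quantum backend.
DATA_LOCATIONS = {
    "sales_db": "private_cloud",
    "sensor_feed": "on_premises",
    "simulation_output": "hpc_center",
}

def placement_plan(dataset, quantum_backend="cloud_quantum_service"):
    """Decide where each part of the workflow should run for a given dataset."""
    data_site = DATA_LOCATIONS.get(dataset, "public_cloud")
    return {
        "etl_and_preprocessing": data_site,   # run where the data already lives
        "quantum_step": quantum_backend,      # ship only the reduced problem
        "postprocessing_and_dashboards": data_site,
    }

if __name__ == "__main__":
    print(placement_plan("sensor_feed"))
```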

The global consulting firm Gartner advises organizations to focus not only on model building but also on model operationalization, because an alarming number of models developed for deployment are never actually operationalized. Specifically, Gartner predicts that by the end of 2024, 75% of enterprises will have shifted from piloting AI to operationalizing it, which would produce a fivefold increase in streaming data and analytics infrastructures.

Quantum computing, I believe, is on a similar trajectory, although it trails AI in maturity. However, quantum makes this shift much more important and acute because we’re not just talking about pre-existing algorithms from the classical world; we’re adding a new level of complexity with computation using quantum physics at “quantum scale”. For that reason, we need to be deliberate about operationalizing quantum models at scale, so that quantum becomes a production technology and delivers impact instead of hype.

What Can You Do Today?

So, how can your organization use quantum computing to manifest the new edge for analytics and decision-making promised at the outset of this article? There are three steps.

  1. Engage data scientists and data consumers within business units to identify an AI/ML problem where quantum computing could unlock a new speed-up or performance improvement. Generative models are a promising starting point.
  2. Create a real computational workflow with real data sources that leverages mostly classical compute, with one or two powerful quantum steps that can be achieved with today’s quantum devices.
  3. Deploy a quantum or quantum-inspired workflow that is forward-compatible as the technology matures (see the sketch after this list).
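On the third step, forward compatibility can be as simple as putting the quantum step behind a small interface, so that the simulator or quantum-inspired routine you can deploy today can later be swapped for real hardware without touching the rest of the workflow. A minimal, hypothetical sketch (class and method names are illustrative, not any provider’s SDK):

```python
from abc import ABC, abstractmethod
import random

class QuantumStep(ABC):
    """Interface for the quantum (or quantum-inspired) part of the workflow."""

    @abstractmethod
    def sample(self, parameters, shots=1000):
        ...

class SimulatedStep(QuantumStep):
    """What you can deploy today: a simulator or quantum-inspired sampler."""
    def sample(self, parameters, shots=1000):
        return [random.random() for _ in range(shots)]

class HardwareStep(QuantumStep):
    """Drop-in replacement once hardware access is ready (hypothetical)."""
    def sample(self, parameters, shots=1000):
        raise NotImplementedError("wire this up to your quantum provider's SDK")

def run_workflow(step: QuantumStep):
    """The rest of the workflow only sees the interface, never the backend."""
    samples = step.sample(parameters=[0.1, 0.4], shots=100)
    return sum(samples) / len(samples)   # classical post-processing

if __name__ == "__main__":
    print(run_workflow(SimulatedStep()))
```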

These steps, though not easy, are real actions that you can take today to produce real results with near-term quantum devices. They transform messy data and analytics architectures into well-orchestrated workflows. This is the path away from hype and toward real business impact with quantum computing.
