
Preparing for a Quantum Future Starts with Building the Right Computational Architecture

Image Credit: Usis/BigStockPhoto.com

As the digital transformation era progresses and the need for large-scale computing power (what we call Big Compute™) increases, mentions of “quantum computing” and interest in this technology are growing rapidly. The potential value at stake is enormous: BCG estimates that quantum computers could generate $5-10 billion in value for users and providers over the next three to five years, and $450-850 billion in value over the next 15 to 30 years. But as hype builds around quantum computing, enterprises need to determine how and when it will disrupt their businesses, and what they can do today to prepare.

More enterprises are showing interest in implementing quantum computing because of their mounting compute needs, but as with most innovative technologies, the hardest part is getting started. A survey found that the biggest hurdle to quantum adoption is the complexity of integrating quantum with existing IT infrastructure, a response shared by 49% of organizations that have adopted or planned to adopt quantum computing. What’s more, 37% of these organizations cite a lack of advanced computational capabilities to build on as another major hurdle to quantum adoption. Despite these challenges, the potential of a broad quantum advantage in the coming years means companies should start the process of becoming “quantum-ready” now. The technology is advancing rapidly, and companies that wait could be left behind quickly.

Most work in quantum adoption today is focused on identifying use cases - for good reason. Many businesses have a bevy of potential use cases, and it’s not always clear which computational challenges are a good fit for quantum, especially in the near term while the technology is still maturing. This is a sensible place to start, and it usually includes working with data scientists, new talent with relevant expertise, and outside experts to identify problems that quantum can address. Interdisciplinary teams across research, engineering, and business will bring diverse perspectives on where quantum could add value. But while this is a critical first step, don’t get lost in the algorithmic puzzle of fitting quantum to your business problems. Once you’ve identified your business’s best fit, the work is just beginning.

Wrangling the growing diversity of quantum and classical compute options

Quantum computing brings an explosion of options when it comes to choosing where to send a compute job. There are multiple competing quantum device technologies (each with multiple companies working in that area). A “qubit,” the fundamental unit of quantum computing, can be electrons in a superconductor, ions trapped by lasers in a vacuum, and even impurities in a diamond. Because of differences in how these techniques work, each may end up excelling at certain types of computational problems. These new animals in the quantum zoo extend an already large menagerie of classical computing options such as CPUs, GPUs, TPUs, FPGAs, and more. Managing this diversity of compute successfully is going to bring both great rewards and great challenges.

Additionally, quantum computing will only exist in tandem with classical computing and will need to be integrated into current computational architectures. There will never be a fully quantum architecture - many tasks will always run better on classical computers than on quantum ones. Even well-suited quantum applications still require a lot of classical computing to support the quantum device.
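This hybrid pattern shows up clearly in variational algorithms, where a classical optimizer repeatedly adjusts parameters and a quantum device evaluates each candidate. The sketch below is illustrative, not any vendor's API: the "quantum" step is stubbed with a classical cosine model, and the function names and toy cost function are assumptions.

```python
# Hedged sketch of the hybrid classical/quantum loop behind variational
# algorithms. quantum_expectation() stands in for submitting a
# parameterized circuit to quantum hardware; here it is a classical stub.

import math

def quantum_expectation(theta: float) -> float:
    # Stand-in for measuring an expectation value on a quantum device.
    return math.cos(theta)

def minimize(theta: float = 3.0, lr: float = 0.1, steps: int = 100) -> float:
    # Classical gradient descent using the parameter-shift rule, which
    # estimates the gradient from two extra quantum evaluations per step.
    for _ in range(steps):
        grad = 0.5 * (quantum_expectation(theta + math.pi / 2)
                      - quantum_expectation(theta - math.pi / 2))
        theta -= lr * grad
    return theta

theta_opt = minimize()
print(round(quantum_expectation(theta_opt), 3))  # ≈ -1.0, the minimum of cos
```

Note that even in this toy, the quantum device is called hundreds of times inside a classical control loop - the classical side does most of the work.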

Addressing architecture challenges

Having the right computing architecture in place is key to getting the most value out of quantum computing, so it’s important to address any existing challenges today. The good news is that machine learning (ML) workloads and high-performance computing (HPC) use cases can also benefit from a quantum-ready architecture. These domains all share complex mathematics, intensive compute and data requirements, and expensive, long run times to solve computationally complex problems.

To address these challenges, a unified computational workflow platform that orchestrates data, models, and compute across classical and quantum devices can be a major asset. Computational workflows are increasingly common in machine learning and analytics, and a particularly good fit for quantum. In addition to orchestrating the distribution of tasks across compute resources, workflows also help teams work together. Experts can focus on individual components where they have expertise, and the workflow describes how those components fit together into a complete solution. Lastly, workflow orchestration systems can also automate data management alongside code execution to support large-scale research and solutions.
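To make the idea concrete, the sketch below shows a minimal, hypothetical workflow that routes each task to a declared backend. The `Task`/`Workflow` classes, the backend labels, and the routing logic are illustrative assumptions, not the API of any particular orchestration platform.

```python
# Minimal, hypothetical sketch of a workflow that orchestrates tasks
# across declared backends. A real platform would submit each task to
# its backend; here everything runs locally for illustration.

from dataclasses import dataclass, field
from typing import Any, Callable, Dict, List

@dataclass
class Task:
    name: str
    backend: str              # e.g. "cpu", "gpu", or "quantum"
    fn: Callable[..., Any]
    deps: List[str] = field(default_factory=list)

class Workflow:
    def __init__(self) -> None:
        self.tasks: Dict[str, Task] = {}

    def add(self, task: Task) -> None:
        self.tasks[task.name] = task

    def run(self) -> Dict[str, Any]:
        # Simple topological pass: run a task once all its deps are done.
        results: Dict[str, Any] = {}
        pending = dict(self.tasks)
        while pending:
            for name, task in list(pending.items()):
                if all(d in results for d in task.deps):
                    inputs = [results[d] for d in task.deps]
                    results[name] = task.fn(*inputs)
                    del pending[name]
        return results

# Usage: classical preprocessing feeds a (stubbed) quantum step,
# followed by classical post-processing.
wf = Workflow()
wf.add(Task("prepare", "cpu", lambda: [0.1, 0.2, 0.3]))
wf.add(Task("estimate", "quantum", lambda params: sum(params), deps=["prepare"]))
wf.add(Task("report", "cpu", lambda e: f"energy={e:.2f}", deps=["estimate"]))
print(wf.run()["report"])  # -> energy=0.60
```

Because each step declares its dependencies and backend explicitly, different experts can own different tasks, and the orchestrator can decide where each one actually runs.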

Once you have the workflow - the computational recipe for your application - you face a strategic question about the best way to execute it within certain constraints. Do you want to maximize speed, minimize cost, or get the highest quality result? While tools may automate these decisions in the future, for now plan to use benchmarking tests to compare the performance of different compute strategies before selecting one for production. Expect to routinely run benchmark tests as quantum hardware and software evolve to ensure the best configuration is being used for a given application. Tools that streamline benchmarking and assist in comparing execution strategies will help ensure IT budgets are spent wisely.
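The benchmarking loop itself can be very simple. The sketch below times the same workload under different execution strategies and picks the fastest; the strategy names and the toy workload are assumptions standing in for real backend configurations.

```python
# Hypothetical benchmarking sketch: run the same job under several
# execution strategies, time each, and compare.

import time
from statistics import median

def run_workload(strategy: str, n: int = 200_000) -> int:
    # Stand-in for "the same job on different backends/configurations".
    if strategy == "baseline":
        return sum(i * i for i in range(n))
    if strategy == "map-based":
        return sum(map(lambda i: i * i, range(n)))
    raise ValueError(f"unknown strategy: {strategy}")

def benchmark(strategy: str, repeats: int = 5) -> float:
    timings = []
    for _ in range(repeats):
        start = time.perf_counter()
        run_workload(strategy)
        timings.append(time.perf_counter() - start)
    return median(timings)  # median is robust to one-off timing spikes

results = {s: benchmark(s) for s in ["baseline", "map-based"]}
best = min(results, key=results.get)
print(f"best strategy: {best}")
```

Rerunning a harness like this whenever hardware or software changes keeps the "best configuration" answer current rather than frozen at deployment time.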

Unifying infrastructure and building computational workflows today will help set up enterprises for a successful quantum future. Over the next few years, more advanced quantum devices will become available. If an organization’s current architecture for large-scale compute is organized, modular, and flexible - and performing at the peak that classical compute can offer - it will be in a strong position to adopt quantum. Much like bolting a turbocharger onto a jalopy, retrofitting a quantum computer onto a fragmented, poorly constructed architecture riddled with bottlenecks can negate any speedups it delivers. Best to start getting your house in order now.

IT leaders should start by looking into potential use cases for quantum within their business in close collaboration with domain experts across business units, but they shouldn’t stop there. Modernizing compute infrastructure today with an eye towards quantum readiness will begin to pay off now, and when a broad quantum advantage does come to fruition, the right infrastructure and expertise will already be in place.

Author

Tim Hirzel has a BA in Computer Science from Harvard University and an MS from MIT’s Media Lab. He brings extensive experience in managing teams working on data science, machine learning, quantum chemistry, and device simulation. Since 2005, Tim has been a software engineer and architect in science-based technology startups. Today he is focused on delivering a best-in-class quantum computing platform for Zapata and its customers.
