Enabling Quantum Computing with AI

Building a useful quantum computer is incredibly challenging in practice. Significant improvements are needed in the scale, fidelity, speed, reliability, and programmability of quantum computers before their benefits can be fully realized, and powerful tools are required to tackle the many complex physics and engineering problems that stand in the way.

AI is fundamentally transforming the technology landscape, reshaping industries, and changing how we interact with the digital world. The ability to turn data into intelligence paves the way for groundbreaking solutions to some of society's most challenging problems. From personalized medicine to autonomous vehicles, AI is at the forefront of a technological revolution, and it is poised to take on many of the hardest problems in quantum computing as well.

Quantum computers will integrate with conventional supercomputers and accelerate key parts of challenging problems relevant to government, academia, and industry. This relationship is described in An Introduction to Quantum Accelerated Supercomputing. The advantages of this integration flow both ways: tight coupling with supercomputers also enables AI to help solve the most important challenges facing quantum computing itself.

This post explores three key aspects of quantum computing where AI can help: the processor, error correction, and algorithms. It also covers practical considerations for building an infrastructure in which AI can most effectively enable quantum computing.

Improving quantum processors

Quantum processors, or QPUs, are marvels of physics and engineering, consisting of many finely tuned systems for protecting and manipulating quantum bits (qubits). Qubits are extremely sensitive, and the slightest sources of noise can corrupt a computation. Optimal control is a key aspect of operating a quantum processor: it ensures that all necessary operations are performed on the qubits in a way that minimizes noise. AI is an important tool for finding optimal control sequences that extract the highest-quality results possible from a quantum processor.

Foundational work presented in Speedup for Quantum Optimal Control from Automatic Differentiation Based on Graphics Processing Units first demonstrated the utility of GPUs for accelerating automatic differentiation in quantum optimal control, achieving a 19x speedup when optimizing the preparation of a 10-qubit GHZ state. It led to the work presented in Model-Free Quantum Control with Reinforcement Learning, which explores the application of reinforcement learning to quantum optimal control problems.
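
To make the idea concrete, gradient-based pulse optimization is straightforward to sketch with a modern autodiff framework. The following minimal example (illustrative only, and much simpler than the setups in the papers above) uses PyTorch to shape a piecewise-constant control pulse that drives a single qubit from |0⟩ to |1⟩; the same machinery extends to multi-qubit targets such as GHZ states, where GPUs provide the speedup.

```python
import torch

# Pauli operators for a single qubit
sx = torch.tensor([[0, 1], [1, 0]], dtype=torch.complex128)
sz = torch.tensor([[1, 0], [0, -1]], dtype=torch.complex128)

n_steps, dt = 50, 0.1
# Piecewise-constant control amplitudes: the parameters being optimized
u = (0.1 * torch.randn(n_steps, dtype=torch.float64)).requires_grad_()

psi0 = torch.tensor([1, 0], dtype=torch.complex128)    # start in |0>
target = torch.tensor([0, 1], dtype=torch.complex128)  # goal: |1>

opt = torch.optim.Adam([u], lr=0.1)
for step in range(300):
    psi = psi0
    for k in range(n_steps):
        h = 0.5 * sz + u[k] * sx                # drift + control Hamiltonian
        psi = torch.matrix_exp(-1j * dt * h) @ psi
    infidelity = 1 - torch.abs(torch.vdot(target, psi)) ** 2
    opt.zero_grad()
    infidelity.backward()  # gradient of fidelity w.r.t. every pulse amplitude
    opt.step()
```

Every quantity in the simulation is differentiable, so the optimizer receives exact gradients of the final-state fidelity with respect to each pulse amplitude, and both the matrix exponentials and the backward pass parallelize naturally on a GPU.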

AI has also been applied to many other aspects of quantum device operation, such as calibration and qubit readout, demonstrating its utility for reducing noise from multiple sources simultaneously while a device runs.

Correcting errors from noisy qubits

Even the best-engineered quantum processors will exhibit qubit noise levels too high to run most algorithms reliably. The theoretical solution is quantum error correction, a procedure that systematically removes errors from quantum computations and ensures reliable results.

The general steps of a quantum error correction procedure are encoding quantum information into logical qubits (each composed of multiple noisy physical qubits), performing algorithmic operations on the logical qubits, decoding which errors occurred (if any), and applying the appropriate corrections. Each step is complex and must be executed efficiently so that errors are corrected and the computation completes before any quantum information is lost or corrupted.
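
As a concrete (and heavily simplified) illustration of these steps, the sketch below simulates the classical skeleton of the distance-3 bit-flip repetition code: encode a bit redundantly, let noise act, measure parity checks to obtain a syndrome, and decode the syndrome to the most likely error. All function names here are illustrative.

```python
import random

def encode(bit):
    # Logical encoding: one logical bit -> three physical bits
    return [bit, bit, bit]

def apply_noise(block, p):
    # Each physical bit flips independently with probability p
    return [b ^ (random.random() < p) for b in block]

def measure_syndrome(block):
    # Parity checks (the analogue of Z1Z2 and Z2Z3 stabilizer measurements);
    # they locate errors without reading out the encoded value itself
    return (block[0] ^ block[1], block[1] ^ block[2])

def correct(block):
    # Map each syndrome to the most likely single bit-flip and undo it
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(measure_syndrome(block))
    if flip is not None:
        block[flip] ^= 1
    return block

# Logical failure: the corrected block no longer decodes to the input bit
trials, p = 100_000, 0.05
fails = sum(correct(apply_noise(encode(0), p)).count(1) >= 2
            for _ in range(trials))
print(f"physical error rate {p}, logical error rate {fails / trials:.4f}")
```

Running this shows the logical error rate falling well below the physical rate (roughly 3p² for small p). Real quantum codes must also handle phase errors and cannot copy quantum states outright, but the encode-measure-decode-correct loop has the same shape.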

Researchers are recognizing that speed, scalability, and an aptitude for complex pattern recognition make AI a fantastic tool for enabling many parts of quantum error correction workflows. For example, a team from the Max Planck Institute and the Friedrich Alexander University in Germany leveraged reinforcement learning to discover new quantum error correction codes and their respective encoders. For details, see Simultaneous Discovery of Quantum Error Correction Codes and Encoders with a Noise-Aware Reinforcement Learning Agent.

The decoding step is another promising target for AI, exemplified by Google’s recent work that explores how recurrent, transformer-based neural networks can be used for decoding a standard quantum error correction code known as the surface code. For more information, see Learning to Decode the Surface Code with a Recurrent, Transformer-Based Neural Network.
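
A toy version of a learned decoder is easy to sketch. Reusing the distance-3 repetition code from above as a stand-in for the surface code, a small network can learn the mapping from measured syndromes to the physical errors that most likely caused them. This is a plain feedforward sketch for illustration, not the recurrent transformer architecture from Google's paper.

```python
import random
import torch

def sample(p=0.1):
    # Random bit-flip pattern on 3 physical qubits and its 2-bit syndrome
    err = [int(random.random() < p) for _ in range(3)]
    syn = [err[0] ^ err[1], err[1] ^ err[2]]
    return syn, err

pairs = [sample() for _ in range(20_000)]
X = torch.tensor([s for s, _ in pairs], dtype=torch.float32)
Y = torch.tensor([e for _, e in pairs], dtype=torch.float32)

# Tiny feedforward decoder: syndrome in, predicted error pattern out
net = torch.nn.Sequential(
    torch.nn.Linear(2, 32), torch.nn.ReLU(), torch.nn.Linear(32, 3))
opt = torch.optim.Adam(net.parameters(), lr=1e-2)
loss_fn = torch.nn.BCEWithLogitsLoss()

for epoch in range(200):
    opt.zero_grad()
    loss = loss_fn(net(X), Y)  # learn which qubits most likely flipped
    loss.backward()
    opt.step()

# Syndrome (1, 0) should implicate qubit 0
print(torch.sigmoid(net(torch.tensor([[1.0, 0.0]]))))
```

The appeal at surface-code scale is that a trained network can absorb correlated, device-specific noise statistics that hand-crafted decoders struggle to model, while running fast enough to keep up with the error correction cycle.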

Figure 1. Quantum computing workflow with AI-enabled tasks labeled in green

Developing efficient quantum algorithms 

Circuit reduction is a critical part of a quantum workflow, ensuring algorithms are as efficient as possible and require minimal resources. The task is extremely difficult and usually requires solving complex optimization problems. Complexity increases further when compiling an algorithm for a specific physical device with its unique constraints, such as qubit topology.
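
The simplest flavor of circuit reduction can be sketched in a few lines: adjacent self-inverse gates acting on the same qubits multiply to the identity and can be dropped. The toy pass below (illustrative only; production compilers layer commutation rules, template matching, and hardware-aware routing on top of this idea) represents a circuit as a list of (gate name, qubits) tuples.

```python
# Toy peephole pass: adjacent identical self-inverse gates on the
# same qubits multiply to the identity and can both be removed.
SELF_INVERSE = {"h", "x", "z", "cx"}

def cancel_adjacent(circuit):
    out = []
    for gate in circuit:
        if out and out[-1] == gate and gate[0] in SELF_INVERSE:
            out.pop()  # G followed by G is the identity: drop both
        else:
            out.append(gate)
    return out

circuit = [("h", (0,)), ("h", (0,)), ("cx", (0, 1)),
           ("x", (1,)), ("x", (1,)), ("cx", (0, 1))]
print(cancel_adjacent(circuit))  # -> []: the whole circuit cancels away
```

What makes the general problem hard, and a good fit for AI, is that most savings are not between adjacent gates: gates must first be commuted past one another, and choosing which rewrites to chain together is a large combinatorial search.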

The problem is so important that major players from across the quantum computing ecosystem are teaming up to find AI-enabled circuit reduction techniques. For example, Google DeepMind, Quantinuum, and the University of Amsterdam recently collaborated on AI methods for reducing the number of resource-intensive T-gates in a quantum circuit. Their results demonstrated that AI can deliver significant improvements over state-of-the-art T-gate reduction techniques on a common benchmark set of quantum circuits.

Another challenge in quantum algorithm design is finding efficient implementations of certain subroutines, such as state preparation. Some known quantum algorithms promise a theoretical speedup but assume the classical problem is already encoded as a quantum state. That encoding cannot be taken for granted: state preparation can itself be an exponentially hard task.
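
A quick back-of-the-envelope sketch shows why. Loading a generic classical vector into the amplitudes of an n-qubit state means specifying 2^n numbers, and a general-purpose loading circuit scales accordingly:

```python
import numpy as np

n = 20                        # a modest register size
v = np.random.rand(2 ** n)    # classical data: already ~1 million values
amps = v / np.linalg.norm(v)  # amplitudes of the target n-qubit state

# A generic amplitude-loading circuit needs on the order of 2^n gates,
# so the cost of preparation can wipe out any downstream speedup.
print(f"{n} qubits -> {len(amps):,} amplitudes to load")
```

Efficient state preparation therefore depends on exploiting structure in the target state, which is exactly where learned, generative approaches come in.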

Chemistry is an excellent example: a good approximation of a molecular quantum state must be prepared before its energy can be computed. A collaboration between St. Jude Children’s Research Hospital, the University of Toronto, and NVIDIA developed a method that uses a generative pretrained transformer (GPT) model for molecular state preparation (Figure 2). This was the first application of GPT to quantum algorithm design, and the approach generalizes to applications beyond chemistry. To learn more, see The Generative Quantum Eigensolver (GQE) and Its Application for Ground State Search.

Figure 2. The Generative Quantum Eigensolver method leverages GPT to prepare circuits for molecular simulation 
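
The core generate-evaluate-update loop behind this style of method can be sketched in a few lines. The toy below is illustrative only: it replaces the GPT with a learned categorical distribution over a small operator pool, uses a classically simulated one-qubit "molecule", and applies a REINFORCE-style update so that low-energy circuits become more likely to be generated.

```python
import numpy as np
from scipy.linalg import expm

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
H = np.array([[1.0, 0.5], [0.5, -1.0]])  # toy 1-qubit "molecule"

# Operator pool: small rotations the generator chains into a circuit
pool = [expm(-1j * t * P) for t in (0.3, -0.3) for P in (sx, sy)]

logits = np.zeros(len(pool))  # stand-in for the GPT policy
rng = np.random.default_rng(0)
best = np.inf

for step in range(2000):
    probs = np.exp(logits) / np.exp(logits).sum()
    seq = rng.choice(len(pool), size=10, p=probs)  # "generate" a circuit
    psi = np.array([1, 0], dtype=complex)
    for g in seq:                                  # run it on |0>
        psi = pool[g] @ psi
    energy = (psi.conj() @ H @ psi).real
    best = min(best, energy)
    for g in seq:  # REINFORCE-style update: favor low-energy circuits
        logits -= 0.05 * energy * (np.eye(len(pool))[g] - probs)

print(f"best sampled energy {best:.3f}, exact ground energy {-np.sqrt(1.25):.3f}")
```

The real GQE swaps the categorical distribution for a transformer that conditions on the gates generated so far and evaluates energies on a QPU rather than a classical simulator, but the training signal flows the same way.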

Explore AI for quantum computing

The value that practical quantum accelerated supercomputing will provide to scientists, governments, and enterprises will only be realized by leveraging the power of AI. This is becoming increasingly clear and is catalyzing greater collaboration between AI and quantum experts.

Effective AI for quantum development requires new tools that foster multidisciplinary collaboration, are highly optimized for each quantum computing task, and take full advantage of the hybrid compute capabilities available within a quantum accelerated supercomputing infrastructure.

NVIDIA is developing hardware and software tools that will enable AI for quantum at scales necessary to realize practical quantum accelerated supercomputing. Visit NVIDIA quantum computing to learn more.
