
An Introduction to Quantum Accelerated Supercomputing

The development of useful quantum computing is a massive global effort, spanning government, enterprise, and academia. Quantum computing could help solve some of the world's most challenging problems in areas such as materials simulation, climate modeling, risk management, supply chain optimization, and bioinformatics.

Realizing the benefits of quantum computing requires integrating quantum computers into existing supercomputing infrastructures and scientific computing workflows and enabling domain scientists to program them in familiar languages and with familiar tools. 

This democratization of workflows and development of a robust and performant software stack is exactly what enabled GPUs to revolutionize supercomputing. Building from the same framework that made accelerated computing successful is critical for quantum computers to evolve from research projects to enablers of science. This idea is called quantum accelerated supercomputing.

This post provides a foundational understanding of quantum computers and how quantum accelerated supercomputing leverages their strengths and handles their weaknesses. It concludes with some practical considerations for developers looking to build robust and performant quantum accelerated workflows that will scale as quantum computers develop.

Video 1. Explore the cutting-edge world of quantum accelerated supercomputing

Building blocks of quantum computers: QPUs and qubits

Quantum accelerated supercomputing leverages quantum processing units (QPUs) to perform quantum computing tasks. The heart of a QPU is a collection of two-level quantum physical systems known as quantum bits, or qubits. Qubits can be created in a wide variety of ways, with current approaches leveraging trapped ions, light polarization, and current moving through superconducting loops, to name a few.

Unlike a classical bit, which can only be in the 0 or 1 state, a qubit can exist in a combination of both states simultaneously, enabling far more flexibility to encode information. This is known as superposition and is why describing the state of a QPU with N qubits requires an exponential amount (2^N) of classical information. Multiple qubits interact through phenomena called entanglement and interference, which allow for unique ways to process the quantum information held within the qubits.

A full quantum state can never be directly accessed. The only way to extract information from a quantum computer is to measure each qubit, which probabilistically returns a 0 or 1 and collapses its superposition to the corresponding classical state. The distribution of many repeated measurements (samples) provides additional insight into properties of the quantum state that cannot be directly observed. The state of a qubit is often represented as a vector on the Bloch sphere depicted in Figure 1.

[Figure: the Bloch sphere, with the states |0> and |1> at the north and south poles and a general state |psi> drawn as a vector from the center of the sphere to its surface]
Figure 1. A qubit is a small physical object such as an atom or superconductor that can exhibit quantum effects
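
To make superposition and probabilistic measurement concrete, the short sketch below uses plain NumPy to build the equal superposition (|0> + |1>)/sqrt(2) and sample 1,000 measurements from it. This illustrates the underlying math only; the statevector convention and variable names are illustrative, not QPU code.

```python
import numpy as np

# A single qubit is a length-2 complex vector; N qubits need a length-2**N vector.
ket0 = np.array([1.0, 0.0], dtype=complex)

# The Hadamard gate puts |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ ket0                       # (|0> + |1>) / sqrt(2)

# Measuring returns 0 or 1 probabilistically, with probabilities |amplitude|^2.
probs = np.abs(psi) ** 2             # [0.5, 0.5]
samples = np.random.default_rng(0).choice([0, 1], size=1000, p=probs)
print(np.bincount(samples))          # roughly 500 zeros and 500 ones
```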

Quantum computing hardware and algorithms

Quantum computing can be simplified into two primary components: the hardware (the QPU) and the quantum algorithm. The hardware is an incredibly complex and precisely engineered device that protects, manipulates, and measures the qubits.

Many types of qubits are being used to build QPUs, and each type requires a completely different hardware design. For example, a trapped-ion QPU operates on qubits with lasers, while a superconducting QPU does so with microwave pulses. Each architecture comes with pros and cons related to qubit quality, speed, scalability, and so on.

Qubits are extremely sensitive to the environment and even the smallest perturbations can cause decoherence (destruction of the quantum information) and result in erroneous computations. Avoiding decoherence is extremely difficult and is the primary barrier to realizing a useful QPU.

The second aspect of quantum computing is the quantum algorithm. A quantum algorithm is a set of mathematical operations that manipulates the quantum information stored in a set of qubits to produce a meaningful result when the qubits are measured. Quantum algorithms are generally represented as quantum circuits, as depicted in Figure 2. The two horizontal lines represent the two qubits in the system, each beginning in the |0> state. The boxes and lines that follow represent single-qubit or two-qubit gate operations.

[Figure: a two-qubit quantum circuit read left to right, with each qubit initialized in the |0> state, followed by gate operations and ending with a measurement on each qubit]
Figure 2. A simple quantum circuit that depicts the entangling of two qubits
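
The circuit in Figure 2 can be reproduced numerically: apply a Hadamard gate to the first qubit, then a CNOT gate, and sample the result. The sketch below uses standard textbook gate matrices and a qubit-0-as-most-significant-bit ordering convention; it is an illustration of the math, not any particular vendor's API.

```python
import numpy as np

# Two-qubit statevector |00> (ordering: qubit 0 is the most significant bit).
psi = np.zeros(4, dtype=complex)
psi[0] = 1.0

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I = np.eye(2, dtype=complex)
CNOT = np.array([[1, 0, 0, 0],       # control = qubit 0, target = qubit 1
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

psi = np.kron(H, I) @ psi            # Hadamard on qubit 0
psi = CNOT @ psi                     # entangle the two qubits

probs = np.abs(psi) ** 2             # [0.5, 0, 0, 0.5]
counts = np.random.default_rng(0).multinomial(1000, probs)
for bits, c in zip(["00", "01", "10", "11"], counts):
    print(bits, c)                   # only 00 and 11 appear, ~500 times each
```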

The design of quantum algorithms is complex because it must take a real-world problem, consisting of classical information, and formulate it so that it can be mapped to a quantum system, manipulated as quantum information, and then transformed back into a classical solution. In general, a successful quantum algorithm will efficiently prepare a quantum state that, when sampled many times, produces the “correct answer” with a high probability.

Each operation in a quantum circuit corresponds to an extremely precise physical interaction with the target qubits and introduces noise into the system. The noise can rapidly accumulate and produce incoherent results. To run practical quantum algorithms, quantum error correction (QEC) codes will be necessary to ensure that the quantum information is protected by encoding logical qubits using many noisy physical qubits. The development of robust and efficient QEC codes remains one of the largest barriers to solving practical problems on a QPU.
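
As a toy illustration of the encoding idea, the sketch below simulates the classical analogue of the three-qubit bit-flip code: one logical bit is copied across three noisy physical bits and recovered by majority vote, suppressing the error rate from roughly p to roughly 3p^2. Real QEC codes, such as surface codes, act on quantum states and extract error syndromes without reading the data qubits directly, which this simplified model does not capture.

```python
import numpy as np

rng = np.random.default_rng(1)
p = 0.05                                   # probability a physical (qu)bit flips

def logical_error_rate(trials=100_000):
    logical = rng.integers(0, 2, size=trials)             # logical bit values
    encoded = np.repeat(logical[:, None], 3, axis=1)      # 3 physical copies each
    flips = (rng.random((trials, 3)) < p).astype(int)     # independent noise
    noisy = encoded ^ flips
    decoded = (noisy.sum(axis=1) >= 2).astype(int)        # majority-vote decoding
    return np.mean(decoded != logical)

print(f"physical error rate: {p}")
print(f"logical error rate:  {logical_error_rate():.4f}")  # ~3p^2 = 0.0075, well below p
```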

What are some workflows that QPUs could accelerate? 

It is a common misconception that QPUs will accelerate any sort of computation. This will probably not be the case, as QPUs are well-suited for very specific tasks. One of the primary weaknesses of a quantum computer is the fact that information can only be extracted via nondeterministic measurements of the N qubits to produce a bitstring of length N. Therefore, it is important to understand the types of problems that are either theoretically proven or expected to have efficient implementations on a QPU, as listed below.

  • Simulating quantum systems: QPUs, quantum systems themselves, are naturally good at simulating other quantum systems. This could enable all sorts of fundamental science ranging from the exploration of new chemical reactions and materials to unlocking the mysteries of high-energy physics.
  • Optimization: The exponential amount of information held in a QPU could allow for new methods aimed at finding better solutions to large combinatorial optimization problems, benefitting diverse use cases including route planning, grid optimization, genetics, and portfolio selection.
  • AI and machine learning: The properties of QPUs make them amenable for building and sampling from complex distributions and deploying novel methods for finding patterns in high-dimensional data sets. These techniques could be highly portable and benefit almost any domain of science and industry.
  • Monte Carlo estimation: QPUs can obtain a theoretical quadratic speedup for Monte Carlo estimation, which would improve the accuracy and speed of obtaining risk metrics and financial predictions critical for getting an edge in the markets (see the scaling sketch after this list).
  • Fluid dynamics: Aerodynamic, weather, and reservoir simulations are examples of multiscale problems where systems of differential equations need to be solved with extreme precision using a large grid. QPUs are being explored as tools for solving systems of differential equations that enable far more precise fluid dynamic simulations.
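
The Monte Carlo item refers to the well-known quadratic speedup of quantum amplitude estimation: the estimation error shrinks roughly as 1/M with M quantum queries, versus 1/sqrt(M) with M classical samples. The sketch below only demonstrates the classical scaling empirically; the quantum behavior is noted in a comment and is not simulated.

```python
import numpy as np

rng = np.random.default_rng(0)
true_mean = 0.3                     # probability we are trying to estimate

for samples in (100, 10_000, 1_000_000):
    draws = rng.random(samples) < true_mean
    error = abs(draws.mean() - true_mean)
    expected = np.sqrt(true_mean * (1 - true_mean) / samples)
    print(f"{samples:>9} classical samples -> error ~ {error:.4f} "
          f"(expected ~ {expected:.4f})")

# Quantum amplitude estimation would reach error ~ 1/M after M oracle queries,
# a quadratic improvement over the classical ~ 1/sqrt(M) behavior shown above.
```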

These are just some of the potential QPU applications. As the hardware and algorithm research continue, it is likely that this list will expand, and novel applications and use cases will be discovered that are beyond our current understanding.

How will supercomputers enable quantum computing?

A second misconception about QPUs is that they will diminish the importance of today’s computers by taking over the subroutines that currently consume the bulk of supercomputing resources. The inherent constraints of a QPU quickly dispel this notion. Deploying any quantum accelerated workflow will require significant support from high-performance CPUs and GPUs, along with cutting-edge AI techniques. In practice, quantum accelerated supercomputing enables greater flexibility, so each processor can complete the tasks it is best suited for.

Quantum error correction (QEC) was previously discussed as a prerequisite for practical quantum computing. QEC is a great example of how tight coupling of a QPU with HPC devices is an absolute necessity. QEC codes will need to repeatedly encode logical qubits, perform logical operations, and decode errors. 

The bulk of this work will need to happen in real-time on auxiliary CPUs and GPUs while a quantum algorithm is running. The classical processors must be tightly coupled to the QPU; otherwise, latency could make the QEC procedures prohibitively slow. It is likely that many QEC codes will require fast execution of computationally intense machine learning procedures that would require scalable accelerated computing.
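
The real-time coupling described above can be pictured as a streaming loop: syndrome measurements come off the QPU's control system, a decoder running on a tightly coupled CPU or GPU chooses a correction, and that correction is fed back before errors accumulate. The sketch below is purely schematic; read_syndrome, decode, apply_correction, and the latency budget are hypothetical placeholders, not any real control-system API.

```python
import time

ROUND_BUDGET_US = 1.0       # illustrative per-round latency budget, in microseconds

def read_syndrome():        # hypothetical: fetch syndrome bits from the QPU controls
    return (0, 1, 0)

def decode(syndrome):       # hypothetical: lookup-table or ML decoder on CPU/GPU
    return "X on qubit 2" if syndrome == (0, 1, 0) else None

def apply_correction(op):   # hypothetical: send the chosen correction back to the QPU
    pass

worst_us = 0.0
for _round in range(10_000):                     # one iteration per QEC cycle
    start = time.perf_counter()
    correction = decode(read_syndrome())
    if correction is not None:
        apply_correction(correction)
    worst_us = max(worst_us, (time.perf_counter() - start) * 1e6)

print(f"worst decode round: {worst_us:.2f} us (budget: {ROUND_BUDGET_US} us)")
# In practice this loop runs in compiled code on processors tightly coupled to the
# QPU; the Python-level timing here only illustrates the latency constraint.
```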

In addition to QEC, other important computational tasks will be necessary before, after, or during a quantum algorithm, such as:

  • Optimizing compilation of quantum circuits for hardware execution
  • Intensive preprocessing and postprocessing routines
  • Efficient management of hardware control systems
  • Management of multi-QPU interactions

It is also important to note that traditional supercomputers currently play a key role in advancing quantum computing research on both the hardware and software front. Quantum simulation (that is, simulating quantum algorithms on classical computers) can obtain results beyond the capabilities of many physical QPUs, as qubits are still noisy and limited in quantity. 

These simulations can advance quantum research in a variety of ways, such as evaluating QPU noise profiles by producing noiseless benchmark data, or determining the viability of new algorithms based on their scaling behavior. Quantum simulation becomes exponentially costly for classical processors, requiring state-of-the-art accelerated computing to push the limits of what simulations can be performed.
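
The exponential cost is easy to quantify for statevector simulation, where N qubits require 2^N complex amplitudes. The quick calculation below assumes double-precision complex values (16 bytes per amplitude); other simulation methods, such as tensor networks, have different trade-offs.

```python
# Memory needed to hold a full statevector of N qubits at double precision.
BYTES_PER_AMPLITUDE = 16          # one complex128 value

for n_qubits in (20, 30, 40, 50):
    n_amplitudes = 2 ** n_qubits
    gib = n_amplitudes * BYTES_PER_AMPLITUDE / 2**30
    print(f"{n_qubits} qubits -> {n_amplitudes:.3e} amplitudes, {gib:,.2f} GiB")
```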

Enabling quantum accelerated workflows

Mature quantum accelerated supercomputing will not be reached at a single point in time. As QPUs and quantum algorithms improve, the scale and nature of meaningful problems they can solve will continually expand. Developing and testing quantum workflows today is necessary to prepare for how and when quantum accelerated supercomputing can be used to solve problems developers care about. 

Listed below are some considerations to keep in mind for developers interested in producing quantum accelerated workflows that are robust and well-positioned for practical application.

  1. QPU agnostic: Applications need to be able to target different QPUs as appropriate with minimal code modifications. Hardware-agnostic development saves developer time and allows greater freedom in algorithm execution (see the sketch after this list).
  2. Integration with classical architectures: Because QPUs will require so much operational support from supercomputers, workflows must be developed so they can easily integrate with scalable CPU and GPU architectures. Low-latency connections will be important, warranting specialized architectures for time-sensitive tasks like QEC.
  3. High-performance libraries: To ensure QPU scalability, highly optimized software libraries must be developed and used so all classical tasks are executed efficiently and within the necessary time constraints.
  4. Accessibility: Quantum computing is highly interdisciplinary and will require direct interaction with domain scientists. Development needs to occur in a setting that is easily accessible to users from diverse computing backgrounds.
  5. User flexibility: Whoever is using the final workflow needs to be able to interact with the code at their preferred level. Users of the same application could range in preference from black-box to highly customizable research implementations.
  6. Stability: It is key that any quantum development occurs on a platform that is stable and evolves with the quantum ecosystem.
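
As a sketch of what hardware-agnostic development can look like, the example below uses NVIDIA's CUDA-Q Python interface (cudaq): the same kernel can be dispatched to a GPU-accelerated simulator or, with credentials configured, to a physical QPU by switching the target. Exact target names and API details depend on the installed CUDA-Q version and available backends, so treat the specifics as illustrative.

```python
import cudaq

@cudaq.kernel
def bell():
    # Prepare and measure an entangled two-qubit state.
    qubits = cudaq.qvector(2)
    h(qubits[0])
    x.ctrl(qubits[0], qubits[1])
    mz(qubits)

# Run on a GPU-accelerated statevector simulator.
cudaq.set_target("nvidia")
print(cudaq.sample(bell, shots_count=1000))

# With provider credentials configured, the same kernel can target a physical QPU,
# for example: cudaq.set_target("quantinuum")  # backend name depends on the provider
```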

Explore quantum accelerated supercomputing 

NVIDIA is working with partners from across the quantum ecosystem to develop powerful, scalable, and easy-to-use tools that enable governments, universities, and industrial corporations to build useful quantum accelerated supercomputing applications.

To learn more, visit NVIDIA Quantum Computing and join us in person or virtually for the NVIDIA GTC 2024 session, Defining the Quantum Accelerated Supercomputer.
