Google Research

Investigating Quantum Approximate Optimization Algorithms under Bang-bang Protocols

  • Daniel Liang
  • Li Li
  • Stefan Leichenauer
Phys. Rev. Research 2, 033402 (2020)

Abstract

The quantum approximate optimization algorithm (QAOA) is widely seen as a possible application of noisy intermediate-scale quantum (NISQ) devices. We analyze the algorithm as a bang-bang protocol with fixed total time and a randomized greedy optimization scheme. We investigate the performance of bang-bang QAOA on MAX-2-SAT, finding the appearance of phase transitions with respect to the total time. As the total time increases, the optimal bang-bang protocol experiences a number of jumps and plateaus in performance, which match an increasing number of switches in the standard QAOA formulation. At large times, it becomes more difficult to find a globally optimal bang-bang protocol and performance suffers. We also investigate the effects of changing the initial conditions of the randomized optimization algorithm and find that better local optima can be reached by using an adiabatic initialization.
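The abstract describes evaluating a bang-bang protocol of fixed total time (each time slice applies either the cost or the mixer Hamiltonian) and improving it with a randomized greedy search over single-slice flips. The sketch below is a minimal numpy illustration of that idea on a small random MAX-2-SAT instance; it is not the authors' code, and the instance sizes, function names (apply_cost, apply_mixer, greedy), and slice count are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

n = 6          # number of Boolean variables / qubits (illustrative size)
n_clauses = 18

# Random MAX-2-SAT instance: each clause is ((variable, negation flag), (variable, negation flag)).
clauses = []
for _ in range(n_clauses):
    i, j = rng.choice(n, size=2, replace=False)
    clauses.append(((i, rng.integers(2)), (j, rng.integers(2))))

# Diagonal cost operator: number of violated clauses for every computational basis state.
dim = 2 ** n
bits = (np.arange(dim)[:, None] >> np.arange(n)) & 1      # shape (dim, n)
cost = np.zeros(dim)
for (i, si), (j, sj) in clauses:
    lit_i = bits[:, i] != si                               # literal i satisfied
    lit_j = bits[:, j] != sj                               # literal j satisfied
    cost += ~(lit_i | lit_j)                               # +1 where the clause is violated

def apply_cost(state, dt):
    # Diagonal phase evolution exp(-i dt C).
    return np.exp(-1j * dt * cost) * state

def apply_mixer(state, dt):
    # exp(-i dt sum_q X_q) factorizes into single-qubit rotations cos(dt) I - i sin(dt) X.
    psi = state.reshape([2] * n)
    c, s = np.cos(dt), np.sin(dt)
    for q in range(n):
        psi = c * psi - 1j * s * np.flip(psi, axis=q)
    return psi.reshape(dim)

def energy(protocol, total_time):
    # Expected number of violated clauses after running the bang-bang schedule.
    dt = total_time / len(protocol)
    state = np.full(dim, 1 / np.sqrt(dim), dtype=complex)  # |+>^n initial state
    for pulse in protocol:
        state = apply_cost(state, dt) if pulse else apply_mixer(state, dt)
    return np.real(np.vdot(state, cost * state))

def greedy(total_time, n_slices=40, sweeps=30):
    # Randomized greedy search: flip one slice at a time, keep the flip if it lowers the energy.
    proto = rng.integers(2, size=n_slices)                 # random initial schedule
    best = energy(proto, total_time)
    for _ in range(sweeps):
        for k in rng.permutation(n_slices):
            proto[k] ^= 1
            e = energy(proto, total_time)
            if e < best:
                best = e
            else:
                proto[k] ^= 1                              # revert the flip
    return proto, best

proto, e = greedy(total_time=4.0)
print(f"expected violated clauses: {e:.3f} out of {n_clauses}")

The adiabatic initialization mentioned in the abstract could be mimicked here by replacing the random initial schedule with one that is mixer-dominated early and cost-dominated late, rather than uniformly random.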
