Quantum technologies hold the promise to revolutionise our society with ground-breaking applications in secure communication, high-performance computing and ultra-precise sensing. A central challenge in scaling up quantum technologies is that the complexity of quantum systems scales exponentially with their size. This poses severe challenges for the efficient calibration, benchmarking and validation of quantum states and their dynamical control. While the complete simulation of large-scale quantum systems may only be possible with a quantum computer, classical characterisation and optimisation methods (supported by cutting-edge numerical techniques) can still play an important role. Here, we review classical approaches to learning quantum systems, their correlation properties, their dynamics and their interaction with the environment. We discuss theoretical proposals and successful implementations across different physical platforms, such as spin qubits, trapped ions, photonic and atomic systems, and superconducting circuits. This review provides a brief background on key concepts recurring across many of these approaches, such as the Bayesian formalism and neural networks, and outlines open questions.
Journal: Nature Reviews Physics
Publication status: Accepted/In press - 14 Nov 2022