Abstract
Training quantum neural networks (QNNs) with gradient-based or gradient-free classical optimization is severely impacted by barren plateaus in the cost landscape. In this paper, we devise a framework that leverages quantum optimization algorithms to find optimal QNN parameters for certain tasks. To cast QNN training as a quantum optimization problem, the QNN parameters are quantized: promoted from classical values to states stored in quantum registers separate from those on which the QNN performs its computation. We then coherently encode the QNN cost function onto the relative phases of a superposition state in the Hilbert space of the QNN parameters. The parameters are tuned through an iterative quantum optimization procedure using adaptively selected Hamiltonians. The quantum mechanism of this framework exploits hidden structure in the QNN optimization problem and is therefore expected to provide a beyond-Grover speedup, mitigating the barren plateau issue.
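The phase-encoding-plus-amplification idea can be illustrated with a toy minimum-finding routine over a discretized parameter register. The sketch below is a classical statevector simulation in NumPy, not the paper's method: the stand-in cost function, the 3-qubit grid size, and the fixed threshold are all illustrative assumptions, and a plain Grover-style phase oracle with standard diffusion stands in for the adaptively selected Hamiltonians of the framework.

```python
import numpy as np

# Toy setup: a 3-qubit register holds 8 discretized values of one QNN parameter.
N = 8
thetas = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)

# Hypothetical scalar cost, a stand-in for the QNN cost landscape.
cost = 1.0 - np.cos(thetas - 1.0)

# Oracle: flip the phase of parameter values whose cost beats the current
# threshold (here chosen so exactly one grid point is marked).
threshold = 0.1
marked = cost < threshold

# Uniform superposition over the parameter register.
state = np.full(N, 1.0 / np.sqrt(N), dtype=complex)

# Grover-style amplification: ~ (pi/4) * sqrt(N) iterations for one marked state.
for _ in range(2):
    state[marked] *= -1.0                 # cost-conditioned phase flip
    state = 2.0 * state.mean() - state    # diffusion (inversion about the mean)

probs = np.abs(state) ** 2
best = int(np.argmax(probs))              # most probable parameter value
```

In a Durr-Hoyer-style minimum-finding loop, the threshold would be updated from measurement outcomes and the amplification repeated; the paper's adaptive Hamiltonians aim to exploit problem structure beyond this unstructured-search baseline.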
Liao, Y., Hsieh, M. H., & Ferrie, C. (2024). Quantum optimization for training quantum neural networks. Quantum Machine Intelligence, 6(1). https://doi.org/10.1007/s42484-024-00169-w