We introduce and analytically illustrate that the hitherto unexplored imaginary components of out-of-time-order correlators can provide unprecedented insight into the information-scrambling capacity of a graph neural network. We further demonstrate that these components can be related to conventional measures of correlation such as quantum mutual information, and we rigorously establish the upper and lower bounds jointly shared by these seemingly disparate quantities. To consolidate the geometrical ramifications of such bounds during the dynamical evolution of training, we then construct an emergent convex space. This space reveals several surprising features: the trained network saturates the lower bound even for physical systems of large size; spin correlations of the simulated physical system are transferred to, and quantitatively mirrored within, the latent subunits of the network across phase boundaries, even though those latent units have no direct access to the simulated system; and the network can distinguish exotic spin connectivity (volume law versus area law). Such an analysis demystifies the training of quantum machine learning models by unraveling how quantum information is scrambled through the network, surreptitiously building correlation among its constituent subsystems, and opens a window into the physical mechanism underlying the emulative ability of the model.
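For orientation, the two quantities whose bounds are compared in the abstract have standard definitions, reproduced in the LaTeX sketch below for a generic Hamiltonian H, generic operators W and V, and a generic bipartition A|B of the network; these symbols are illustrative assumptions, and the specific operators and the bounds derived in the paper are not reproduced here.

\begin{align}
  F(t) &= \big\langle \hat{W}^{\dagger}(t)\,\hat{V}^{\dagger}\,\hat{W}(t)\,\hat{V} \big\rangle ,
  & \hat{W}(t) &= e^{\mathrm{i}\hat{H}t}\,\hat{W}\,e^{-\mathrm{i}\hat{H}t}, \\
  \operatorname{Im} F(t) &= \tfrac{1}{2\mathrm{i}}\big[\,F(t) - F^{*}(t)\,\big], \\
  I(A\!:\!B) &= S(\rho_{A}) + S(\rho_{B}) - S(\rho_{AB}),
  & S(\rho) &= -\operatorname{Tr}\,\rho\ln\rho .
\end{align}

Here F(t) is the conventional out-of-time-order correlator, Im F(t) its imaginary component, and I(A:B) the quantum mutual information between subsystems A and B with von Neumann entropy S.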
Sajjan, M., Singh, V., Selvarajan, R., & Kais, S. (2023). Imaginary components of out-of-time-order correlator and information scrambling for navigating the learning landscape of a quantum machine learning model. Physical Review Research, 5(1), 013146. https://doi.org/10.1103/PhysRevResearch.5.013146