High-order optimization methods, including Newton's method and its variants as well as alternating minimization methods, dominate the optimization algorithms for tensor decompositions and tensor networks. These tensor methods are used for data analysis and simulation of quantum systems. In this work, we introduce AutoHOOT, the first automatic differentiation (AD) framework targeting high-order optimization for tensor computations. AutoHOOT takes input tensor computation expressions and generates optimized derivative expressions. In particular, AutoHOOT contains a new explicit Jacobian / Hessian expression generation kernel whose outputs maintain the input tensors' granularity and are easy to optimize. The expressions are then optimized by both traditional compiler optimization techniques and specific tensor algebra transformations. Experimental results show that AutoHOOT achieves competitive CPU and GPU performance for both tensor decomposition and tensor network applications compared to existing AD software and other tensor computation libraries with manually written kernels. The tensor methods generated by AutoHOOT are also well-parallelizable, and we demonstrate good scalability on a distributed memory supercomputer.
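To make the setting concrete, here is a minimal sketch (in JAX, not AutoHOOT's own API) of the kind of derivative computation such a framework automates: differentiating a CP decomposition objective to obtain the gradient and a Hessian-vector product, the building blocks of the Newton-type and alternating methods the abstract refers to. The function name `cp_loss` and the tensor dimensions are hypothetical.

```python
import jax
import jax.numpy as jnp

def cp_loss(factors, T):
    """Squared Frobenius loss ||T - [[A, B, C]]||^2 for a 3rd-order tensor."""
    A, B, C = factors
    # Reconstruct the tensor from the rank-R factor matrices.
    T_hat = jnp.einsum('ir,jr,kr->ijk', A, B, C)
    return jnp.sum((T - T_hat) ** 2)

# Hypothetical sizes for illustration only.
I, J, K, R = 8, 9, 10, 4
kA, kB, kC, kT = jax.random.split(jax.random.PRNGKey(0), 4)
factors = (jax.random.normal(kA, (I, R)),
           jax.random.normal(kB, (J, R)),
           jax.random.normal(kC, (K, R)))
T = jax.random.normal(kT, (I, J, K))

# First-order information: gradients with respect to all factor matrices.
grads = jax.grad(cp_loss)(factors, T)

# Second-order information: a Hessian-vector product via forward-over-reverse
# differentiation, as used by Newton-type optimization methods.
hvp = jax.jvp(lambda f: jax.grad(cp_loss)(f, T), (factors,), (grads,))[1]
```

Whereas a tracing AD system like the one above differentiates numerically evaluated programs, the abstract's point is that AutoHOOT generates and optimizes explicit symbolic derivative expressions at the granularity of the input tensors, so they can be further rewritten by compiler and tensor algebra transformations.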
CITATION STYLE
Ma, L., Ye, J., & Solomonik, E. (2020). AutoHOOT: Automatic high-order optimization for tensors. In Parallel Architectures and Compilation Techniques - Conference Proceedings, PACT (pp. 125–137). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1145/3410463.3414647