AutoHOOT: Automatic high-order optimization for tensors

Abstract

High-order optimization methods, including Newton's method and its variants as well as alternating minimization methods, dominate the optimization algorithms for tensor decompositions and tensor networks. These tensor methods are used for data analysis and simulation of quantum systems. In this work, we introduce AutoHOOT, the first automatic differentiation (AD) framework targeting high-order optimization for tensor computations. AutoHOOT takes input tensor computation expressions and generates optimized derivative expressions. In particular, AutoHOOT contains a new explicit Jacobian / Hessian expression generation kernel whose outputs maintain the input tensors' granularity and are easy to optimize. The expressions are then optimized by both traditional compiler optimization techniques and specific tensor algebra transformations. Experimental results show that AutoHOOT achieves competitive CPU and GPU performance for both tensor decomposition and tensor network applications compared to existing AD software and other tensor computation libraries with manually written kernels. The tensor methods generated by AutoHOOT also parallelize well, and we demonstrate good scalability on a distributed-memory supercomputer.
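The workload the abstract describes can be made concrete with a small example. The sketch below does not use AutoHOOT's own API; it is a minimal illustration in JAX of the derivative objects such a framework must produce for one representative application, a rank-R CP tensor decomposition: per-factor gradients for alternating minimization and Hessian-vector products for Newton-type steps. The names (cp_loss, hvp) and the problem dimensions are illustrative assumptions, not taken from the paper.

# Illustrative sketch only: JAX stands in for AutoHOOT here, to show the
# kind of derivative expressions an AD framework must generate for a
# high-order tensor method (the CP decomposition least-squares objective).
import jax
import jax.numpy as jnp

def cp_loss(factors, T):
    """Squared Frobenius error of a rank-R CP model [[A, B, C]] against T."""
    A, B, C = factors
    # Reconstruct the order-3 tensor from its factor matrices.
    T_hat = jnp.einsum('ir,jr,kr->ijk', A, B, C)
    return 0.5 * jnp.sum((T_hat - T) ** 2)

key = jax.random.PRNGKey(0)
I, J, K, R = 8, 9, 10, 4  # illustrative sizes
T = jax.random.normal(key, (I, J, K))
factors = tuple(jax.random.normal(jax.random.fold_in(key, i), (n, R))
                for i, n in enumerate((I, J, K)))

# First-order information: gradients with respect to each factor matrix,
# as used by alternating minimization / gradient-based updates.
grads = jax.grad(cp_loss)(factors, T)

# Second-order information for Newton-type steps: a Hessian-vector product,
# obtained by differentiating the gradient (forward-over-reverse AD).
def hvp(factors, vecs, T):
    return jax.jvp(lambda f: jax.grad(cp_loss)(f, T), (factors,), (vecs,))[1]

vecs = tuple(jnp.ones_like(f) for f in factors)
print([g.shape for g in grads], [h.shape for h in hvp(factors, vecs, T)])

Where a general-purpose AD tool such as the one above evaluates these derivatives numerically by tracing, AutoHOOT (per the abstract) emits explicit Jacobian/Hessian expressions at the granularity of the input tensors and then optimizes them with compiler techniques and tensor algebra transformations.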

Citation (APA)

Ma, L., Ye, J., & Solomonik, E. (2020). AutoHOOT: Automatic high-order optimization for tensors. In Proceedings of the ACM International Conference on Parallel Architectures and Compilation Techniques (PACT '20) (pp. 125–137). ACM. https://doi.org/10.1145/3410463.3414647
