Connecting weighted automata, tensor networks and recurrent neural networks through spectral learning

Abstract

In this paper, we present connections between three models used in different research fields: weighted finite automata (WFA) from formal languages and linguistics, recurrent neural networks used in machine learning, and tensor networks, which encompass a set of optimization techniques for high-order tensors used in quantum physics and numerical analysis. We first present an intrinsic relation between WFA and the tensor train decomposition, a particular form of tensor network. This relation allows us to exhibit a novel low-rank structure of the Hankel matrix of a function computed by a WFA and to design an efficient spectral learning algorithm leveraging this structure to scale up to very large Hankel matrices. We then unravel a fundamental connection between WFA and second-order recurrent neural networks (2-RNN): in the case of sequences of discrete symbols, WFA and 2-RNN with linear activation functions are expressively equivalent. Leveraging this equivalence result combined with the classical spectral learning algorithm for weighted automata, we introduce the first provable learning algorithm for linear 2-RNN defined over sequences of continuous input vectors. This algorithm relies on estimating low-rank sub-blocks of the Hankel tensor, from which the parameters of a linear 2-RNN can be provably recovered. The performance of the proposed learning algorithm is assessed in a simulation study on both synthetic and real-world data.
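The abstract's central computational step is the classical spectral learning algorithm for WFAs, which recovers the automaton's parameters from a low-rank factorization of the Hankel matrix. The snippet below is a minimal sketch of that classical recovery step, not the authors' implementation; the function names, the dense Hankel sub-block inputs, and the evaluation helper are assumptions made for illustration only.

```python
# Minimal sketch of spectral learning of a WFA from Hankel sub-blocks
# (the classical algorithm the paper builds on). All names and the
# dense-matrix inputs are illustrative assumptions, not the authors' code.
import numpy as np

def spectral_wfa(H, H_sigmas, h_P, h_S, rank):
    """Recover WFA parameters (alpha, {A_sigma}, omega) of a given rank.

    H        : (|P|, |S|) Hankel block, H[p, s] = f(prefix_p + suffix_s)
    H_sigmas : dict mapping each symbol sigma to the shifted block
               H_sigma[p, s] = f(prefix_p + sigma + suffix_s)
    h_P      : (|P|,) column of H indexed by the empty suffix, f(prefix_p)
    h_S      : (|S|,) row of H indexed by the empty prefix, f(suffix_s)
    """
    # Rank-truncated SVD gives the low-rank factorization H ~ (U D) V^T.
    U, D, Vt = np.linalg.svd(H, full_matrices=False)
    U, D, Vt = U[:, :rank], D[:rank], Vt[:rank, :]

    alpha = Vt @ h_S                 # initial weights: V^T h_S
    omega = (U.T @ h_P) / D          # final weights:   D^{-1} U^T h_P
    A = {s: (U.T @ Hs @ Vt.T) / D[:, None]   # transitions: D^{-1} U^T H_sigma V
         for s, Hs in H_sigmas.items()}
    return alpha, A, omega

def evaluate(alpha, A, omega, word):
    """Compute f(x_1 ... x_k) = alpha^T A_{x_1} ... A_{x_k} omega.

    With one-hot inputs this is exactly the forward pass of a linear 2-RNN,
    which is the expressive equivalence the paper exploits.
    """
    h = alpha
    for sym in word:
        h = h @ A[sym]
    return h @ omega
```

The paper's contribution extends this recovery beyond discrete symbols: the Hankel sub-blocks become a Hankel tensor over continuous input vectors and are estimated from data, from which the linear 2-RNN parameters are provably recovered.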

Cite (APA)

Li, T., Precup, D., & Rabusseau, G. (2024). Connecting weighted automata, tensor networks and recurrent neural networks through spectral learning. Machine Learning, 113(5), 2619–2653. https://doi.org/10.1007/s10994-022-06164-1
