Sparse Online Learning With Kernels Using Random Features for Estimating Nonlinear Dynamic Graphs

Abstract

Online topology estimation of graph-connected time series is challenging in practice, particularly because the dependencies between the time series in many real-world scenarios are nonlinear. To address this challenge, we introduce a novel kernel-based algorithm for online graph topology estimation. Our proposed algorithm also performs a Fourier-based random feature approximation to tackle the curse of dimensionality associated with kernel representations. Exploiting the fact that real-world networks often exhibit sparse topologies, we propose a group-Lasso based optimization framework, which is solved using an iterative composite objective mirror descent method, yielding an online algorithm with fixed computational complexity per iteration. We provide theoretical guarantees for our algorithm and prove that it can achieve sublinear dynamic regret under certain reasonable assumptions. In experiments conducted on both real and synthetic data, our method outperforms existing state-of-the-art competitors.
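The following is a minimal, illustrative sketch (not the authors' implementation) of the two ideas named in the abstract: approximating a kernel with random Fourier features, and enforcing a sparse graph through a group-Lasso block soft-thresholding step applied after each online gradient update. A plain proximal-gradient step stands in for the paper's composite objective mirror descent update, and all names, dimensions, and constants below are assumptions chosen for the example.

```python
# Hedged sketch: online nonlinear topology estimation with random Fourier
# features (RFF) and a group-Lasso proximal step. Names and parameters are
# illustrative assumptions, not the paper's exact algorithm.
import numpy as np

rng = np.random.default_rng(0)

N = 5        # number of nodes / time series (assumed)
D = 50       # number of random features per node (assumed)
sigma = 1.0  # bandwidth of the Gaussian kernel being approximated
lam = 0.1    # group-Lasso regularization weight (assumed)
eta = 0.05   # step size (assumed)

# Random Fourier features approximating a Gaussian kernel:
# k(x, x') ~= z(x)^T z(x'), with z(x) = sqrt(2/D) * cos(w*x + b).
W = rng.normal(scale=1.0 / sigma, size=D)
b = rng.uniform(0.0, 2.0 * np.pi, size=D)

def rff(x):
    """Map a scalar sample of one time series to its D-dim feature vector."""
    return np.sqrt(2.0 / D) * np.cos(W * x + b)

# alpha[n] holds, for target node n, one D-dim coefficient block per source node.
alpha = np.zeros((N, N, D))

def online_step(y_prev, y_now):
    """One online update: predict each node from the others' random features,
    take a gradient step on the squared error, then apply block (group)
    soft-thresholding, which drives whole blocks (i.e., edges) to zero."""
    Z = np.stack([rff(y_prev[m]) for m in range(N)])   # (N, D) feature blocks
    for n in range(N):
        pred = np.sum(alpha[n] * Z)                    # linear-in-features prediction
        grad = (pred - y_now[n]) * Z                   # gradient w.r.t. each block
        alpha[n] -= eta * grad
        # group-Lasso proximal step (block soft-thresholding)
        norms = np.linalg.norm(alpha[n], axis=1, keepdims=True)
        shrink = np.maximum(0.0, 1.0 - eta * lam / np.maximum(norms, 1e-12))
        alpha[n] *= shrink

# Toy usage on synthetic data: an edge (m -> n) is read off from ||alpha[n, m]|| > 0.
y = rng.normal(size=N)
for _ in range(200):
    y_new = np.tanh(y[::-1]) + 0.1 * rng.normal(size=N)  # arbitrary nonlinear dynamics
    online_step(y, y_new)
    y = y_new

adjacency = np.linalg.norm(alpha, axis=2) > 1e-3
print(adjacency.astype(int))
```

Because the feature dimension D is fixed, each update touches only N x N x D coefficients, which mirrors the fixed per-iteration computational cost claimed in the abstract; the block soft-thresholding is what yields sparsity at the level of whole edges rather than individual coefficients.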

Citation (APA)
Money, R. T., Krishnan, J. P., & Beferull-Lozano, B. (2023). Sparse Online Learning With Kernels Using Random Features for Estimating Nonlinear Dynamic Graphs. IEEE Transactions on Signal Processing, 71, 2027–2042. https://doi.org/10.1109/TSP.2023.3282068
