Large scale constrained linear regression revisited: Faster algorithms via preconditioning

Citations: 6 · Mendeley readers: 21

Abstract

In this paper, we revisit the large-scale constrained linear regression problem and propose faster methods based on recent developments in sketching and optimization. Our algorithms combine (accelerated) mini-batch SGD with a new method called two-step preconditioning to achieve an approximate solution with a time complexity lower than that of the state-of-the-art techniques for the low-precision case. Our idea can also be extended to the high-precision case, which gives an alternative implementation of the Iterative Hessian Sketch (IHS) method with significantly improved time complexity. Experiments on benchmark and synthetic datasets suggest that our methods considerably outperform existing ones in both the low- and high-precision cases.
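The core idea behind sketch-based preconditioning can be illustrated with a minimal example. The sketch below is not the paper's two-step preconditioning or its accelerated mini-batch SGD; it is a simplified stand-in that assumes a Gaussian sketch, an unconstrained least-squares objective, and plain gradient descent, with all names and parameters chosen for illustration only.

```python
import numpy as np

def sketch_preconditioned_lstsq(A, b, sketch_factor=8, step=0.25,
                                n_iters=100, seed=None):
    """Illustrative sketch-and-precondition solver for min_x ||Ax - b||_2.

    Simplified stand-in for the paper's approach: a single Gaussian
    sketch builds the preconditioner, and plain gradient descent replaces
    (accelerated) mini-batch SGD. Constraints are omitted for brevity.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    m = sketch_factor * d                 # sketch size, m << n for tall A

    # Preconditioning: factor the sketched matrix SA = QR. With high
    # probability, A @ inv(R) has all singular values close to 1, so the
    # preconditioned problem is well conditioned.
    S = rng.standard_normal((m, n)) / np.sqrt(m)
    _, R = np.linalg.qr(S @ A)            # R is d x d, assumed invertible

    # Gradient descent on f(y) = 0.5 * ||A inv(R) y - b||^2, with x = inv(R) y.
    # A constant step suffices because the preconditioned Hessian has an
    # O(1) spectrum.
    y = np.zeros(d)
    for _ in range(n_iters):
        x = np.linalg.solve(R, y)         # map back to original variables
        grad_x = A.T @ (A @ x - b)        # gradient in x-space
        y -= step * np.linalg.solve(R.T, grad_x)  # chain rule: grad_y = R^{-T} grad_x
    return np.linalg.solve(R, y)
```

On a tall random problem (e.g., n = 100,000 rows, d = 50 columns), this recovers the least-squares solution to several digits in about 100 cheap iterations. The paper's actual algorithms replace the full gradient with (accelerated) mini-batch SGD steps and additionally handle the constraint set, which is where the stated complexity improvements come from.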

Citation (APA)

Wang, D., & Xu, J. (2018). Large scale constrained linear regression revisited: Faster algorithms via preconditioning. In 32nd AAAI Conference on Artificial Intelligence, AAAI 2018 (pp. 1439–1446). AAAI Press. https://doi.org/10.1609/aaai.v32i1.11522
