Abstract
Linear regression models are widely used in statistics, machine learning and system identification. They make it possible to address many important problems, are easy to fit and enjoy simple analytical properties. The simplest method for fitting linear regression models is least squares, whose systematic treatment is available in many textbooks, e.g., [35, Chap. 4], [12]. Linear regression models can also be fitted in other ways, and the class of methods considered in this chapter is the so-called regularized least squares. It extends least squares by minimizing the sum of the squared loss function and a regularization term. The latter can take various forms, leading to several variants that have been studied extensively in theory and applied widely in practice. In this chapter, we will focus on these methods and introduce their fundamentals. In the first part of the appendix to this chapter, we also report some basic results of linear algebra that may be useful to the reader.
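As a rough sketch of the criteria the abstract refers to (the notation below is illustrative and not necessarily that used in the chapter): with output vector Y, regression matrix \Phi and parameter vector \theta, least squares and regularized least squares solve

\[
\hat{\theta}^{\mathrm{LS}} = \arg\min_{\theta}\, \lVert Y - \Phi\theta \rVert^{2},
\qquad
\hat{\theta}^{\mathrm{R}} = \arg\min_{\theta}\, \lVert Y - \Phi\theta \rVert^{2} + \gamma\, J(\theta),
\]

where \gamma > 0 weights the regularization term J(\theta). For instance, the ridge choice J(\theta) = \lVert \theta \rVert^{2} admits the closed-form solution \(\hat{\theta}^{\mathrm{R}} = (\Phi^{\top}\Phi + \gamma I)^{-1}\Phi^{\top} Y\), which reduces to ordinary least squares as \gamma \to 0.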