Solving a least-squares problem with algorithmic differentiation and OpenMP

Abstract

Least-squares problems occur often in practice, for example, when a parametrized model is used to describe the behavior of a chemical, physical, or economic application. In this paper, we describe a method for solving least-squares problems that are given as a large system of equations. The approach combines commonly used least-squares methods with algorithmic differentiation and shared-memory multiprocessing. The system of equations contains model functions that are independent of each other; this independence enables a multiprocessing approach. With the help of algorithmic differentiation by source transformation, we obtain the derivative code of the residual function. The advantage of using source transformation is that the OpenMP pragmas of the input code can be transformed into corresponding counterparts in the derivative code, which is not straightforward, particularly in the adjoint case. We show the scaling properties of the derivative code and of the optimization process. © 2013 Springer-Verlag.
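To illustrate the structure the abstract describes, the following minimal C++/OpenMP sketch evaluates the objective and gradient of a least-squares problem whose residuals are mutually independent, so the loop over residuals can be shared among threads. It assumes a hypothetical exponential-decay model y(t) = a * exp(-b * t); the partial derivatives are written by hand purely as a stand-in for the source-transformed derivative code the paper generates, and the model, data, and function names are illustrative only, not taken from the paper.

// Minimal sketch (not the authors' code): OpenMP-parallel residual and
// gradient evaluation for a least-squares fit of the assumed model
// y(t) = a * exp(-b * t). The derivatives below are hand-written; the
// paper obtains such code via algorithmic differentiation by source
// transformation, including the transformed OpenMP pragmas.
#include <cmath>
#include <cstdio>
#include <vector>

// Gradient of F(a, b) = 1/2 * sum_i r_i(a, b)^2 with r_i = a*exp(-b*t_i) - y_i.
// Each residual depends only on (a, b) and its own data point, so the
// iterations are independent and can safely run in parallel.
void objective_and_gradient(double a, double b,
                            const std::vector<double>& t,
                            const std::vector<double>& y,
                            double& F, double& dF_da, double& dF_db) {
    F = dF_da = dF_db = 0.0;
    const int n = static_cast<int>(t.size());
#pragma omp parallel for reduction(+ : F, dF_da, dF_db)
    for (int i = 0; i < n; ++i) {
        const double e = std::exp(-b * t[i]);
        const double r = a * e - y[i];       // residual r_i
        F     += 0.5 * r * r;
        dF_da += r * e;                      // dr_i/da = exp(-b*t_i)
        dF_db += r * (-a * t[i] * e);        // dr_i/db = -a*t_i*exp(-b*t_i)
    }
}

int main() {
    // Synthetic, noiseless data generated with a = 2, b = 0.5 (demo values).
    std::vector<double> t, y;
    for (int i = 0; i < 1000; ++i) {
        t.push_back(0.01 * i);
        y.push_back(2.0 * std::exp(-0.5 * t.back()));
    }

    // Evaluate objective and hand-coded gradient at a trial point (1, 1).
    double F, ga, gb;
    objective_and_gradient(1.0, 1.0, t, y, F, ga, gb);
    std::printf("F = %.6e, dF/da = %.6e, dF/db = %.6e\n", F, ga, gb);

    // Cheap finite-difference check of dF/da to verify the derivative code.
    const double h = 1e-6;
    double Fp, d1, d2;
    objective_and_gradient(1.0 + h, 1.0, t, y, Fp, d1, d2);
    std::printf("FD  dF/da ~ %.6e\n", (Fp - F) / h);
    return 0;
}

Compiled with an OpenMP-enabled compiler (e.g. g++ -fopenmp), the reduction clause accumulates the thread-local partial sums; without OpenMP the pragma is ignored and the code runs serially, which mirrors how the independence of the model functions is exploited in the paper.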

Citation (APA)

Förster, M., & Naumann, U. (2013). Solving a least-squares problem with algorithmic differentiation and OpenMP. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8097 LNCS, pp. 763–774). https://doi.org/10.1007/978-3-642-40047-6_76
