Abstract
A preconditioned conjugate gradient method was implemented into an iteration-on-data program for estimation of breeding values, and its convergence characteristics were studied. An algorithm was used as a reference in which one fixed effect was solved by the Gauss-Seidel method and the other effects were solved by a second-order Jacobi method. Implementation of the preconditioned conjugate gradient required storing four vectors (each of size equal to the number of unknowns in the mixed model equations) in random access memory and reading the data at each round of iteration. The preconditioner comprised diagonal blocks of the coefficient matrix. Comparison of the algorithms was based on solutions of mixed model equations obtained with a single-trait animal model and a single-trait, random regression test-day model. Data sets for both models used milk yield records of primiparous Finnish dairy cows. The animal model data comprised 665,629 lactation milk yields, and the random regression test-day model data comprised 6,732,765 test-day milk yields. Both models included pedigree information on 1,099,622 animals. The animal model {random regression test-day model} required 122 {305} rounds of iteration to converge with the reference algorithm, but only 88 {149} were required with the preconditioned conjugate gradient. Solving the random regression test-day model with the preconditioned conjugate gradient required 237 megabytes of random access memory and took 14% of the computation time needed by the reference algorithm.
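The following is a minimal Python/NumPy sketch of a preconditioned conjugate gradient solver with a block-diagonal preconditioner, for readers unfamiliar with the method. It assumes a generic symmetric positive (semi)definite system held as a dense matrix and a hypothetical uniform block size; it is an illustration only, not the authors' iteration-on-data implementation, which never forms the coefficient matrix explicitly and keeps only a few work vectors of the size of the mixed model equations in memory.

    import numpy as np

    def block_diag_preconditioner(A, block_size):
        """Precompute inverses of the diagonal blocks of A; return a function applying M^{-1}."""
        n = A.shape[0]
        starts = list(range(0, n, block_size))
        inv_blocks = [np.linalg.inv(A[s:min(s + block_size, n), s:min(s + block_size, n)])
                      for s in starts]
        def apply(r):
            z = np.empty_like(r)
            for inv_b, s in zip(inv_blocks, starts):
                e = min(s + block_size, n)
                z[s:e] = inv_b @ r[s:e]
            return z
        return apply

    def pcg(A, b, block_size=1, tol=1e-8, max_iter=1000):
        """Solve A x = b for a symmetric positive (semi)definite A by preconditioned
        conjugate gradient. Work storage is a handful of length-n vectors, analogous
        to the four-vector memory footprint described in the abstract."""
        apply_Minv = block_diag_preconditioner(A, block_size)
        x = np.zeros_like(b)
        r = b - A @ x                 # residual
        z = apply_Minv(r)             # preconditioned residual
        p = z.copy()                  # search direction
        rz = r @ z
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rz / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            if np.linalg.norm(r) < tol * np.linalg.norm(b):
                break
            z = apply_Minv(r)
            rz_new = r @ z
            p = z + (rz_new / rz) * p  # update search direction
            rz = rz_new
        return x

In the paper's setting, the matrix-vector product A @ p would instead be accumulated by reading the data and pedigree files at each round (iteration on data), so the coefficient matrix never needs to be stored.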
Lidauer, M., Strandén, I., Mäntysaari, E. A., Pösö, J., & Kettunen, A. (1999). Solving large test-day models by iteration on data and preconditioned conjugate gradient. Journal of Dairy Science, 82(12), 2788–2796. https://doi.org/10.3168/jds.S0022-0302(99)75536-0