Since the time of Gauss, it has been generally accepted that ℓ2-methods of combining observations by minimizing sums of squared errors have significant computational advantages over earlier ℓ1-methods based on minimization of absolute errors advocated by Boscovich, Laplace and others. However, ℓ1-methods are known to have significant robustness advantages over ℓ2-methods in many applications, and related quantile regression methods provide a useful, complementary approach to classical least-squares estimation of statistical models. Combining recent advances in interior point methods for solving linear programs with a new statistical preprocessing approach for ℓ1-type problems, we obtain a 10- to 100-fold improvement in computational speeds over current (simplex-based) ℓ1-algorithms in large problems, demonstrating that ℓ1-methods can be made competitive with ℓ2-methods in terms of computational speed throughout the entire range of problem sizes. Formal complexity results suggest that ℓ1-regression can be made faster than least-squares regression for n sufficiently large and p modest. © 1997 Applied Probability Trust.
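As the abstract notes, the ℓ1 (least absolute deviations) estimator can be cast as a linear program and handed to an interior point solver. The sketch below illustrates that standard LP formulation using SciPy's HiGHS interior-point method alongside an ordinary least-squares fit for comparison; it is a minimal illustration of the general technique, not the authors' preprocessing-accelerated algorithm, and the variable names and test data are invented for the example.

```python
import numpy as np
from scipy.optimize import linprog


def l1_regression(X, y):
    """Least-absolute-deviations fit via the standard LP formulation,
    solved with an interior-point method (SciPy's HiGHS IPM).

    Split each residual into nonnegative parts r = u - v, then
    minimize sum(u + v) subject to X @ beta + u - v = y.
    """
    n, p = X.shape
    # Decision variables: z = [beta (p, free), u (n, >= 0), v (n, >= 0)].
    c = np.concatenate([np.zeros(p), np.ones(2 * n)])    # objective: sum(u + v) = sum |r|
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])         # X beta + u - v = y
    bounds = [(None, None)] * p + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs-ipm")
    return res.x[:p]


# Compare the ell_1 fit with the ell_2 (least-squares) fit under
# heavy-tailed noise, where ell_1's robustness advantage shows up.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.normal(size=(200, 2))])
y = X @ np.array([1.0, 2.0, -0.5]) + rng.standard_t(df=2, size=200)
print("ell_1:", l1_regression(X, y))
print("ell_2:", np.linalg.lstsq(X, y, rcond=None)[0])
```

For large n, the interior-point route scales far better than classical simplex-based ℓ1 algorithms, which is the computational point of the paper; the statistical preprocessing step that yields the further speedup is not reproduced here.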
CITATION STYLE
Portnoy, S., & Koenker, R. (1997). The Gaussian hare and the Laplacian tortoise: Computability of squared-error versus absolute-error estimators. Statistical Science, 12(4), 279–296. https://doi.org/10.1214/ss/1030037960