Interpolation revisited

Abstract

Based on the theory of approximation, this paper presents a unified analysis of interpolation and resampling techniques. An important issue is the choice of adequate basis functions. We show that, contrary to common belief, those that perform best are not interpolating. In contrast with traditional interpolation, we call their use generalized interpolation; they involve a prefiltering step when correctly applied. We explain why the approximation order inherent in any basis function is important to limit interpolation artifacts. The decomposition theorem states that any basis function endowed with approximation order can be expressed as the convolution of a B-spline of the same order with another function that has none. This motivates the use of splines and spline-based functions as a tunable way to keep artifacts in check without any significant cost penalty. We discuss implementation and performance issues, and we provide experimental evidence to support our claims.
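As a concrete illustration of the two-step scheme mentioned in the abstract (prefiltering followed by reconstruction with a non-interpolating basis function), the short Python sketch below performs cubic B-spline generalized interpolation of a 1-D signal. It uses SciPy's ndimage routines rather than the authors' own implementation, and the test signal, grid size, and sampling positions are arbitrary choices made only for the example.

import numpy as np
from scipy import ndimage

# Samples of a test signal on an integer grid (arbitrary example data).
x = np.arange(32)
samples = np.sin(2 * np.pi * x / 32)

# Step 1 -- prefiltering: convert the samples into cubic B-spline
# coefficients, so that the continuous spline model reproduces the
# samples exactly at the grid points.
coeffs = ndimage.spline_filter1d(samples, order=3)

# Step 2 -- reconstruction: evaluate the spline model at arbitrary
# positions by weighting the coefficients with shifted cubic B-splines.
# prefilter=False because the coefficients were already prefiltered above.
new_x = np.linspace(0.0, 31.0, 200)
resampled = ndimage.map_coordinates(coeffs, new_x[np.newaxis, :],
                                    order=3, prefilter=False)

Here order=3 selects the cubic B-spline; in the spirit of the paper, raising the order trades a slightly more expensive prefilter for a higher approximation order and hence fewer interpolation artifacts.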

Citation (APA)

Thévenaz, P., Blu, T., & Unser, M. (2000). Interpolation revisited. IEEE Transactions on Medical Imaging, 19(7), 739–758. https://doi.org/10.1109/42.875199
