Divergence-based framework for diffusion tensor clustering, interpolation, and regularization

Abstract

This paper introduces a novel framework for diffusion tensor combination, which can be used for tensor averaging, clustering, interpolation, and regularization. The framework is based on the physical interpretation of the tensors as covariance matrices of Gaussian probability distributions. The symmetric Kullback-Leibler divergence provides a natural distance measure on these distributions, which leads to closed-form expressions both for the distance between any two diffusion tensors and for the weighted average of an arbitrary number of tensors. We illustrate the application of our technique in four scenarios: (a) combining tensor data from multiple subjects to generate population atlases from ten young and ten older subjects, (b) performing k-means clustering to obtain a compact Gaussian-mixture representation of multiple tensors, (c) interpolating between tensors, and (d) regularizing (i.e., smoothing) noisy tensor data. For boundary-preserving regularization, we also propose a non-linear two-stage smoothing algorithm that is loosely analogous to a median filter.
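
The two closed-form quantities the abstract refers to can be made concrete. For zero-mean Gaussians with covariance matrices A and B in d dimensions, the symmetric Kullback-Leibler divergence is J(A, B) = (1/4) tr(B^-1 A + A^-1 B) - d/2 (the log-determinant terms of the two directed divergences cancel), and the weighted mean M minimizing sum_i w_i J(M, T_i) solves the Riccati equation M B M = A, where A = sum_i w_i T_i and B = sum_i w_i inv(T_i), giving M = B^-1/2 (B^1/2 A B^1/2)^1/2 B^-1/2. The NumPy sketch below illustrates both quantities; it is a minimal reading of the abstract, not the authors' code, its function names are illustrative, and the paper's exact normalization of the distance may differ.

import numpy as np

def _spd_sqrt(M):
    # Symmetric square root of a symmetric positive-definite matrix,
    # computed via its eigendecomposition.
    w, V = np.linalg.eigh(M)
    return (V * np.sqrt(w)) @ V.T

def sym_kl(A, B):
    # Symmetric KL divergence between zero-mean Gaussians with
    # covariances A and B: (1/4) tr(B^-1 A + A^-1 B) - d/2.
    d = A.shape[0]
    return 0.25 * np.trace(np.linalg.solve(B, A) + np.linalg.solve(A, B)) - 0.5 * d

def weighted_mean(tensors, weights):
    # Closed-form minimizer of sum_i w_i * sym_kl(M, T_i):
    # solve M B M = A with A = sum w_i T_i, B = sum w_i inv(T_i).
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    A = sum(wi * T for wi, T in zip(w, tensors))
    B = sum(wi * np.linalg.inv(T) for wi, T in zip(w, tensors))
    Bh = _spd_sqrt(B)
    Bh_inv = np.linalg.inv(Bh)
    return Bh_inv @ _spd_sqrt(Bh @ A @ Bh) @ Bh_inv

# Example: the equal-weight mean of two commuting tensors reduces to
# their element-wise geometric mean, as expected for the symmetric KL.
T1 = np.diag([3.0, 1.0, 1.0])
T2 = np.diag([1.0, 3.0, 1.0])
M = weighted_mean([T1, T2], [0.5, 0.5])   # diag(sqrt(3), sqrt(3), 1)
t = 0.25                                   # interpolation as a weighted mean
Mt = weighted_mean([T1, T2], [1.0 - t, t])

These two primitives cover the remaining applications in the abstract: interpolation between two tensors is the weighted mean with weights (1 - t, t), and k-means clustering follows by assigning each tensor to its nearest center under sym_kl and updating each center with weighted_mean over its members.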

Citation (APA)

Rohlfing, T., Sullivan, E. V., & Pfefferbaum, A. (2007). Divergence-based framework for diffusion tensor clustering, interpolation, and regularization. In Lecture Notes in Computer Science (Vol. 4584, pp. 507–518). Springer-Verlag. https://doi.org/10.1007/978-3-540-73273-0_42
