Hierarchical multimodal image registration based on adaptive local mutual information


Abstract

We propose a new, adaptive local measure based on gradient orientation similarity for the purposes of multimodal image registration. We embed this metric into a hierarchical registration framework, where we show that registration robustness and accuracy can be improved by adapting both the similarity metric and the pixel selection strategy to the Gaussian blurring scale and to the modalities being registered. A computationally efficient estimation of gradient orientations is proposed based on patch-wise rigidity. We have applied our method to both rigid and non-rigid multimodal registration tasks with different modalities. Our approach outperforms mutual information (MI) and previously proposed local approximations of MI for multimodal (e.g. CT/MRI) brain image registration tasks. Furthermore, it shows significant improvements in terms of mean target registration error (mTRE) over standard methods in the highly challenging clinical context of registering pre-operative brain MRI to intra-operative ultrasound (US) images. © 2010 Springer-Verlag.
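To make the core idea concrete, the sketch below illustrates one common way a gradient-orientation-similarity score can be computed between two images. This is a hedged, minimal illustration, not the paper's exact metric (which is adaptive, local, and scale-dependent): it scores pixel pairs by the absolute cosine of the angle between their intensity gradients, so that parallel and anti-parallel edges, both of which commonly align across modalities such as CT and MRI, score highly.

```python
import numpy as np

def gradient_orientation_similarity(fixed, moving, eps=1e-6):
    """Illustrative gradient-orientation similarity (NOT the paper's metric).

    Scores overlapping pixels by |cos(angle)| between the two images'
    intensity gradients; flat regions carrying no edge information are
    masked out. Returns a value in [0, 1], higher meaning better aligned.
    """
    # np.gradient returns derivatives along (rows, cols) = (y, x)
    gy_f, gx_f = np.gradient(fixed.astype(float))
    gy_m, gx_m = np.gradient(moving.astype(float))

    dot = gx_f * gx_m + gy_f * gy_m            # unnormalized cosine
    norm = np.hypot(gx_f, gy_f) * np.hypot(gx_m, gy_m)

    mask = norm > eps                          # keep only informative pixels
    if not mask.any():
        return 0.0
    # |cos| makes the score invariant to contrast inversion between modalities
    return float(np.mean(np.abs(dot[mask]) / norm[mask]))
```

For example, an image and its intensity-inverted copy (a crude stand-in for a modality change) score 1.0, while two images with orthogonal edges score 0.0; mutual-information-style histogram measures require far more samples to reach a comparable discrimination locally, which is the motivation the abstract gives for a local orientation-based measure.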


APA

De Nigris, D., Mercier, L., Del Maestro, R., Louis Collins, D., & Arbel, T. (2010). Hierarchical multimodal image registration based on adaptive local mutual information. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6362 LNCS, pp. 643–651). https://doi.org/10.1007/978-3-642-15745-5_79
