Fast and Robust Symmetric Image Registration Based on Distances Combining Intensity and Spatial Information

Bibliographic Details
Published in: IEEE Transactions on Image Processing, Vol. 28, No. 7, pp. 3584–3597
Main authors: Öfverstedt, Johan; Lindblad, Joakim; Sladoje, Nataša
Format: Journal Article
Language: English
Published: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), United States, 01.07.2019
ISSN: 1057-7149, 1941-0042
Online access: Full text
Description
Abstract: Intensity-based image registration approaches rely on similarity measures to guide the search for geometric correspondences with high affinity between images. The properties of the measures used are vital for the robustness and accuracy of the registration. In this paper, a symmetric, intensity interpolation-free, affine registration framework based on a combination of intensity and spatial information is proposed. The excellent performance of the framework is demonstrated on a combination of synthetic tests, recovering known transformations in the presence of noise, and on real applications in biomedical and medical image registration, for both 2D and 3D images. The method exhibits greater robustness and higher accuracy than similarity measures in common use when inserted into a standard gradient-based registration framework available as part of the open-source Insight Segmentation and Registration Toolkit (ITK). The method is also empirically shown to have a low computational cost, making it practical for real applications. The source code is available.
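The abstract describes the distance measure only at a high level. As a purely illustrative sketch, and not the paper's exact formulation, the following Python snippet shows one way intensity and spatial information can be combined into a symmetric distance: intensities are thresholded into level sets, a Euclidean distance transform supplies the spatial term, and the result is averaged over both registration directions to obtain symmetry. The function names, the level-set construction, and the choice of eight levels are all assumptions made for illustration.

```python
# Illustrative sketch only: combines intensity (via level sets) and spatial
# information (via distance transforms) into a symmetric image distance.
# This is NOT the paper's exact measure; names and parameters are hypothetical.
import numpy as np
from scipy.ndimage import distance_transform_edt

def level_set_distance(src, dst, levels=8):
    """One direction: for each intensity threshold, average the spatial
    distance from the level set of `src` to the matching level set of `dst`."""
    total = 0.0
    for t in np.linspace(src.min(), src.max(), levels, endpoint=False):
        src_cut = src >= t              # level set of the source image
        dst_cut = dst >= t              # matching level set of the target
        if not dst_cut.any() or not src_cut.any():
            continue
        # Distance from each pixel to the nearest pixel inside dst_cut
        # (pixels inside dst_cut get distance 0).
        dt = distance_transform_edt(~dst_cut)
        total += dt[src_cut].mean()
    return total / levels

def symmetric_distance(a, b):
    """Average both directions so that registering A to B and B to A
    optimize the same objective."""
    return 0.5 * (level_set_distance(a, b) + level_set_distance(b, a))
```

In a full registration loop, symmetric_distance(a, warp(b, params)) would serve as the objective for an affine-parameter optimizer, for example gradient descent with finite-difference gradients. The paper itself derives the measure and its gradients so that intensity interpolation is avoided, which this sketch does not attempt.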
DOI: 10.1109/TIP.2019.2899947