Fast and Robust Symmetric Image Registration Based on Distances Combining Intensity and Spatial Information

Published in: IEEE Transactions on Image Processing, vol. 28, no. 7, pp. 3584-3597
Main authors: Ofverstedt, Johan; Lindblad, Joakim; Sladoje, Natasa
Format: Journal Article
Language: English
Published: United States: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.07.2019
ISSN: 1057-7149 (print); 1941-0042 (electronic)
Description
Summary: Intensity-based image registration approaches rely on similarity measures to guide the search for geometric correspondences with high affinity between images. The properties of the measures used are vital for the robustness and accuracy of the registration. In this paper, a symmetric, intensity interpolation-free, affine registration framework based on a combination of intensity and spatial information is proposed. The excellent performance of the framework is demonstrated on a combination of synthetic tests, recovering known transformations in the presence of noise, and real applications in biomedical and medical image registration, for both 2D and 3D images. The method exhibits greater robustness and higher accuracy than similarity measures in common use, when inserted into a standard gradient-based registration framework available as part of the open-source Insight Segmentation and Registration Toolkit (ITK). The method is also empirically shown to have a low computational cost, making it practical for real applications. The source code is available.
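
The summary above refers to evaluating similarity measures inside the standard gradient-based affine registration pipeline of the Insight Toolkit. As an illustration only, the following is a minimal sketch of such a pipeline in Python using SimpleITK (the Python binding of ITK). The paper's proposed distance measure combining intensity and spatial information is not part of SimpleITK, so Mattes mutual information stands in as the metric here, and the file names are hypothetical placeholders.

    # Illustrative sketch of a gradient-based affine registration pipeline
    # in SimpleITK. NOTE: this uses Mattes mutual information as a stand-in
    # metric, not the paper's proposed intensity+spatial distance measure;
    # the file names are hypothetical placeholders.
    import SimpleITK as sitk

    fixed = sitk.ReadImage("fixed.nii.gz", sitk.sitkFloat32)    # reference image
    moving = sitk.ReadImage("moving.nii.gz", sitk.sitkFloat32)  # image to align

    registration = sitk.ImageRegistrationMethod()
    registration.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
    registration.SetOptimizerAsRegularStepGradientDescent(
        learningRate=1.0, minStep=1e-4, numberOfIterations=200)
    registration.SetOptimizerScalesFromPhysicalShift()
    registration.SetInterpolator(sitk.sitkLinear)

    # Initialize the affine transform at the geometric centers of the images.
    initial = sitk.CenteredTransformInitializer(
        fixed, moving, sitk.AffineTransform(fixed.GetDimension()),
        sitk.CenteredTransformInitializerFilter.GEOMETRY)
    registration.SetInitialTransform(initial, inPlace=False)

    # Optimize the affine parameters, then resample the moving image
    # into the fixed image's coordinate frame with the recovered transform.
    transform = registration.Execute(fixed, moving)
    aligned = sitk.Resample(moving, fixed, transform, sitk.sitkLinear, 0.0)
    sitk.WriteImage(aligned, "aligned.nii.gz")

The same pipeline works unchanged for 2D and 3D inputs, since the transform dimension is taken from the fixed image.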
DOI: 10.1109/TIP.2019.2899947