Sub-pixel multi-scale fusion network for medical image segmentation

Bibliographic Details
Published in:Multimedia tools and applications Vol. 83; no. 41; pp. 89355 - 89373
Main Authors: Li, Jing, Chen, Qiaohong, Fang, Xian
Format: Journal Article
Language:English
Published: New York: Springer US, 01.12.2024 (Springer Nature B.V.)
ISSN:1380-7501, 1573-7721
Description
Summary:CNNs and Transformers have significantly advanced the domain of medical image segmentation. Integrating their strengths enables rich feature extraction but also introduces the challenge of fusing mixed multi-scale features. To overcome this issue, we propose an innovative deep medical image segmentation framework termed Sub-pixel Multi-scale Fusion Network (SMFNet), which effectively incorporates sub-pixel multi-scale fusion of CNN and Transformer features into the architecture. In particular, our design consists of three effective and practical modules. First, we employ the Sub-pixel Convolutional Module to bring features extracted at multiple scales to a consistent resolution. Second, we develop the Three-level Enhancement Module to learn features from adjacent layers and exchange information between them. Finally, we leverage the Hierarchical Adaptive Gate to fuse information from the other contextual levels through the Sub-pixel Convolutional Module. Extensive experiments on the Synapse, ACDC, and ISIC 2018 datasets demonstrate the effectiveness of the proposed SMFNet, which outperforms competitive CNN-based and Transformer-based segmentation methods.
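The sub-pixel operation named in the abstract upsamples a feature map by rearranging channels into spatial positions rather than by interpolation. Below is a minimal plain-Python sketch of that rearrangement (pixel shuffle, in the sense of Shi et al.'s sub-pixel convolution); it is an illustration only, with made-up shapes, not the paper's actual Sub-pixel Convolutional Module, which wraps learned convolutions around this step.

```python
def pixel_shuffle(x, r):
    """Rearrange a (C*r*r, H, W) nested-list tensor into (C, H*r, W*r)."""
    c2, h, w = len(x), len(x[0]), len(x[0][0])
    assert c2 % (r * r) == 0, "channel count must be divisible by r^2"
    c = c2 // (r * r)
    out = [[[0] * (w * r) for _ in range(h * r)] for _ in range(c)]
    for ci in range(c):
        for i in range(r):          # sub-pixel row offset within each output cell
            for j in range(r):      # sub-pixel column offset
                for hh in range(h):
                    for ww in range(w):
                        out[ci][hh * r + i][ww * r + j] = \
                            x[ci * r * r + i * r + j][hh][ww]
    return out

# A 4-channel 2x2 map becomes a 1-channel 4x4 map (r = 2):
x = [[[0, 1], [2, 3]], [[4, 5], [6, 7]],
     [[8, 9], [10, 11]], [[12, 13], [14, 15]]]
y = pixel_shuffle(x, 2)
# y[0][0] == [0, 4, 1, 5]
```

Because the upsampling is a pure rearrangement, it adds no parameters of its own and avoids the blurring of bilinear interpolation, which is why it is attractive for aligning multi-scale features to one resolution before fusion.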
DOI:10.1007/s11042-024-20338-0