Parallel matters: Efficient polyp segmentation with parallel structured feature augmentation modules

Bibliographic Details
Title: Parallel matters: Efficient polyp segmentation with parallel structured feature augmentation modules
Authors: Guo, Qingqing, Fang, Xianyong, Wang, Kaibing, Shi, Yuqing, Wang, Linbo, Zhang, Enming, Liu, Zhengyi
Contributors: Lund University, Profile areas and other strong research environments, LU Profile Area: Light and Materials (Originator); Lund University, Faculty of Engineering (LTH), LTH Profile Area: Nanoscience and Semiconductor Technology (Originator); Lund University, Strategic research areas (SRA), NanoLund: Centre for Nanoscience (Originator); Lund University, Faculty of Medicine, Department of Clinical Sciences, Malmö, Diabetes - Islet Pathophysiology (Originator)
Source: IET Image Processing. 17(8):2503-2515
Subject Terms: Natural Sciences, Computer and Information Sciences, Computer graphics and computer vision, Bioinformatics (Computational Biology)
Description: The large variations in polyp sizes and shapes, and the close resemblance of polyps to their surroundings, call for features with long-range information at rich scales and strong discrimination. This article proposes two parallel structured modules for building such features. One is the Transformer Inception (TI) module, which applies Transformers with different receptive fields in parallel to the input features, enriching them with more long-range information at more scales. The other is the Local-Detail Augmentation (LDA) module, which applies spatial and channel attention in parallel to each block, locally augmenting the features along two complementary dimensions for more object details. Integrating TI and LDA, a new Transformer-encoder-based framework, the Parallel-Enhanced Network (PENet), is proposed, in which LDA is applied twice in a coarse-to-fine manner for accurate prediction. PENet efficiently segments polyps of different sizes and shapes without interference from the background tissue. Experimental comparisons with state-of-the-art methods show its merits.
Access URL: https://doi.org/10.1049/ipr2.12813
Database: SwePub
ISSN: 1751-9659, 1751-9667
DOI: 10.1049/ipr2.12813
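The record's abstract describes the LDA module as applying spatial and channel attention in parallel (rather than sequentially) and fusing the two branches. The paper's exact formulation is not given here; the following is a minimal NumPy sketch of that parallel-attention idea, using SE-style global-average-pool channel gating and channel-pooled spatial gating with sum fusion as assumptions.

```python
import numpy as np

def _sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def channel_attention(x):
    # x: (C, H, W). Squeeze spatial dims, then gate each channel
    # (SE-style gate, assumed here for illustration).
    w = _sigmoid(x.mean(axis=(1, 2)))        # (C,)
    return x * w[:, None, None]

def spatial_attention(x):
    # Pool across channels, then gate each spatial location.
    m = _sigmoid(x.mean(axis=0))             # (H, W)
    return x * m[None, :, :]

def local_detail_augmentation(x):
    # The key point from the abstract: the two attentions run in
    # PARALLEL on the same input and are fused, so each branch sees
    # the unmodified features. Sum fusion is an assumption.
    return channel_attention(x) + spatial_attention(x)

# Usage on a dummy feature map
feat = np.random.rand(8, 16, 16).astype(np.float32)
out = local_detail_augmentation(feat)
assert out.shape == feat.shape
```

A sequential design (channel then spatial) would let the first gate distort what the second branch sees; running them in parallel keeps the two augmentations complementary, which matches the abstract's phrasing of augmenting "from two complementary dimensions".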