Cascade neural network-based joint sampling and reconstruction for image compressed sensing


Bibliographic Details
Published in: Signal, Image and Video Processing, Vol. 16, No. 1, pp. 47-54
Main Authors: Zeng, Chunyan, Ye, Jiaxiang, Wang, Zhifeng, Zhao, Nan, Wu, Minghu
Format: Journal Article
Language: English
Published: London: Springer London, 01.02.2022 (Springer Nature B.V.)
ISSN: 1863-1703, 1863-1711
Description
Summary: Most deep learning-based compressed sensing (DCS) algorithms adopt a single neural network for signal reconstruction and fail to jointly consider the influence of the sampling operation on reconstruction. In this paper, we propose a unified framework that jointly considers the sampling and reconstruction processes for image compressed sensing, based on well-designed cascade neural networks. The framework comprises two sub-networks: a sampling sub-network and a reconstruction sub-network. In the sampling sub-network, an adaptive fully connected layer, rather than the traditional random matrix, mimics the sampling operator. In the reconstruction sub-network, a cascade network combining a stacked denoising autoencoder (SDA) and a convolutional neural network (CNN) reconstructs the signal. The SDA solves the measurement-to-signal mapping problem and produces an initial reconstruction; the CNN then recovers the structural and texture features of the image to achieve better reconstruction quality. Extensive experiments show that this framework outperforms many state-of-the-art methods, especially at low sampling rates.
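The abstract's pipeline (learned fully connected sampling, SDA-style initial reconstruction, CNN refinement) can be sketched as a single forward pass. This is a minimal NumPy illustration, not the paper's implementation: the block size, 10% sampling rate, layer widths, and the single residual 3x3 filter standing in for the CNN are all illustrative assumptions, and the weights are random placeholders rather than trained parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
B = 16                      # block side length (assumed)
n = B * B                   # signal dimension per block
m = max(1, int(0.1 * n))    # measurements at an assumed 10% sampling rate

# Sampling sub-network: a learned fully connected layer replaces the
# traditional random measurement matrix (weights here are placeholders).
W_sample = rng.standard_normal((m, n)) / np.sqrt(n)

# Reconstruction sub-network, stage 1 (SDA-style mapping): affine layers
# with ReLU map the measurements back to an initial estimate.
W1 = rng.standard_normal((n, m)) / np.sqrt(m)
b1 = np.zeros(n)
W2 = rng.standard_normal((n, n)) / np.sqrt(n)
b2 = np.zeros(n)

def relu(v):
    return np.maximum(v, 0.0)

def filter2d_same(img, kernel):
    """Naive 'same'-size 2-D filtering, standing in for the CNN stage."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)))
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kernel)
    return out

# Stage 2 (CNN refinement): a single residual 3x3 filter as a placeholder
# for the paper's deeper convolutional network.
kernel = rng.standard_normal((3, 3)) * 0.1

def forward(block):
    x = block.reshape(-1)
    y = W_sample @ x                         # adaptive sampling
    x0 = relu(W1 @ y + b1)                   # SDA: initial reconstruction
    x0 = (W2 @ x0 + b2).reshape(B, B)
    return x0 + filter2d_same(x0, kernel)    # CNN: residual refinement

block = rng.standard_normal((B, B))
recon = forward(block)
print(recon.shape)
```

Training (not shown) would optimize the sampling and reconstruction weights jointly end-to-end, which is the point of cascading the two sub-networks in one framework.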
DOI: 10.1007/s11760-021-01955-w