Fast hyperspectral single-pixel imaging via frequency-division multiplexed illumination


Detailed Bibliography
Published in: Optics Express, Vol. 30, No. 15, p. 25995
Main Authors: Jiang, Xiaoyuan; Li, Ziwei; Du, Gang; Jia, Junlian; Wang, Qinghua; Chi, Nan; Dai, Qionghai
Format: Journal Article
Language: English
Publication Date: 18.07.2022
ISSN: 1094-4087
Online Access: Get full text
Description
Summary: Hyperspectral imaging, which captures 3D spectral-spatial information, has been used in a wide range of applications. Among reported techniques, multiplexed spectral imaging with a single-pixel detector offers a photon-efficient and low-cost implementation; however, previous spectral modulation schemes are mostly complicated and sacrifice imaging speed. Here, we propose a fast and compact hyperspectral single-pixel imaging technique based on programmable chromatic illumination. A multi-wavelength LED array modulated by independent carriers achieves stable and accurate spectral modulation at rates up to MHz in a frequency-division multiplexed manner, hence allowing full use of the spatial light modulation speed. Additionally, we propose a multi-channel deep convolutional autoencoder network to reconstruct hyperspectral data from highly compressed 1D measurements. Experimental reconstructions of 12 spectral channels and 64 × 64 pixels are demonstrated for dynamic imaging at a 12 fps frame rate. The proposed imaging scheme is highly extensible to a wide spectral range, and holds potential for portable spectral imagers in low-light or scattering applications.
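The frequency-division multiplexing described in the abstract can be illustrated with a minimal simulation. This is a sketch, not the authors' code: each LED wavelength channel is assumed to be amplitude-modulated by its own carrier frequency, the single-pixel detector records the summed signal, and lock-in demodulation separates the per-channel intensities. The carrier frequencies, integration time, and channel weights below are illustrative choices, not values from the paper.

```python
import numpy as np

fs = 1_000_000          # detector sample rate in Hz (assumed)
T = 0.01                # integration time per spatial pattern, s (assumed)
t = np.arange(0, T, 1 / fs)

# One carrier per LED spectral channel; spacing chosen so each carrier
# completes an integer number of cycles in T (orthogonality condition).
carriers_hz = [100_000, 120_000, 140_000]
true_weights = [0.8, 0.3, 0.5]   # hypothetical scene intensity per channel

# Single-pixel detector signal: sum of carrier-modulated channel intensities.
signal = sum(w * np.cos(2 * np.pi * f * t)
             for w, f in zip(true_weights, carriers_hz))

# Lock-in demodulation: multiply by each reference carrier and average;
# the factor of 2 compensates the 1/2 from averaging cos^2.
recovered = [2 * np.mean(signal * np.cos(2 * np.pi * f * t))
             for f in carriers_hz]

print(np.round(recovered, 3))   # close to true_weights
```

Because each carrier completes an integer number of cycles within the integration window, the cross terms average to zero and the channels demodulate independently, which is what lets the spectral encoding run at the full spatial-modulation speed.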
DOI: 10.1364/OE.458742