Accelerating iterative ptychography with an integrated neural network

Detailed bibliography
Published in: Journal of Microscopy (Oxford), Volume 300, Issue 2, pp. 180-190
Main authors: McCray, Arthur R. C., Ribet, Stephanie M., Varnavides, Georgios, Ophus, Colin
Format: Journal Article
Language: English
Published: England: Wiley Subscription Services, Inc., 01.11.2025
ISSN: 0022-2720, 1365-2818
Description
Summary: Electron ptychography is a powerful and versatile tool for high-resolution and dose-efficient imaging. Iterative reconstruction algorithms are effective but computationally expensive due to their relative complexity and the many hyperparameters that must be optimised. Gradient descent-based iterative ptychography is a popular method, but it can converge slowly when reconstructing low spatial frequencies. In this work, we present a method for accelerating a gradient descent-based iterative reconstruction algorithm by training a neural network (NN) that is applied in the reconstruction loop. The NN works in Fourier space and selectively boosts low spatial frequencies, enabling faster convergence in a manner similar to accelerated gradient descent algorithms. We discuss the difficulties that arise when incorporating an NN into an iterative reconstruction algorithm and show how they can be overcome with iterative training. We apply our method to simulated and experimental data of gold nanoparticles on amorphous carbon and show that it significantly speeds up ptychographic reconstruction of the nanoparticles.
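
As a rough illustration of the idea described in the summary, the sketch below applies a radial filter to the object gradient in Fourier space inside a gradient-descent loop. In the paper this filter is a trained neural network; here a hand-crafted Gaussian boost stands in for it, and the function name lowpass_boost and all parameter values are illustrative assumptions, not details from the source.

    import numpy as np

    def lowpass_boost(grad, boost=4.0, cutoff=0.1):
        """Amplify low spatial frequencies of a 2-D gradient in Fourier space.

        A fixed Gaussian filter stands in for the trained network described
        in the summary; `boost` and `cutoff` are illustrative values only.
        """
        ny, nx = grad.shape
        fy = np.fft.fftfreq(ny)[:, None]   # vertical spatial frequencies
        fx = np.fft.fftfreq(nx)[None, :]   # horizontal spatial frequencies
        r = np.sqrt(fx**2 + fy**2)         # radial frequency magnitude
        # Filter equals `boost` at DC and decays smoothly to 1 beyond `cutoff`.
        filt = 1.0 + (boost - 1.0) * np.exp(-(r / cutoff) ** 2)
        return np.fft.ifft2(np.fft.fft2(grad) * filt)

    # Hypothetical use inside a gradient-descent reconstruction loop; the
    # random array is a placeholder for the true data-consistency gradient.
    rng = np.random.default_rng(0)
    obj = np.ones((256, 256), dtype=complex)   # current object estimate
    step = 0.1
    for _ in range(10):
        grad = rng.standard_normal((256, 256)) + 1j * rng.standard_normal((256, 256))
        obj = obj - step * lowpass_boost(grad)  # filtered update replaces the raw gradient

Because the filter acts multiplicatively in Fourier space, it leaves high-frequency components essentially untouched while pushing the slowly converging low-frequency components harder, mirroring the behaviour the summary attributes to the trained network.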
DOI: 10.1111/jmi.13407