Adaptive optics for array telescopes using neural-network techniques


Detailed description

Bibliographic details
Published in: Nature (London), Vol. 348, No. 6298, pp. 221-224
Main authors: Angel, J. R. P., Wizinowich, P., Lloyd-Hart, M., Sandler, D.
Format: Journal Article
Language: English
Published: London: Nature Publishing Group, 15 November 1990
ISSN:0028-0836, 1476-4687
Online access: full text
Description
Summary: IMAGES formed by ground-based telescopes are marred by atmospheric 'seeing'. The plane wavefront from an unresolved star is distorted by continually changing turbulent fluctuations in the air's refractive index. Diffraction-limited performance can in principle be recovered through the methods of adaptive optics, in which the instantaneous wavefront shape is sensed and corrected in real time by deformable optics that cancel the distortion [1,2]. The highest resolution will be achieved when this technique is applied to multiple-telescope arrays. For such arrays, the biggest errors caused by seeing at infrared wavelengths are the variations in pathlength and wavefront tilt between array elements. We show here that these errors can be derived by an artificial neural network, given only a pair of simultaneous in-focus and out-of-focus images of a reference star formed at the combined focus of all the array elements. We have optimized a neural network appropriate for 2.2-μm wavelength imaging at the Multiple Mirror Telescope in Arizona. Corrections made by moving the beam-combining mirrors will largely recover the diffraction-limited profile, with a resolution of 0.06 arcsec.
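The scheme the abstract describes — a neural network that takes a simultaneous in-focus/out-of-focus image pair and outputs the piston and tilt errors of the array elements — can be sketched in miniature. The toy below is an assumption-laden illustration, not the authors' actual network: image size, hidden-layer width, the number of output errors, and the linear "optics" that generates fake image data are all invented stand-ins for the real point-spread-function physics and the MMT configuration.

```python
import numpy as np

# Toy sketch (illustrative assumptions throughout): train a one-hidden-layer
# network to map a concatenated in-focus / out-of-focus image pair to the
# piston (pathlength) and tilt errors of a few array elements.
rng = np.random.default_rng(0)

N_PIX = 8 * 8            # pixels per image (toy size, not the paper's)
N_IN = 2 * N_PIX         # in-focus + out-of-focus images, concatenated
N_OUT = 6                # e.g. a few piston/tilt errors (toy count)
N_HIDDEN = 32

# Stand-in forward model: a fixed random linear map from wavefront errors
# to image pixels replaces the real optical physics.
A = rng.normal(size=(N_IN, N_OUT)) / np.sqrt(N_OUT)

def make_batch(n):
    errors = rng.normal(size=(n, N_OUT))                        # true errors
    images = errors @ A.T + 0.01 * rng.normal(size=(n, N_IN))   # noisy "images"
    return images, errors

# One-hidden-layer tanh network, trained by gradient descent on mean
# squared error between predicted and true piston/tilt values.
W1 = rng.normal(size=(N_IN, N_HIDDEN)) * 0.1
b1 = np.zeros(N_HIDDEN)
W2 = rng.normal(size=(N_HIDDEN, N_OUT)) * 0.1
b2 = np.zeros(N_OUT)

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

lr = 0.03
for step in range(3000):
    x, y = make_batch(64)
    h, pred = forward(x)
    err = pred - y
    # Backpropagation through the two layers.
    gW2 = h.T @ err / len(x)
    gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h ** 2)   # tanh derivative
    gW1 = x.T @ dh / len(x)
    gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Held-out evaluation: the trained network should predict the errors far
# better than the ~1.0 variance of the targets.
x, y = make_batch(256)
_, pred = forward(x)
mse = float(np.mean((pred - y) ** 2))
print(f"held-out MSE: {mse:.3f}")
```

In the paper the learned outputs drive the beam-combining mirrors; here the analogue would be subtracting the predicted piston/tilt values from the true ones before the next image pair is formed.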
DOI:10.1038/348221a0