Benchmarking Neural Networks For Quantum Computations

Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, Vol. 31, No. 7, pp. 2522-2531
Authors: Nguyen, Nam H.; Behrman, E. C.; Moustafa, Mohamed A.; Steck, J. E.
Format: Journal Article
Language: English
Published: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), United States, 1 July 2020
ISSN: 2162-237X, 2162-2388
Description
Abstract: The power of quantum computers is still somewhat speculative. Although they are certainly faster than classical ones at some tasks, the class of problems they can efficiently solve has not been mapped definitively onto known classical complexity theory. This means that we do not know for which calculations there will be a "quantum advantage," once an algorithm is found. One way to answer the question is to find those algorithms, but finding truly quantum algorithms turns out to be very difficult. In previous work, over the past three decades, we have pursued the idea of using techniques of machine learning to develop algorithms for quantum computing. Here, we compare the performance of standard real- and complex-valued classical neural networks with that of one of our models for a quantum neural network, on both classical problems and on an archetypal quantum problem: the computation of an entanglement witness. The quantum network is shown to need far fewer epochs and a much smaller network to achieve comparable or better results.
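
To make the quantum benchmark target concrete, the following is a minimal illustrative sketch, not taken from the paper: it assumes NumPy and uses the negativity of a two-qubit density matrix (a standard entanglement quantifier obtained from the partial transpose) as a stand-in for the kind of entanglement indicator such networks are trained to estimate. The function name negativity and the example states are illustrative assumptions.

import numpy as np

def negativity(rho):
    """Negativity of a two-qubit density matrix rho (4x4):
    sum of the absolute values of the negative eigenvalues of the
    partial transpose over the second qubit. A nonzero value
    certifies entanglement for two qubits (Peres-Horodecki criterion)."""
    # Reshape to tensor indices (row A, row B, col A, col B).
    rho_t = rho.reshape(2, 2, 2, 2)
    # Partial transpose over qubit B: swap B's row and column indices.
    rho_pt = rho_t.transpose(0, 3, 2, 1).reshape(4, 4)
    eigvals = np.linalg.eigvalsh(rho_pt)
    # Sum of absolute values of the negative eigenvalues.
    return float(np.abs(eigvals[eigvals < 0]).sum())

# Maximally entangled Bell state |Phi+> = (|00> + |11>)/sqrt(2): negativity ~ 0.5
phi_plus = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
print(negativity(np.outer(phi_plus, phi_plus)))

# Separable product state |00><00|: negativity ~ 0.0
rho_sep = np.diag([1.0, 0.0, 0.0, 0.0])
print(negativity(rho_sep))

The maximally entangled Bell state yields a negativity of about 0.5, while a separable product state yields essentially zero; a benchmark of the kind described in the abstract asks a network to reproduce this separation directly from the state's matrix elements.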
DOI: 10.1109/TNNLS.2019.2933394