Benchmarking Neural Networks For Quantum Computations

Detailed Bibliography
Published in: IEEE Transactions on Neural Networks and Learning Systems, Volume 31, Issue 7, pp. 2522-2531
Main Authors: Nguyen, Nam H.; Behrman, E. C.; Moustafa, Mohamed A.; Steck, J. E.
Format: Journal Article
Language: English
Published: United States, The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.07.2020
ISSN: 2162-237X, 2162-2388
Description
Summary: The power of quantum computers is still somewhat speculative. Although they are certainly faster than classical ones at some tasks, the class of problems they can efficiently solve has not been mapped definitively onto known classical complexity theory. This means that we do not know for which calculations there will be a "quantum advantage," once an algorithm is found. One way to answer the question is to find those algorithms, but finding truly quantum algorithms turns out to be very difficult. In previous work, over the past three decades, we have pursued the idea of using techniques of machine learning to develop algorithms for quantum computing. Here, we compare the performance of standard real- and complex-valued classical neural networks with that of one of our models for a quantum neural network, on both classical problems and on an archetypal quantum problem: the computation of an entanglement witness. The quantum network is shown to need far fewer epochs and a much smaller network to achieve comparable or better results.
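The entanglement-witness task mentioned in the summary can be illustrated with a textbook example that is independent of the paper's quantum neural network: for the Bell state |Φ+⟩ = (|00⟩ + |11⟩)/√2, the standard witness operator W = I/2 − |Φ+⟩⟨Φ+| satisfies Tr(Wρ) ≥ 0 for every separable state ρ, while a negative value certifies entanglement. A minimal NumPy sketch (the specific states chosen here are illustrative, not from the paper):

```python
import numpy as np

# Bell state |Phi+> = (|00> + |11>) / sqrt(2) in the computational basis
phi_plus = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
proj = np.outer(phi_plus, phi_plus)          # |Phi+><Phi+|

# Standard entanglement witness for |Phi+>: W = I/2 - |Phi+><Phi+|
W = np.eye(4) / 2 - proj

# Entangled test state: rho = |Phi+><Phi+| itself
rho_entangled = proj

# Separable test state: the product state |01><01|
rho_separable = np.zeros((4, 4))
rho_separable[1, 1] = 1.0

# Tr(W rho) < 0 certifies entanglement; Tr(W rho) >= 0 for all separable rho
print(np.trace(W @ rho_entangled))   # negative  -> entanglement detected
print(np.trace(W @ rho_separable))   # positive  -> not detected (as expected)
```

Here Tr(Wρ) = −1/2 for the Bell state and +1/2 for the product state; learning to approximate this expectation value from measurement data is the kind of task the paper's networks are benchmarked on.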
DOI: 10.1109/TNNLS.2019.2933394