Generative quantum learning of joint probability distribution functions

Bibliographic Details
Published in: Physical Review Research, Vol. 4, No. 4, p. 043092
Main Authors: Zhu, Elton Yechao, Johri, Sonika, Bacon, Dave, Esencan, Mert, Kim, Jungsang, Muir, Mark, Murgai, Nikhil, Nguyen, Jason, Pisenti, Neal, Schouela, Adam, Sosnova, Ksenia, Wright, Ken
Format: Journal Article
Language: English
Published: American Physical Society, 01.11.2022
ISSN: 2643-1564
Description
Summary: Modeling joint probability distributions is an important task in a wide variety of fields. One popular technique for this employs a family of multivariate distributions with uniform marginals called copulas. While the theory of modeling joint distributions via copulas is well understood, it becomes practically challenging to accurately model real data with many variables. In this paper, we show that any copula can be naturally mapped to a multipartite maximally entangled state. The task of learning joint probability distributions thus becomes the task of learning maximally entangled states. We prove that a variational ansatz based on this insight, which we christen a "qopula," leads to an exponential advantage over classical methods for learning some joint distributions. As an application, we train a quantum generative adversarial network (QGAN) and a quantum circuit Born machine (QCBM) using this variational ansatz to generate samples from joint distributions of two variables in historical stock-market data. We demonstrate our generative learning algorithms on trapped-ion quantum computers from IonQ using up to eight qubits. Our experimental results show resilience against noise, outperformance of equivalent classical models, and 20–1000 times fewer iterations required to converge.
DOI: 10.1103/PhysRevResearch.4.043092
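
For context on the copula formalism referenced in the summary, Sklar's theorem states that any joint CDF H with marginals F and G can be written as H(x, y) = C(F(x), G(y)) for some copula C, i.e. a joint distribution on the unit square with uniform marginals. The sketch below is a minimal classical illustration of sampling through a Gaussian copula; it is background only, not the paper's quantum "qopula" ansatz, QGAN, or QCBM, and the correlation value and return parameters are hypothetical.

```python
# Minimal classical sketch: drawing correlated samples through a Gaussian copula.
# Illustrates the copula concept (uniform marginals + dependence structure) only;
# this is NOT the paper's quantum construction.
import numpy as np
from scipy.stats import norm

def gaussian_copula_samples(rho: float, n: int, rng=None):
    """Draw n pairs (u, v) with uniform marginals and Gaussian dependence rho."""
    rng = np.random.default_rng() if rng is None else rng
    cov = np.array([[1.0, rho], [rho, 1.0]])
    z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n)
    # Applying the standard normal CDF coordinate-wise makes each marginal uniform
    # while preserving the dependence between the two coordinates.
    return norm.cdf(z)

# Per Sklar's theorem, pushing the uniform samples through inverse marginal CDFs
# yields a joint distribution with arbitrary chosen marginals.
u = gaussian_copula_samples(rho=0.7, n=10_000)          # hypothetical correlation
returns_a = norm.ppf(u[:, 0], loc=0.0, scale=0.02)       # hypothetical asset-A returns
returns_b = norm.ppf(u[:, 1], loc=0.0, scale=0.03)       # hypothetical asset-B returns
```

Plotting returns_a against returns_b recovers the dependence structure encoded by the copula while each series keeps its own marginal distribution, which is the separation of dependence from marginals that copula-based modeling relies on.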