Unifying Message Passing Algorithms Under the Framework of Constrained Bethe Free Energy Minimization
| Published in: | IEEE Transactions on Wireless Communications, Volume 20, Issue 7, pp. 4144 - 4158 |
|---|---|
| Main authors: | , , , , |
| Format: | Journal Article |
| Language: | English |
| Published: | New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.07.2021 |
| Subjects: | |
| ISSN: | 1536-1276, 1558-2248 |
| Summary: | Variational message passing (VMP), belief propagation (BP), and expectation propagation (EP) have found wide application in complex statistical signal processing problems. Beyond viewing them as a class of algorithms operating on graphical models, this article unifies them under an optimization framework, namely Bethe free energy minimization with differently and appropriately imposed constraints. This new perspective in terms of constraint manipulation offers additional insight into the connections between different message passing algorithms and is valid for a generic statistical model. It also establishes a theoretical framework for systematically deriving message passing variants. Taking the sparse signal recovery (SSR) problem as an example, a low-complexity EP variant can be obtained by a simple constraint reformulation, delivering better estimation performance at lower complexity than the standard EP algorithm. Furthermore, the framework supports the systematic derivation of hybrid message passing for complex inference tasks. Notably, a hybrid message passing algorithm is derived for joint SSR and statistical model learning with near-optimal inference performance and scalable complexity. (Illustrative sketches of the constrained objective and of an EP-style SSR iteration are given after this record.) |
| DOI: | 10.1109/TWC.2021.3056193 |
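For context, and not taken from the article itself, the following is the standard form of the constrained Bethe free energy minimization that the abstract refers to; the notation ($f_a$, $b_a$, $b_i$, $d_i$, $t$) is introduced here for illustration. For a factorization $p(\mathbf{x}) \propto \prod_a f_a(\mathbf{x}_a)$ with factor beliefs $b_a$ and variable beliefs $b_i$, the Bethe free energy reads

$$
F_B\big(\{b_a\},\{b_i\}\big) \;=\; \sum_a \sum_{\mathbf{x}_a} b_a(\mathbf{x}_a)\,\ln\frac{b_a(\mathbf{x}_a)}{f_a(\mathbf{x}_a)} \;-\; \sum_i (d_i-1)\sum_{x_i} b_i(x_i)\,\ln b_i(x_i),
$$

where $d_i$ is the number of factors involving $x_i$. Minimizing $F_B$ subject to normalization and exact marginalization consistency, $\sum_{\mathbf{x}_a \setminus x_i} b_a(\mathbf{x}_a) = b_i(x_i)$, yields BP fixed points; relaxing consistency to moment matching, $\mathbb{E}_{b_a}[t(x_i)] = \mathbb{E}_{b_i}[t(x_i)]$ for chosen statistics $t$, yields EP-type fixed points; restricting the beliefs to fully factorized form yields mean-field/VMP updates. The abstract's point is that manipulating such constraints systematically produces message passing variants.

As a purely illustrative companion (not the paper's low-complexity EP variant), below is a minimal NumPy sketch of an EP-style iteration with scalar-variance messages for SSR under the model $\mathbf{y} = \mathbf{A}\mathbf{x} + \mathbf{w}$, assuming a Bernoulli-Gaussian prior; all names, problem sizes, and parameters are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- illustrative problem: y = A x + w with sparse x (all values invented) ---
N, M = 256, 128            # signal length, number of measurements
rho = 0.1                  # probability that a coefficient is nonzero
sigma_x2 = 1.0             # variance of the nonzero coefficients
gamma_w = 1e4              # noise precision (1 / noise variance)

x_true = (rng.random(N) < rho) * rng.normal(0.0, np.sqrt(sigma_x2), N)
A = rng.normal(0.0, 1.0 / np.sqrt(M), (M, N))
y = A @ x_true + rng.normal(0.0, np.sqrt(1.0 / gamma_w), M)

def denoise_bg(r, gamma):
    """Posterior mean and variance of x given r = x + N(0, 1/gamma) and the
    Bernoulli-Gaussian prior (1 - rho) * delta(x) + rho * N(0, sigma_x2)."""
    v = 1.0 / gamma
    num = rho * np.exp(-0.5 * r**2 / (sigma_x2 + v)) / np.sqrt(sigma_x2 + v)
    den = num + (1.0 - rho) * np.exp(-0.5 * r**2 / v) / np.sqrt(v)
    pi = num / den                               # responsibility of the slab
    m = sigma_x2 / (sigma_x2 + v) * r            # slab posterior mean
    s = sigma_x2 * v / (sigma_x2 + v)            # slab posterior variance
    mean = pi * m
    var = pi * (s + m**2) - mean**2
    return mean, var

# --- EP-style iterations: two factors exchange extrinsic Gaussian messages ---
r1, gamma1 = np.zeros(N), 1e-3
AtA, Aty = A.T @ A, A.T @ y
for _ in range(30):
    # prior factor: moment matching of the Bernoulli-Gaussian posterior
    x1, var1 = denoise_bg(r1, gamma1)
    eta1 = 1.0 / np.mean(var1)
    gamma2 = max(eta1 - gamma1, 1e-8)            # extrinsic precision
    r2 = (eta1 * x1 - gamma1 * r1) / gamma2      # extrinsic mean
    # likelihood factor: exact LMMSE estimate under the Gaussian noise model
    C = np.linalg.inv(gamma_w * AtA + gamma2 * np.eye(N))
    x2 = C @ (gamma_w * Aty + gamma2 * r2)
    eta2 = N / np.trace(C)
    gamma1 = max(eta2 - gamma2, 1e-8)
    r1 = (eta2 * x2 - gamma2 * r2) / gamma1

print("NMSE (dB):", 10 * np.log10(np.sum((x2 - x_true)**2) / np.sum(x_true**2)))
```

Using a single scalar extrinsic variance per message keeps the per-iteration cost dominated by the LMMSE step; this is meant only to illustrate the general style of EP-type SSR iterations with simplified (scalar) consistency constraints, not the article's specific reformulation.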