Variational approximation error in non-negative matrix factorization.
Saved in:
| Title: | Variational approximation error in non-negative matrix factorization. |
|---|---|
| Authors: | Hayashi N; Simulation & Mining Division, NTT DATA Mathematical Systems Inc., 1F Shinanomachi Rengakan, 35, Shinanomachi, Shinjuku-ku, Tokyo, 160-0016, Japan; Department of Mathematical and Computing Science, Tokyo Institute of Technology, Mail-Box W8-42, 2-12-1, Ookayama, Meguro-ku, Tokyo, 152-8552, Japan. Electronic address: hayashi@msi.co.jp. |
| Source: | Neural networks : the official journal of the International Neural Network Society [Neural Netw] 2020 Jun; Vol. 126, pp. 65-75. Date of Electronic Publication: 2020 Mar 13. |
| Publication Type: | Journal Article |
| Language: | English |
| Journal Info: | Publisher: Pergamon Press Country of Publication: United States NLM ID: 8805018 Publication Model: Print-Electronic Cited Medium: Internet ISSN: 1879-2782 (Electronic) Linking ISSN: 08936080 NLM ISO Abbreviation: Neural Netw Subsets: MEDLINE |
| Imprint Name(s): | Original Publication: New York : Pergamon Press, [c1988- |
| MeSH Terms: | Machine Learning*, Bayes Theorem ; Information Management/methods |
| Abstract: | Non-negative matrix factorization (NMF) is a knowledge discovery method that is used in many fields. Variational inference and Gibbs sampling methods for it are also well-known. However, the variational approximation error has not been clarified yet, because NMF is not statistically regular and the prior distribution used in variational Bayesian NMF (VBNMF) has zero or divergence points. In this paper, using algebraic geometrical methods, we theoretically analyze the difference in negative log evidence (a.k.a. free energy) between VBNMF and Bayesian NMF, i.e., the Kullback-Leibler divergence between the variational posterior and the true posterior. We derive an upper bound for the learning coefficient (a.k.a. the real log canonical threshold) in Bayesian NMF. By using the upper bound, we find a lower bound for the approximation error, asymptotically. The result quantitatively shows how well the VBNMF algorithm can approximate Bayesian NMF; the lower bound depends on the hyperparameters and the true non-negative rank. A numerical experiment demonstrates the theoretical result. (Copyright © 2020 Elsevier Ltd. All rights reserved.) |
| Competing Interests: | Declaration of Competing Interest The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper. |
| Contributed Indexing: | Keywords: Bayesian inference; Learning coefficient; Non-negative matrix factorization (NMF); Real log canonical threshold (RLCT); Variational Bayesian method; Variational inference |
| Entry Date(s): | Date Created: 20200323 Date Completed: 20200918 Latest Revision: 20200918 |
| Update Code: | 20250114 |
| DOI: | 10.1016/j.neunet.2020.03.009 |
| PMID: | 32200211 |
| Database: | MEDLINE |
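For context on the factorization analyzed in the abstract above, here is a minimal sketch of plain (non-Bayesian) NMF using the well-known Lee–Seung multiplicative updates. This is illustrative only: the function name, dimensions, and iteration count are arbitrary choices, and the paper itself concerns variational Bayesian NMF (VBNMF), not this point-estimate algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def nmf(X, rank, n_iter=200, eps=1e-9):
    """Approximate a non-negative matrix X (n x m) as W @ H,
    with W (n x rank) and H (rank x m) both non-negative,
    via Lee-Seung multiplicative updates for the Frobenius loss."""
    n, m = X.shape
    W = rng.random((n, rank)) + eps
    H = rng.random((rank, m)) + eps
    for _ in range(n_iter):
        # Multiplicative updates keep W and H non-negative by construction.
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Toy example: factor a random non-negative 8x6 matrix at rank 3.
X = rng.random((8, 6))
W, H = nmf(X, rank=3)
err = np.linalg.norm(X - W @ H)
```

In the Bayesian treatment the paper studies, priors are placed on W and H and one compares the variational posterior to the true posterior via the free energy; the sketch above only shows the underlying factorization model.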