Submodular Maximization via Gradient Ascent: The Case of Deep Submodular Functions
| Published in: | Advances in Neural Information Processing Systems, Vol. 2018, p. 7989 |
|---|---|
| Main authors: | , , |
| Format: | Journal Article |
| Language: | English |
| Published: | United States, 01.12.2018 |
| ISSN: | 1049-5258 |
| Online access: | Further information |
| Abstract: | We study the problem of maximizing deep submodular functions (DSFs) [13, 3] subject to a matroid constraint. DSFs are an expressive class of submodular functions that include, as strict subfamilies, the facility location, weighted coverage, and sums of concave composed with modular functions. We use a strategy similar to the continuous greedy approach [6], but we show that the multilinear extension of any DSF has a natural and computationally attainable concave relaxation that we can optimize using gradient ascent. Our results show a guarantee of $\max_{0<\delta<1}\bigl(1-\epsilon-\delta-e^{-\delta^2\Omega(k)}\bigr)$ with a running time of $O(n^2/\epsilon^2)$ plus time for pipage rounding [6] to recover a discrete solution, where $k$ is the rank of the matroid constraint. This bound is often better than the standard $1-1/e$ guarantee of the continuous greedy algorithm, and the algorithm runs much faster. Our bound also holds even for fully curved ($c = 1$) functions, where the guarantee of $1-c/e$ degenerates to $1-1/e$, where $c$ is the curvature of $f$ [37]. We perform computational experiments that support our theoretical results. |
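The abstract describes running gradient ascent on a concave relaxation of the DSF's multilinear extension over the matroid polytope and then rounding the fractional solution. The following is a minimal illustrative sketch, not the paper's implementation: it assumes a one-layer DSF (a weighted sum of square roots of nonnegative modular functions), a cardinality matroid of rank k, projected gradient ascent with a fixed step size, and a crude top-k rounding as a stand-in for pipage rounding. All function names and parameters are hypothetical.

```python
# Illustrative sketch: projected gradient ascent on the natural concave relaxation
# of a one-layer DSF under a cardinality constraint of rank k. Assumptions (not from
# the paper): sqrt as the concave function, fixed step size, top-k rounding.
import numpy as np

def dsf_value(s, W, w):
    """One-layer DSF f(S) = sum_j w[j] * sqrt(W[j] @ s), with s a 0/1 indicator vector."""
    return float(w @ np.sqrt(W @ s))

def relaxation_gradient(x, W, w, eps=1e-12):
    """Gradient of the concave relaxation f_bar(x) = sum_j w[j] * sqrt(W[j] @ x), x in [0,1]^n."""
    inner = np.maximum(W @ x, eps)        # guard against division by zero at the boundary
    return W.T @ (w * 0.5 / np.sqrt(inner))

def project(x, k, iters=60):
    """Euclidean projection onto the polytope {x in [0,1]^n : sum(x) <= k}."""
    y = np.clip(x, 0.0, 1.0)
    if y.sum() <= k:
        return y
    lo, hi = 0.0, float(x.max())          # bisect the shift tau so that sum(clip(x - tau, 0, 1)) = k
    for _ in range(iters):
        tau = 0.5 * (lo + hi)
        if np.clip(x - tau, 0.0, 1.0).sum() > k:
            lo = tau
        else:
            hi = tau
    return np.clip(x - hi, 0.0, 1.0)

def maximize_dsf(W, w, k, steps=300, lr=0.05):
    """Gradient ascent on f_bar over the polytope, then a crude top-k rounding
    (a stand-in for pipage rounding, which would preserve the fractional objective)."""
    n = W.shape[1]
    x = project(np.full(n, k / n), k)     # feasible fractional starting point
    for _ in range(steps):
        x = project(x + lr * relaxation_gradient(x, W, w), k)
    s = np.zeros(n)
    s[np.argsort(-x)[:k]] = 1.0           # select the k largest coordinates
    return s, dsf_value(s, W, w)

# Tiny usage example with random nonnegative data.
rng = np.random.default_rng(0)
W = rng.random((8, 20))                   # 8 concave "features", ground set of 20 elements
w = rng.random(8)
s, val = maximize_dsf(W, w, k=5)
print(int(s.sum()), round(val, 3))
```

The concave relaxation here upper-bounds the multilinear extension by Jensen's inequality, which is what makes plain gradient ascent applicable; the paper's guarantee additionally relies on pipage rounding rather than the naive top-k selection used above.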