Variable sample-size optimistic mirror descent algorithm for stochastic mixed variational inequalities


Detailed bibliography
Published in: Journal of Global Optimization, Vol. 89, No. 1, pp. 143-170
Main authors: Yang, Zhen-Ping; Zhao, Yong; Lin, Gui-Hua
Medium: Journal Article
Language: English
Published: New York: Springer US, 01.05.2024
ISSN: 0925-5001, 1573-2916
Description
Summary: In this paper, we propose a variable sample-size optimistic mirror descent algorithm using the Bregman distance for a class of stochastic mixed variational inequalities. Unlike conventional variable sample-size extragradient algorithms, which evaluate the expected mapping twice at each iteration, our algorithm requires only one evaluation of the expected mapping per iteration and hence significantly reduces the computational load. In the monotone case, the proposed algorithm achieves an O(1/t) ergodic convergence rate in terms of the expected restricted gap function; under a strongly generalized monotonicity condition, it attains a local linear convergence rate, measured by the Bregman distance between iterates and solutions, when the sample size increases geometrically. Furthermore, we derive results on stochastic local stability under the generalized monotonicity condition. Numerical experiments indicate that the proposed algorithm compares favorably with some existing methods.
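The summary does not spell out the iteration, so the sketch below is only a rough illustration of the kind of scheme it describes: a Popov-style optimistic update with a single fresh evaluation of the estimated expected mapping per iteration, the Euclidean Bregman distance (so the mirror step reduces to a plain prox), an l1 regularizer g, and geometrically growing mini-batch sizes. The toy operator, the step size gamma, and the batch schedule (N0, rho) are illustrative assumptions, not the authors' actual algorithm or parameter choices.

```python
# Minimal sketch (assumptions noted above) of variable sample-size optimistic
# mirror descent for a stochastic mixed VI:
#   find x* with <E[F(x*, xi)], x - x*> + g(x) - g(x*) >= 0 for all x,
# using the Euclidean Bregman distance and g(x) = lam * ||x||_1.
import numpy as np

rng = np.random.default_rng(0)

# Toy monotone operator: F(x, xi) = A x + b + xi with A positive semidefinite.
d = 20
M = rng.standard_normal((d, d))
A = M @ M.T / d                      # symmetric PSD -> monotone affine map
b = rng.standard_normal(d)
lam = 0.1                            # weight of the l1 regularizer g


def F_sample(x, batch):
    """Mini-batch estimate of the expected mapping E[F(x, xi)]."""
    noise = rng.standard_normal((batch, x.size)).mean(axis=0)
    return A @ x + b + noise


def prox_g(v, step):
    """Prox of step*g for g = lam*||.||_1 (soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - step * lam, 0.0)


def omd_variable_batch(x0, gamma=0.1, iters=200, N0=4, rho=1.05):
    """Optimistic mirror descent with one fresh mapping estimate per iteration;
    the previous estimate is reused as the optimistic prediction."""
    z = x0.copy()
    F_prev = F_sample(x0, N0)        # single warm-up evaluation
    for t in range(iters):
        batch = int(N0 * rho ** t)   # geometrically increasing sample size
        # leading step using the stale estimate as the prediction
        x = prox_g(z - gamma * F_prev, gamma)
        # the only fresh evaluation in this iteration
        F_new = F_sample(x, batch)
        # update step at the fresh point
        z = prox_g(z - gamma * F_new, gamma)
        F_prev = F_new
    return x


x_hat = omd_variable_batch(np.zeros(d))
```

Reusing the stale estimate F_prev as the prediction is what gives one operator evaluation per iteration, in contrast to extragradient-type methods that sample the mapping at both the leading and the update point.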
DOI: 10.1007/s10898-023-01346-0