Search results - "stochastic variance-reduced algorithm"

  • Showing results 1 - 18 of 18
  1. Faster First-Order Methods for Stochastic Non-Convex Optimization on Riemannian Manifolds
    Authors: Zhou, Pan; Yuan, Xiao-Tong; Yan, Shuicheng; Feng, Jiashi
    ISSN: 0162-8828, 1939-3539, 2160-9292
    Publication details: United States, IEEE, 01.02.2021
    “…First-order non-convex Riemannian optimization algorithms have gained recent popularity in structured machine learning problems including principal component…”
    Journal Article
  2. A Hybrid Stochastic-Deterministic Minibatch Proximal Gradient Method for Efficient Optimization and Generalization
    Authors: Zhou, Pan; Yuan, Xiao-Tong; Lin, Zhouchen; Hoi, Steven C.H.
    ISSN: 0162-8828, 1939-3539, 2160-9292
    Publication details: United States, IEEE, 01.10.2022
    “…Despite the success of stochastic variance-reduced gradient (SVRG) algorithms in solving large-scale problems, their stochastic gradient complexity often…”
    Journal Article
  3. A Simple Stochastic Variance Reduced Algorithm with Fast Convergence Rates
    Authors: Zhou, Kaiwen; Shang, Fanhua; Cheng, James
    ISSN: 2331-8422
    Publication details: Ithaca, Cornell University Library, arXiv.org, 28.06.2018
    Published in arXiv.org (28.06.2018)
    “…, sparse and asynchronous) due to the existence of perturbation. In this paper, we introduce a simple stochastic variance reduced algorithm (MiG…”
    Paper
  4. Accelerated variance-reduced methods for saddle-point problems
    Authors: Borodich, Ekaterina; Tominin, Vladislav; Tominin, Yaroslav; Kovalev, Dmitry; Gasnikov, Alexander; Dvurechensky, Pavel
    ISSN: 2192-4406
    Publication details: Elsevier Ltd, 2022
    “… for the primal and dual variables. For such problems, under the average-smoothness assumption, we propose accelerated stochastic variance-reduced algorithms with optimal up to logarithmic factors complexity bounds…”
    Journal Article
  5. Byzantine Resilient Non-Convex SCSG With Distributed Batch Gradient Computations
    Authors: Bulusu, Saikiran; Khanduri, Prashant; Kafle, Swatantra; Sharma, Pranay; Varshney, Pramod K.
    ISSN: 2373-776X, 2373-7778
    Publication details: Piscataway, IEEE, 2021
    “… A robust variant of the stochastic variance-reduced algorithm is proposed. In the distributed setup, we assume that a fraction of worker nodes (WNs) can be Byzantines…”
    Journal Article
  6. Stochastic Scale Invariant Power Iteration for KL-divergence Nonnegative Matrix Factorization
    Authors: Kim, Cheolmin; Kim, Youngseok; Jambunath, Yegna Subramanian; Klabjan, Diego
    ISSN: 2573-2978
    Publication details: IEEE, 15.12.2024
    “…We introduce a mini-batch stochastic variance-reduced algorithm to solve finite-sum scale invariant problems which cover several examples in machine learning and statistics such as principal component analysis (PCA…”
    Conference Paper
  7. Distributionally Adversarial Learning
    Author: Shi, Zhan
    ISBN: 9798494421050
    Publication details: ProQuest Dissertations & Theses, 01.01.2021
    “… Then we develop a new stochastic variance-reduced algorithm to efficiently solve them, which allows any Bregman divergence as a proximal function and achieves linear convergence rates…”
    Dissertation
  8. Variance-reduction for Variational Inequality Problems with Bregman Distance Function
    Authors: Alizadeh, Zeinab; Erfan Yazdandoost Hamedani; Jalilzadeh, Afrooz
    ISSN: 2331-8422
    Publication details: Ithaca, Cornell University Library, arXiv.org, 17.05.2024
    Published in arXiv.org (17.05.2024)
    “… We introduce a novel single-loop stochastic variance-reduced algorithm, incorporating the Bregman distance function, and establish an optimal convergence guarantee under a monotone setting…”
    Paper
  9. Stochastic Scale Invariant Power Iteration for KL-divergence Nonnegative Matrix Factorization
    Authors: Kim, Cheolmin; Kim, Youngseok; Klabjan, Diego
    ISSN: 2331-8422
    Publication details: Ithaca, Cornell University Library, arXiv.org, 21.04.2023
    Published in arXiv.org (21.04.2023)
    “…We introduce a mini-batch stochastic variance-reduced algorithm to solve finite-sum scale invariant problems which cover several examples in machine learning and statistics such as principal component analysis (PCA…”
    Paper
  10. Stochastic variance reduced gradient with hyper-gradient for non-convex large-scale learning
    Author: Yang, Zhuang
    ISSN: 0924-669X, 1573-7497
    Publication details: New York, Springer US, 01.12.2023
    “… With faster convergence rate, there have been tremendous studies on developing stochastic variance reduced algorithms to solve these non-convex optimization problems…”
    Journal Article
  11. Convergence of Distributed Stochastic Variance Reduced Methods Without Sampling Extra Data
    Authors: Cen, Shicong; Zhang, Huishuai; Chi, Yuejie; Chen, Wei; Liu, Tie-Yan
    ISSN: 1053-587X, 1941-0476
    Publication details: New York, IEEE, 2020
    “…Stochastic variance reduced methods have gained a lot of interest recently for empirical risk minimization due to its appealing run time complexity. When the…”
    Journal Article
  12. Convergence of Distributed Stochastic Variance Reduced Methods without Sampling Extra Data
    Authors: Cen, Shicong; Zhang, Huishuai; Chi, Yuejie; Chen, Wei; Liu, Tie-Yan
    ISSN: 2331-8422
    Publication details: Ithaca, Cornell University Library, arXiv.org, 09.07.2020
    Published in arXiv.org (09.07.2020)
    “…Stochastic variance reduced methods have gained a lot of interest recently for empirical risk minimization due to its appealing run time complexity. When the…”
    Paper
  13. Complexity Theory of Stochastic Algorithms for Large-Scale Nonconvex Optimization
    Author: Wang, Zhe
    ISBN: 9798516074769
    Publication details: ProQuest Dissertations & Theses, 01.01.2020
    “… For the first-order algorithms, SARAH and SPIDER are two recently developed stochastic variance reduced algorithms, and SPIDER has been shown to achieve a near-optimal first-order oracle complexity…”
    Dissertation
  14. Momentum Schemes with Stochastic Variance Reduction for Nonconvex Composite Optimization
    Authors: Zhou, Yi; Wang, Zhe; Ji, Kaiyi; Liang, Yingbin; Tarokh, Vahid
    ISSN: 2331-8422
    Publication details: Ithaca, Cornell University Library, arXiv.org, 15.05.2019
    Published in arXiv.org (15.05.2019)
    “…Two new stochastic variance-reduced algorithms named SARAH and SPIDER have been recently proposed, and SPIDER has been shown to achieve a near-optimal gradient oracle complexity for nonconvex optimization…”
    Paper
  15. Stochastic Variance Reduction for Variational Inequality Methods
    Authors: Alacaoglu, Ahmet; Malitsky, Yura
    ISSN: 2331-8422
    Publication details: Ithaca, Cornell University Library, arXiv.org, 11.06.2022
    Published in arXiv.org (11.06.2022)
    “…We propose stochastic variance reduced algorithms for solving convex-concave saddle point problems, monotone variational inequalities, and monotone inclusions…”
    Paper
  16. An Accelerated Variance Reduced Extra-Point Approach to Finite-Sum VI and Optimization
    Authors: Huang, Kevin; Wang, Nuozhou; Zhang, Shuzhong
    ISSN: 2331-8422
    Publication details: Ithaca, Cornell University Library, arXiv.org, 07.11.2022
    Published in arXiv.org (07.11.2022)
    “…In this paper, we develop stochastic variance reduced algorithms for solving a class of finite-sum monotone VI, where the operator consists of the sum of finitely many monotone VI mappings and the sum…”
    Paper
  17. Dual-Free Stochastic Decentralized Optimization with Variance Reduction
    Authors: Hendrikx, Hadrien; Bach, Francis; Massoulié, Laurent
    ISSN: 2331-8422
    Publication details: Ithaca, Cornell University Library, arXiv.org, 25.06.2020
    Published in arXiv.org (25.06.2020)
    “… DVR only requires computing stochastic gradients of the local functions, and is computationally as fast as a standard stochastic variance-reduced algorithms run on a \(1/n…”
    Paper
  18. SpiderBoost and Momentum: Faster Stochastic Variance Reduction Algorithms
    Authors: Wang, Zhe; Ji, Kaiyi; Zhou, Yi; Liang, Yingbin; Tarokh, Vahid
    ISSN: 2331-8422
    Publication details: Ithaca, Cornell University Library, arXiv.org, 15.05.2020
    Published in arXiv.org (15.05.2020)
    “…SARAH and SPIDER are two recently developed stochastic variance-reduced algorithms, and SPIDER has been shown to achieve a near-optimal first-order oracle complexity in smooth nonconvex optimization…”
    Paper
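
Note: two gradient estimators recur throughout these results: the SVRG estimator (items 2, 10, 11) and the recursive SARAH/SPIDER estimator (items 13, 14, 18). The sketch below is illustrative only and is not taken from any of the listed papers; the synthetic least-squares problem, step size, and epoch length are arbitrary assumptions made for the demonstration.

# Minimal sketch of the SVRG and SARAH/SPIDER gradient estimators on a
# synthetic finite-sum least-squares problem (all parameters illustrative).
import numpy as np

rng = np.random.default_rng(0)

# f(w) = (1/n) * sum_i (a_i . w - b_i)^2 / 2
n, d = 200, 10
A = rng.normal(size=(n, d))
b = A @ rng.normal(size=d) + 0.01 * rng.normal(size=n)

def grad_i(w, i):
    """Gradient of the i-th component f_i(w) = (a_i . w - b_i)^2 / 2."""
    return (A[i] @ w - b[i]) * A[i]

def full_grad(w):
    """Full gradient of the averaged objective."""
    return A.T @ (A @ w - b) / n

def svrg(w, epochs=20, m=None, lr=0.02):
    """SVRG: anchor a full gradient at a snapshot, then take m inner steps
    with the estimate grad_i(w) - grad_i(w_snap) + full_grad(w_snap)."""
    m = m or n
    for _ in range(epochs):
        w_snap = w.copy()
        mu = full_grad(w_snap)          # full gradient at the snapshot
        for _ in range(m):
            i = rng.integers(n)
            v = grad_i(w, i) - grad_i(w_snap, i) + mu
            w = w - lr * v
    return w

def spider(w, epochs=20, m=None, lr=0.02):
    """SARAH/SPIDER-style recursive estimator: refresh with a full gradient
    each epoch, then update v <- grad_i(w_k) - grad_i(w_{k-1}) + v."""
    m = m or n
    for _ in range(epochs):
        v = full_grad(w)                # periodic full-gradient refresh
        w_prev, w = w, w - lr * v
        for _ in range(m - 1):
            i = rng.integers(n)
            v = grad_i(w, i) - grad_i(w_prev, i) + v
            w_prev, w = w, w - lr * v
    return w

for solver in (svrg, spider):
    w_hat = solver(np.zeros(d))
    print(solver.__name__, "final gradient norm:", np.linalg.norm(full_grad(w_hat)))

Both estimators are unbiased corrections of a single-sample stochastic gradient against a periodically refreshed full gradient; SVRG corrects toward a fixed snapshot, while SARAH/SPIDER corrects recursively toward the previous iterate, which is what the near-optimal complexity results cited in items 13, 14, and 18 exploit.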