Search results - "stochastic variance-reduced algorithm"
1
Faster First-Order Methods for Stochastic Non-Convex Optimization on Riemannian Manifolds
ISSN: 0162-8828, 1939-3539, 2160-9292. Publication details: United States, IEEE, 01.02.2021. Published in: IEEE Transactions on Pattern Analysis and Machine Intelligence (01.02.2021). “…First-order non-convex Riemannian optimization algorithms have gained recent popularity in structured machine learning problems including principal component…”
Journal Article
2
A Hybrid Stochastic-Deterministic Minibatch Proximal Gradient Method for Efficient Optimization and Generalization
ISSN: 0162-8828, 1939-3539, 2160-9292. Publication details: United States, IEEE, 01.10.2022. Published in: IEEE Transactions on Pattern Analysis and Machine Intelligence (01.10.2022). “…Despite the success of stochastic variance-reduced gradient (SVRG) algorithms in solving large-scale problems, their stochastic gradient complexity often…”
Journal Article
3
A Simple Stochastic Variance Reduced Algorithm with Fast Convergence Rates
ISSN: 2331-8422. Publication details: Ithaca, Cornell University Library, arXiv.org, 28.06.2018. Published in: arXiv.org (28.06.2018). “…, sparse and asynchronous) due to the existence of perturbation. In this paper, we introduce a simple stochastic variance reduced algorithm (MiG…”
Paper
4
Accelerated variance-reduced methods for saddle-point problems
ISSN: 2192-4406. Publication details: Elsevier Ltd, 2022. Published in: EURO Journal on Computational Optimization (2022). “… for the primal and dual variables. For such problems, under the average-smoothness assumption, we propose accelerated stochastic variance-reduced algorithms with optimal up to logarithmic factors complexity bounds…”
Journal Article
5
Byzantine Resilient Non-Convex SCSG With Distributed Batch Gradient Computations
ISSN: 2373-776X, 2373-7778. Publication details: Piscataway, IEEE, 2021. “… A robust variant of the stochastic variance-reduced algorithm is proposed. In the distributed setup, we assume that a fraction of worker nodes (WNs) can be Byzantines…”
Journal Article
6
Stochastic Scale Invariant Power Iteration for KL-divergence Nonnegative Matrix Factorization
ISSN: 2573-2978. Publication details: IEEE, 15.12.2024. Published in: IEEE International Conference on Big Data (15.12.2024). “…We introduce a mini-batch stochastic variance-reduced algorithm to solve finite-sum scale invariant problems which cover several examples in machine learning and statistics such as principal component analysis (PCA…”
Conference Paper
7
Distributionally Adversarial Learning
ISBN: 9798494421050. Publication details: ProQuest Dissertations & Theses, 01.01.2021. “… Then we develop a new stochastic variance-reduced algorithm to efficiently solve them, which allows any Bregman divergence as a proximal function and achieves linear convergence rates…”
Dissertation
8
Variance-reduction for Variational Inequality Problems with Bregman Distance Function
ISSN: 2331-8422. Publication details: Ithaca, Cornell University Library, arXiv.org, 17.05.2024. Published in: arXiv.org (17.05.2024). “… We introduce a novel single-loop stochastic variance-reduced algorithm, incorporating the Bregman distance function, and establish an optimal convergence guarantee under a monotone setting…”
Paper
9
Stochastic Scale Invariant Power Iteration for KL-divergence Nonnegative Matrix Factorization
ISSN: 2331-8422. Publication details: Ithaca, Cornell University Library, arXiv.org, 21.04.2023. Published in: arXiv.org (21.04.2023). “…We introduce a mini-batch stochastic variance-reduced algorithm to solve finite-sum scale invariant problems which cover several examples in machine learning and statistics such as principal component analysis (PCA…”
Paper
10
Stochastic variance reduced gradient with hyper-gradient for non-convex large-scale learning
ISSN: 0924-669X, 1573-7497. Publication details: New York, Springer US, 01.12.2023. Published in: Applied Intelligence (Dordrecht, Netherlands) (01.12.2023). “… With faster convergence rate, there have been tremendous studies on developing stochastic variance reduced algorithms to solve these non-convex optimization problems…”
Journal Article
11
Convergence of Distributed Stochastic Variance Reduced Methods Without Sampling Extra Data
ISSN: 1053-587X, 1941-0476. Publication details: New York, IEEE, 2020. Published in: IEEE Transactions on Signal Processing (2020). “…Stochastic variance reduced methods have gained a lot of interest recently for empirical risk minimization due to its appealing run time complexity. When the…”
Journal Article
12
Convergence of Distributed Stochastic Variance Reduced Methods without Sampling Extra Data
ISSN: 2331-8422. Publication details: Ithaca, Cornell University Library, arXiv.org, 09.07.2020. Published in: arXiv.org (09.07.2020). “…Stochastic variance reduced methods have gained a lot of interest recently for empirical risk minimization due to its appealing run time complexity. When the…”
Paper
13
Complexity Theory of Stochastic Algorithms for Large-Scale Nonconvex Optimization
ISBN: 9798516074769. Publication details: ProQuest Dissertations & Theses, 01.01.2020. “… For the first-order algorithms, SARAH and SPIDER are two recently developed stochastic variance reduced algorithms, and SPIDER has been shown to achieve a near-optimal first-order oracle complexity…”
Dissertation
14
Momentum Schemes with Stochastic Variance Reduction for Nonconvex Composite Optimization
ISSN: 2331-8422. Publication details: Ithaca, Cornell University Library, arXiv.org, 15.05.2019. Published in: arXiv.org (15.05.2019). “…Two new stochastic variance-reduced algorithms named SARAH and SPIDER have been recently proposed, and SPIDER has been shown to achieve a near-optimal gradient oracle complexity for nonconvex optimization…”
Paper
15
Stochastic Variance Reduction for Variational Inequality Methods
ISSN: 2331-8422. Publication details: Ithaca, Cornell University Library, arXiv.org, 11.06.2022. Published in: arXiv.org (11.06.2022). “…We propose stochastic variance reduced algorithms for solving convex-concave saddle point problems, monotone variational inequalities, and monotone inclusions…”
Paper
16
An Accelerated Variance Reduced Extra-Point Approach to Finite-Sum VI and Optimization
ISSN: 2331-8422. Publication details: Ithaca, Cornell University Library, arXiv.org, 07.11.2022. Published in: arXiv.org (07.11.2022). “…In this paper, we develop stochastic variance reduced algorithms for solving a class of finite-sum monotone VI, where the operator consists of the sum of finitely many monotone VI mappings and the sum…”
Paper
17
Dual-Free Stochastic Decentralized Optimization with Variance Reduction
ISSN: 2331-8422. Publication details: Ithaca, Cornell University Library, arXiv.org, 25.06.2020. Published in: arXiv.org (25.06.2020). “… DVR only requires computing stochastic gradients of the local functions, and is computationally as fast as a standard stochastic variance-reduced algorithms run on a \(1/n…”
Paper
18
SpiderBoost and Momentum: Faster Stochastic Variance Reduction Algorithms
ISSN: 2331-8422. Publication details: Ithaca, Cornell University Library, arXiv.org, 15.05.2020. Published in: arXiv.org (15.05.2020). “…SARAH and SPIDER are two recently developed stochastic variance-reduced algorithms, and SPIDER has been shown to achieve a near-optimal first-order oracle complexity in smooth nonconvex optimization…”
Paper
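
Note: many of the results above (e.g., items 2, 10, 11, and 18) concern stochastic variance-reduced gradient methods for finite-sum problems f(w) = (1/n) * sum_i f_i(w). The following is a minimal illustrative sketch of the classic SVRG-style update only; it is not code from any of the listed works, and the names (svrg, grad_i) and parameter values are assumptions made for this example.

    import numpy as np

    def svrg(grad_i, n, w0, step=0.05, epochs=20, inner_steps=None, seed=0):
        """Minimal SVRG sketch for minimizing f(w) = (1/n) * sum_i f_i(w).

        grad_i(w, i) must return the gradient of the i-th component at w.
        All defaults here are illustrative, not tuned.
        """
        rng = np.random.default_rng(seed)
        inner_steps = n if inner_steps is None else inner_steps
        w = np.asarray(w0, dtype=float).copy()
        for _ in range(epochs):
            w_snap = w.copy()
            # Full gradient at the snapshot point, computed once per outer loop.
            full_grad = sum(grad_i(w_snap, i) for i in range(n)) / n
            for _ in range(inner_steps):
                i = int(rng.integers(n))
                # Variance-reduced estimate: unbiased, with variance that shrinks
                # as the iterate approaches the snapshot point.
                g = grad_i(w, i) - grad_i(w_snap, i) + full_grad
                w -= step * g
        return w

    # Toy usage on least squares, f_i(w) = 0.5 * (a_i . w - b_i)^2.
    rng = np.random.default_rng(1)
    A, b = rng.normal(size=(100, 5)), rng.normal(size=100)
    w_hat = svrg(lambda w, i: (A[i] @ w - b[i]) * A[i], n=100, w0=np.zeros(5))

The full-gradient snapshot is what distinguishes this family from plain stochastic gradient descent; the SARAH/SPIDER variants mentioned in items 13, 14, and 18 instead maintain a recursively updated gradient estimator.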

