An accelerated forward-backward algorithm with a new linesearch for convex minimization problems and its applications

Bibliographic Details
Published in: AIMS Mathematics, Vol. 6, No. 6, pp. 6180-6200
Main Authors: Hanjing, Adisak; Jailoka, Pachara; Suantai, Suthep
Format: Journal Article
Language:English
Published: AIMS Press 01.01.2021
ISSN: 2473-6988
Description
Summary: We study a convex minimization problem for the sum of two convex functions in the setting of a Hilbert space. Many optimization methods for this problem assume Lipschitz continuity of the gradient of one of the functions, and the stepsizes of those algorithms depend on the Lipschitz constant. However, finding such a Lipschitz constant is not an easy task in general practice. In this work, using a new modification of the linesearches of Cruz and Nghia [7] and Kankam et al. [14] together with an inertial technique, we introduce an accelerated algorithm that requires no Lipschitz continuity assumption on the gradient. We then establish a weak convergence result for the proposed method. As applications, we apply and analyze our method on an image restoration problem and a regression problem. Numerical experiments show that our method is more efficient than well-known methods in the literature.
DOI: 10.3934/math.2021363
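
As a rough illustration of the setting described in the abstract (not the authors' exact algorithm), the following Python sketch shows an inertial forward-backward iteration for minimizing f(x) + g(x) that selects its stepsize by a backtracking linesearch in the spirit of Cruz-Nghia-type conditions, so no Lipschitz constant of the gradient is needed. The function names, the inertial factor, the linesearch constants, and the lasso-type usage example are all hypothetical choices made for illustration.

```python
import numpy as np

def inertial_fb_linesearch(grad_f, prox_g, x0, sigma=1.0, theta=0.5,
                           delta=0.49, max_iter=500, tol=1e-8):
    """Sketch of an inertial forward-backward method with a backtracking linesearch.

    Minimizes f(x) + g(x), where f is smooth convex with gradient grad_f and
    g is convex with proximal map prox_g(v, step).  The linesearch shrinks the
    stepsize until a descent-type condition holds, so no Lipschitz constant of
    grad_f is required.  All parameter values here are illustrative defaults.
    """
    x_prev = np.asarray(x0, dtype=float).copy()
    x = x_prev.copy()
    for k in range(1, max_iter + 1):
        # Inertial (extrapolation) step with a simple FISTA-like factor.
        beta = (k - 1.0) / (k + 2.0)
        y = x + beta * (x - x_prev)

        # Backtracking linesearch: accept the stepsize gamma once
        #   gamma * ||grad_f(z) - grad_f(y)|| <= delta * ||z - y||,
        # where z is the forward-backward point computed with gamma.
        gamma = sigma
        gy = grad_f(y)
        while True:
            z = prox_g(y - gamma * gy, gamma)
            if gamma * np.linalg.norm(grad_f(z) - gy) <= delta * np.linalg.norm(z - y) + 1e-15:
                break
            gamma *= theta

        x_prev, x = x, z
        if np.linalg.norm(x - x_prev) <= tol * max(1.0, np.linalg.norm(x)):
            break
    return x


# Hypothetical usage: a small lasso-type instance, close in spirit to the
# regression application mentioned in the abstract,
# with f(x) = 0.5 * ||A x - b||^2 and g(x) = lam * ||x||_1.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 20))
    b = rng.standard_normal(50)
    lam = 0.1
    grad_f = lambda x: A.T @ (A @ x - b)
    prox_g = lambda v, step: np.sign(v) * np.maximum(np.abs(v) - lam * step, 0.0)
    x_hat = inertial_fb_linesearch(grad_f, prox_g, np.zeros(20))
    print("objective:", 0.5 * np.sum((A @ x_hat - b) ** 2) + lam * np.sum(np.abs(x_hat)))
```

Since the backtracking loop only ever shrinks the stepsize geometrically, it terminates whenever grad_f is locally Lipschitz around the current iterate, which is what allows the method to run without knowing a global Lipschitz constant in advance.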