An accelerated forward-backward algorithm with a new linesearch for convex minimization problems and its applications

Bibliographic Details
Published in: AIMS Mathematics, Vol. 6, No. 6, pp. 6180-6200
Main Authors: Hanjing, Adisak; Jailoka, Pachara; Suantai, Suthep
Format: Journal Article
Language: English
Published: AIMS Press, 01.01.2021
ISSN: 2473-6988
Description
Summary: We study a convex minimization problem for the sum of two convex functions in the setting of a Hilbert space. Under the assumption that the gradient of the smooth function is Lipschitz continuous, many optimization methods have been devised whose stepsizes depend on the Lipschitz constant. However, finding such a Lipschitz constant is not an easy task in general practice. In this work, using a new modification of the linesearches of Cruz and Nghia [7] and Kankam et al. [14] together with an inertial technique, we introduce an accelerated algorithm without any Lipschitz continuity assumption on the gradient. We then establish a weak convergence result for the proposed method. As applications, we apply and analyze our method for solving an image restoration problem and a regression problem. Numerical experiments show that our method is more efficient than well-known methods in the literature.
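
To make the record more self-contained, the following is a minimal sketch of the kind of method the abstract describes: a proximal forward-backward iteration with an inertial extrapolation step and a backtracking linesearch (in the spirit of the Cruz and Nghia [7] linesearch), so that no Lipschitz constant of the gradient is required. The problem instance (a LASSO-type regression), the function names, and the parameter values below are illustrative assumptions and do not reproduce the authors' algorithm or their linesearch modification.

```python
# A minimal, illustrative sketch (not the authors' exact algorithm):
# an inertial forward-backward step whose stepsize is chosen by a
# backtracking linesearch, so no Lipschitz constant of grad f is needed.
# Shown on a LASSO-type regression problem: minimize f(x) + g(x) with
# f(x) = 0.5*||Ax - b||^2 (smooth) and g(x) = lam*||x||_1 (prox-friendly).
import numpy as np


def soft_threshold(v, thresh):
    """Proximal operator of thresh*||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - thresh, 0.0)


def inertial_fb_linesearch(A, b, lam, x0, max_iter=500,
                           sigma=1.0, theta=0.5, delta=0.49, beta=0.9):
    """Inertial forward-backward method with a backtracking linesearch.

    sigma: initial trial stepsize, theta: backtracking shrink factor,
    delta: linesearch tolerance, beta: inertial parameter. These are
    illustrative choices; the paper imposes its own conditions on them.
    """
    grad_f = lambda x: A.T @ (A @ x - b)  # gradient of 0.5*||Ax - b||^2

    x_prev, x = x0.copy(), x0.copy()
    for _ in range(max_iter):
        # Inertial (extrapolation) step.
        y = x + beta * (x - x_prev)

        # Backtracking linesearch: shrink the trial stepsize t until a
        # Cruz--Nghia-type condition
        #   t * ||grad_f(z) - grad_f(y)|| <= delta * ||z - y||
        # holds for the forward-backward point z.
        t = sigma
        g_y = grad_f(y)
        z = soft_threshold(y - t * g_y, t * lam)
        while t * np.linalg.norm(grad_f(z) - g_y) > delta * np.linalg.norm(z - y):
            t *= theta
            z = soft_threshold(y - t * g_y, t * lam)

        x_prev, x = x, z
    return x


# Example usage on a random LASSO instance (hypothetical data):
# A = np.random.randn(100, 200); b = np.random.randn(100)
# x_hat = inertial_fb_linesearch(A, b, lam=0.1, x0=np.zeros(200))
```

The linesearch replaces the usual fixed stepsize 1/L of Lipschitz-based methods: the trial stepsize is simply shrunk until a local condition on the gradient holds, which is the mechanism that removes the global Lipschitz assumption.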
DOI: 10.3934/math.2021363