VA & Opt Webinar: Huynh Van Ngai
Title: Generalized Nesterov’s accelerated proximal gradient algorithms with convergence rate of order o(1/k^2)
Speaker: Huynh Van Ngai (University of Quy Nhon)
Date and Time: March 24th, 2021, 17:00 AEDT (Register here for remote connection via Zoom)
Abstract: The accelerated gradient method initiated by Nesterov is now recognized as one of the most powerful tools for solving smooth convex optimization problems. This method significantly improves the convergence rate of the function values, from the O(1/k) of the standard gradient method down to O(1/k^2). In this paper, we present two generalized variants of Nesterov’s accelerated proximal gradient method for solving composite convex optimization problems, in which the objective function is the sum of a smooth convex function and a nonsmooth convex part. We show that, with suitable choices of the parameter sequences, the convergence rate of the function values for the proposed methods is in fact of order o(1/k^2). In particular, when the objective function is p-uniformly convex for p > 2, the convergence rate is of order O(ln k / k^{2p/(p-2)}), and the convergence is linear when the objective function is strongly convex. As a by-product, we derive a forward-backward algorithm generalizing the one by Attouch and Peypouquet [SIAM J. Optim., 26(3), 1824–1834, 2016], which produces a convergent sequence of iterates whose function values converge at rate o(1/k^2).
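For context, below is a minimal sketch of the classical accelerated proximal gradient (FISTA-type) iteration for a composite problem min_x f(x) + g(x), instantiated here on a LASSO instance with f(x) = 0.5*||Ax - b||^2 and g(x) = lam*||x||_1. The problem data, function names, and the standard momentum rule t_{k+1} = (1 + sqrt(1 + 4*t_k^2))/2 are illustrative assumptions, not the speaker's generalized parameter sequences; the generalized choices discussed in the talk are what sharpen the classical O(1/k^2) guarantee to o(1/k^2).

```python
import numpy as np

def soft_threshold(x, tau):
    """Proximal operator of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def accelerated_proximal_gradient(A, b, lam, num_iters=500):
    """FISTA-style accelerated proximal gradient for
    min_x 0.5*||Ax - b||^2 + lam*||x||_1 (illustrative sketch)."""
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of grad f
    x = np.zeros(A.shape[1])
    y = x.copy()                               # extrapolated point
    t = 1.0                                    # momentum parameter
    for _ in range(num_iters):
        grad = A.T @ (A @ y - b)               # gradient of the smooth part at y
        x_next = soft_threshold(y - grad / L, lam / L)   # proximal (backward) step
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0  # standard FISTA rule
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)   # Nesterov extrapolation
        x, t = x_next, t_next
    return x

# Example usage on a small random sparse-recovery instance
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0
b = A @ x_true
x_hat = accelerated_proximal_gradient(A, b, lam=0.1)
```

Note that only the momentum/extrapolation rule distinguishes this scheme from the plain proximal gradient method; replacing that rule with other parameter sequences (as in the Attouch–Peypouquet forward-backward algorithm cited above) is the lever by which the faster o(1/k^2) rates are obtained.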