VA & Opt Webinar: Yura Malitsky

Title: Adaptive gradient descent without descent

Speaker: Yura Malitsky (Linköping University)

Date and Time: May 19th, 2021, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: In this talk I will present some recent results for the most classical optimization method: gradient descent. We will show that a simple zero-cost rule is sufficient to completely automate gradient descent. The method adapts to the local geometry, with convergence guarantees depending only on the smoothness in a neighborhood of a solution. The presentation is based on joint work with K. Mishchenko, see arxiv.org/abs/1910.09529.
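
For readers who want to experiment, here is a minimal Python sketch of the adaptive step-size rule from the cited paper. The function and parameter names, the initialization, and the toy problem are my own; consult arXiv:1910.09529 for the precise statement and guarantees.

```python
import numpy as np

def adgd(grad, x0, lam0=1e-7, n_iter=1000):
    """Sketch of the adaptive step-size rule of Malitsky & Mishchenko
    (arXiv:1910.09529). `grad` returns the gradient of f."""
    x_prev = np.asarray(x0, dtype=float)
    g_prev = grad(x_prev)
    x = x_prev - lam0 * g_prev          # one plain gradient step to initialize
    lam_prev, theta_prev = lam0, 1e12   # theta effectively infinite at the start
    for _ in range(n_iter):
        g = grad(x)
        denom = 2.0 * np.linalg.norm(g - g_prev)
        # local estimate of the inverse Lipschitz constant of grad f
        local = np.linalg.norm(x - x_prev) / denom if denom > 0 else np.inf
        lam = min(np.sqrt(1.0 + theta_prev) * lam_prev, local)
        x_prev, g_prev = x, g
        x = x - lam * g
        theta_prev, lam_prev = lam / lam_prev, lam
    return x

# toy usage: ill-conditioned quadratic, no Lipschitz constant supplied
A = np.diag([1.0, 10.0, 100.0])
print(adgd(lambda x: A @ x, np.ones(3)))   # should approach the origin
```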

VA & Opt Webinar: Hung Phan

Title: Adaptive splitting algorithms for the sum of operators

Speaker: Hung Phan (University of Massachusetts Lowell)

Date and Time: May 12th, 2021, 11:00 AEST (Register here for remote connection via Zoom)

Abstract: A general optimization problem can often be reduced to finding a zero of a sum of multiple (maximally) monotone operators, which creates challenging computational tasks when treated as a whole. This motivates the development of splitting algorithms, which simplify the computations by dealing with each operator separately, hence the name “splitting”. Some of the most successful splitting algorithms in applications are the forward-backward algorithm, the Douglas-Rachford algorithm, and the alternating direction method of multipliers (ADMM). In this talk, we discuss some adaptive splitting algorithms for finding a zero of the sum of operators. The main idea is to adapt the algorithm parameters to the generalized monotonicity of the operators so that the generated sequence converges to a fixed point.
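
As a point of reference, the following minimal Python sketch shows the classical (non-adaptive) forward-backward iteration named in the abstract; the adaptive parameter choices discussed in the talk are not reproduced here, and the lasso-type example is my own illustration.

```python
import numpy as np

def forward_backward(grad_f, prox_g, x0, step, n_iter=500):
    """Minimal forward-backward splitting for min_x f(x) + g(x):
    a forward (gradient) step on the smooth part f is followed by
    a backward (proximal) step on the nonsmooth part g."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        x = prox_g(x - step * grad_f(x), step)
    return x

# toy usage: min 0.5*||x - b||^2 + mu*||x||_1; the prox of the l1 norm is
# soft-thresholding, so the minimizer is soft(b, mu)
b, mu = np.array([3.0, -0.2, 1.5]), 1.0
soft = lambda z, t: np.sign(z) * np.maximum(np.abs(z) - mu * t, 0.0)
print(forward_backward(lambda x: x - b, soft, np.zeros(3), step=0.5))
# -> approximately [2.0, 0.0, 0.5]
```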

VA & Opt Webinar: Lyudmila Polyakova

Title: Smooth approximations of D.C. functions

Speaker: Lyudmila Polyakova (Saint-Petersburg State University)

Date and Time: May 5th, 2021, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: The investigation of the properties of differences of convex functions rests on the basic facts and theorems of convex analysis, since the class of convex functions is among the most thoroughly studied classes of nonsmooth functions. For an arbitrary convex function, a family of continuously differentiable approximations is constructed using the infimal convolution operation. If the domain of the function under consideration is compact, then these smooth convex approximations are uniform in the Chebyshev metric. Using this technique, a smooth approximation is constructed for d.c. functions, and the optimization properties of these approximations are studied.
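
A standard concrete instance of this infimal-convolution smoothing is the Moreau envelope, in which a convex function is inf-convolved with a quadratic. The small numerical check below is my own illustration (brute-force grid minimization in one dimension), not the speaker's construction; for f = |.| the envelope is the familiar Huber function.

```python
import numpy as np

def envelope(f, t, x, grid):
    """Brute-force evaluation of the infimal convolution of f with the
    quadratic (1/(2t))|.|^2, i.e. the Moreau envelope
    e_t f(x) = min_y [ f(y) + (x - y)^2 / (2t) ], on a 1-D grid."""
    return np.min(f(grid) + (x - grid) ** 2 / (2.0 * t))

# For f = |.| the envelope is the Huber function:
#   x^2/(2t) if |x| <= t,  |x| - t/2 otherwise.
grid = np.linspace(-5.0, 5.0, 200001)
t = 0.5
for x in (0.1, 2.0):
    exact = x**2 / (2 * t) if abs(x) <= t else abs(x) - t / 2
    print(x, envelope(np.abs, t, x, grid), exact)   # numeric vs. closed form
```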

VA & Opt Webinar: Jiri Outrata

Title: On the solution of static contact problems with Coulomb friction via the semismooth* Newton method

Speaker: Jiri Outrata (Institute of Information Theory and Automation of the Czech Academy of Sciences)

Date and Time: April 28th, 2021, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: The lecture deals with the application of a new Newton-type method to the numerical solution of discrete 3D contact problems with Coulomb friction. This method is well suited to the solution of inclusions, and the resulting conceptual algorithm exhibits local superlinear convergence under appropriate conditions. After a description of the method, a new model for the considered contact problem, amenable to the application of the new method, will be presented. The second part of the talk is then devoted to an efficient implementation of the general algorithm and to numerical tests. Throughout the lecture, various tools of modern variational analysis will be employed.
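
The semismooth* method itself does not fit in a short snippet, but the flavor of Newton-type methods for nonsmooth problems can be seen in the classical semismooth Newton iteration. The Python sketch below is that classical variant on a scalar example of my own, not the semismooth* method of the talk (which handles inclusions rather than equations).

```python
def semismooth_newton(F, dF, x, tol=1e-12, max_iter=50):
    """Classical semismooth Newton sketch for a scalar nonsmooth equation
    F(x) = 0: at each step, an element dF(x) of the generalized (Clarke)
    derivative replaces the usual derivative."""
    for _ in range(max_iter):
        fx = F(x)
        if abs(fx) < tol:
            break
        x -= fx / dF(x)
    return x

# toy nonsmooth equation max(x, 2x) - 2 = 0 with root x* = 1
F = lambda x: max(x, 2.0 * x) - 2.0
dF = lambda x: 2.0 if x >= 0 else 1.0   # an element of the Clarke derivative
print(semismooth_newton(F, dF, -5.0))   # -> 1.0
```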

Postdoc position and PhD position: Optimization for Machine Learning (Linköping Univ)

Postdoc position in Optimization for Machine Learning (Linköping Univ)
The Department of Mathematics at Linköping University seeks applications for a postdoc position in the area of mathematical optimization theory with the aim of developing new techniques for machine learning. The position is for 2 years.

The postdoc will be supervised by Dr. Oleg Burdakov and Dr. Yura Malitsky.

The application deadline is May 21, 2021.  

The full job ad: https://liu.se/en/work-at-liu/vacancies?rmpage=job&rmjob=16145&rmlang=UK

PhD student in Optimization for Machine Learning (Linköping Univ)
The Department of Mathematics at Linköping University seeks applications for a PhD student position in the area of mathematical optimization theory with the aim of developing new techniques for machine learning. The position is for 4 years without teaching or 5 years with teaching.

The doctoral student will be supervised by Dr. Oleg Burdakov and Dr. Yura Malitsky.

The application deadline is May 7, 2021. 

VA & Opt Webinar: Ewa Bednarczuk

Title: On duality for nonconvex minimization problems within the framework of abstract convexity

Speaker: Ewa Bednarczuk (Warsaw University of Technology and Systems Research Institute of the PAS)

Date and Time: April 21st, 2021, 17:00 AEST (Register here for remote connection via Zoom)

VA & Opt Webinar: Roger Behling

Title: Circumcentering projection type methods

Speaker: Roger Behling (Fundação Getúlio Vargas)

Date and Time: April 14th, 2021, 11:00 AEST (Register here for remote connection via Zoom)

Abstract: Enforcing successive projections, averaging the composition of reflections, and barycentering projections are well-established techniques for solving convex feasibility problems. These schemes are known as the method of alternating projections (MAP), the Douglas-Rachford method (DRM), and the Cimmino method (CimM), respectively. Recently, we have developed the circumcentered-reflection method (CRM), whose iterations employ generalized circumcenters that are able to accelerate the aforementioned classical approaches both theoretically and numerically. In this talk, the main results on CRM are presented, and a glimpse of future work is provided as well.
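
The circumcenter in question is the point in the affine hull of the current iterate and its two successive reflections that is equidistant from all three, and it can be computed from a small linear system. The Python sketch below is my own illustration on two lines through the origin in the plane, where a single CRM step already lands on the intersection.

```python
import numpy as np

def circumcenter(x, y, z):
    """Point in the affine hull of x, y, z equidistant from all three,
    obtained from a 2x2 linear system in the Gram matrix of y-x, z-x."""
    u, v = y - x, z - x
    G = np.array([[u @ u, u @ v], [u @ v, v @ v]])
    a, b = np.linalg.solve(G, 0.5 * np.array([u @ u, v @ v]))
    return x + a * u + b * v

def crm_step(x, reflect_A, reflect_B):
    """One CRM iteration for two sets: circumcenter of x, R_A x, R_B R_A x."""
    y = reflect_A(x)
    return circumcenter(x, y, reflect_B(y))

def reflector(n):
    """Reflection across the hyperplane {x : n . x = 0}."""
    n = np.asarray(n, dtype=float)
    n = n / np.linalg.norm(n)
    return lambda x: x - 2.0 * (n @ x) * n

# two lines through the origin in R^2; their intersection is {0}, and for
# two hyperplanes one CRM step already reaches the intersection
RA, RB = reflector([1.0, 0.0]), reflector([1.0, 1.0])
print(crm_step(np.array([3.0, 2.0]), RA, RB))   # -> [0. 0.]
```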

VA & Opt Webinar: Russell Luke

Title: Inconsistent Stochastic Feasibility: the Case of Stochastic Tomography

Speaker: Russell Luke (University of Göttingen)

Date and Time: April 7th, 2021, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: In an X-FEL experiment, high-energy x-ray pulses are shot at high repetition rates onto a stream of identical single biomolecules, and the scattered photons are recorded on a pixelized detector. These experiments provide a new and unique route to macromolecular structure determination at room temperature, without the need for crystallization, and at low material usage. The main challenges in these experiments are the extremely low signal-to-noise ratio, due to the very low expected photon count per scattering image (10-50), and the unknown orientation of the molecules in each scattering image.

Mathematically, this is a stochastic computed tomography problem where the goal is to reconstruct a three-dimensional object from noisy two-dimensional images of a nonlinear mapping whose orientation relative to the object is both random and unobservable. The idea is to develop a two-step procedure for solving this problem. In the first step, we numerically compute a probability distribution associated with the observed patterns (taken together) as the stationary measure of a Markov chain whose generator is constructed from the individual observations. Correlation in the data and other a priori information are used to further constrain the problem and accelerate convergence to a stationary measure. With the stationary measure in hand, the second step involves solving a phase retrieval problem for the mean electron density relative to a fixed reference orientation.
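
As a toy illustration of the first step (computing a stationary measure), here is a plain power iteration on a small transition matrix in Python; the actual construction of the chain from scattering data is far more involved and is not attempted here.

```python
import numpy as np

def stationary_measure(P, tol=1e-12, max_iter=10000):
    """Power iteration for the stationary measure of a row-stochastic
    Markov transition matrix P: iterate pi <- pi P until convergence."""
    n = P.shape[0]
    pi = np.full(n, 1.0 / n)
    for _ in range(max_iter):
        nxt = pi @ P
        if np.linalg.norm(nxt - pi, 1) < tol:
            break
        pi = nxt
    return pi

# toy 3-state chain
P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])
print(stationary_measure(P))   # -> [0.25, 0.5, 0.25]
```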

The focus of this talk is conceptual: it involves re-envisioning projection algorithms as Markov chains. Along the way, we present some new routes to “old” results, as well as a fundamentally new approach to understanding and accounting for numerical computation on conventional computers.

VA & Opt Webinar: Yboon Garcia Ramos

Title: Characterizing quasiconvexity of the pointwise infimum of a family of arbitrary translations of quasiconvex functions

Speaker: Yboon Garcia Ramos (Universidad del Pacífico)

Date and Time: March 31st, 2021, 11:00 AEDT (Register here for remote connection via Zoom)

Abstract: In this talk we will present some results concerning the problem of preserving quasiconvexity when summing up quasiconvex functions, and we will relate it to the problem of preserving quasiconvexity when taking the infimum of a family of quasiconvex functions. To develop our study, we introduce the notion of a quasiconvex family and establish various characterizations of this concept.

Joint work with Fabián Flores (Universidad de Concepción) and Nicolas Hadjisavvas (University of the Aegean).
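
A quick numerical reminder of why such preservation questions are delicate: quasiconvexity is not preserved under sums. The Python check below uses a standard textbook example of my own choosing, not one from the talk.

```python
# f(x) = x**3 and g(x) = -x are both quasiconvex on R (each is monotone),
# but their sum s(x) = x**3 - x is not: its sublevel set {x : s(x) <= 0}
# equals (-inf, -1] U [0, 1], which is not convex.
s = lambda x: x**3 - x
print(s(-1.0) <= 0, s(0.0) <= 0, s(-0.5) <= 0)   # True True False
# -1 and 0 lie in the 0-sublevel set but their midpoint -0.5 does not,
# so the sublevel set is not convex and s is not quasiconvex
```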

VA & Opt Webinar: Huynh Van Ngai

Title: Generalized Nesterov’s accelerated proximal gradient algorithms with convergence rate of order o(1/k^2)

Speaker: Huynh Van Ngai (University of Quy Nhon)

Date and Time: March 24th, 2021, 17:00 AEDT (Register here for remote connection via Zoom)

Abstract: The accelerated gradient method initiated by Nesterov is now recognized as one of the most powerful tools for solving smooth convex optimization problems. It improves the convergence rate of the function values from the O(1/k) of the standard gradient method to O(1/k^2). In this talk, we present two generalized variants of Nesterov’s accelerated proximal gradient method for solving composite convex optimization problems, in which the objective function is the sum of a smooth convex function and a nonsmooth convex part. We show that, with suitable choices of the parameter sequences, the convergence rate for the function values of the proposed method is actually of order o(1/k^2). In particular, when the objective function is p-uniformly convex for p > 2, the convergence rate is of order O(\ln k / k^{2p/(p-2)}), and the convergence is linear if the objective function is strongly convex. As a by-product, we derive a forward-backward algorithm generalizing the one by Attouch-Peypouquet [SIAM J. Optim., 26(3), 1824-1834, (2016)], which produces a sequence whose function values converge at rate o(1/k^2).
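
To make the inertial sequence concrete, here is a short Python sketch of an accelerated proximal gradient step with the (k-1)/(k+alpha-1) inertia of Attouch-Peypouquet, alpha > 3; this is a sketch of that baseline scheme, not the generalized variants analyzed in the talk, and the toy problem is my own.

```python
import numpy as np

def accel_prox_grad(grad_f, prox_g, x0, step, alpha=4.0, n_iter=500):
    """Accelerated proximal gradient with inertia (k-1)/(k+alpha-1);
    alpha > 3 is the Attouch-Peypouquet regime that yields o(1/k^2)
    rates on the function values in the convex case."""
    x = np.asarray(x0, dtype=float)
    x_prev = x.copy()
    for k in range(1, n_iter + 1):
        y = x + (k - 1) / (k + alpha - 1) * (x - x_prev)   # extrapolation
        x_prev = x
        x = prox_g(y - step * grad_f(y), step)   # forward-backward step at y
    return x

# toy usage: min 0.5*||x - b||^2 + mu*||x||_1
b, mu = np.array([3.0, -0.2, 1.5]), 1.0
soft = lambda z, t: np.sign(z) * np.maximum(np.abs(z) - mu * t, 0.0)
print(accel_prox_grad(lambda x: x - b, soft, np.zeros(3), step=0.5))
# -> approximately [2.0, 0.0, 0.5]
```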
