VA & Opt Webinar: Lyudmila Polyakova

Title: Smooth approximations of D.C. functions

Speaker: Lyudmila Polyakova (Saint-Petersburg State University)

Date and Time: May 5th, 2021, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: The investigation of properties of differences of convex functions rests on the basic facts and theorems of convex analysis, since the class of convex functions is among the most thoroughly studied classes of nonsmooth functions. For an arbitrary convex function, a family of continuously differentiable approximations is constructed using the infimal convolution operation. If the domain of the function is compact, these smooth convex approximations are uniform in the Chebyshev metric. Using this technique, a smooth approximation of a d.c. function is constructed, and the optimization properties of these approximations are studied.
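
The infimal convolution behind this construction is, in its simplest form, the Moreau envelope e_t f(x) = inf_y { f(y) + ‖x − y‖²/(2t) }. As a minimal numerical sketch of the idea (an illustration, not the speaker's specific construction), the following Python snippet approximates the Moreau envelope of f(x) = |x| by a grid search and compares it with its known closed form, the Huber function:

```python
import numpy as np

def moreau_envelope(f, t, x, grid):
    """Approximate e_t f(x) = inf_y { f(y) + (x - y)^2 / (2t) }
    by minimizing over a finite grid of candidate points y."""
    return np.min(f(grid) + (x - grid) ** 2 / (2.0 * t))

f = np.abs                            # convex but nonsmooth at 0
t = 0.5                               # smoothing parameter
grid = np.linspace(-3.0, 3.0, 20001)  # candidate minimizers y

for x in (-1.0, -0.2, 0.0, 0.2, 1.0):
    approx = moreau_envelope(f, t, x, grid)
    # Closed form for f = |.|: the Huber function
    exact = x ** 2 / (2.0 * t) if abs(x) <= t else abs(x) - t / 2.0
    print(f"x = {x:+.1f}   envelope ~ {approx:.4f}   Huber = {exact:.4f}")
```

Since f = |·| is 1-Lipschitz, the envelope differs from f by at most t/2 everywhere, which illustrates the uniform (Chebyshev-metric) convergence mentioned in the abstract as t decreases to 0.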

VA & Opt Webinar: Jiri Outrata

Title: On the solution of static contact problems with Coulomb friction via the semismooth* Newton method

Speaker: Jiri Outrata (Institute of Information Theory and Automation of the Czech Academy of Sciences)

Date and Time: April 28th, 2021, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: The lecture deals with the application of a new Newton-type method to the numerical solution of discrete 3D contact problems with Coulomb friction. The method is well suited to the solution of inclusions, and the resulting conceptual algorithm exhibits, under appropriate conditions, local superlinear convergence. After a description of the method, a new model for the considered contact problem, amenable to the application of the new method, will be presented. The second part of the talk is devoted to an efficient implementation of the general algorithm and to numerical tests. Throughout the lecture, various tools of modern variational analysis will be employed.
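
The semismooth* Newton method of the talk is a recent development; purely as background (a textbook-style sketch of the classical semismooth Newton idea, not the lecture's algorithm), here is the method applied to a linear complementarity problem via the Fischer-Burmeister reformulation:

```python
import numpy as np

def fb(a, b):
    """Fischer-Burmeister function: fb(a, b) = 0 iff a >= 0, b >= 0, a*b = 0."""
    return np.sqrt(a ** 2 + b ** 2) - a - b

def semismooth_newton_lcp(M, q, x, tol=1e-12, max_iter=50):
    """Solve the LCP  x >= 0, Mx + q >= 0, x'(Mx + q) = 0  by a semismooth
    Newton method on the equation Phi(x) = fb(x, Mx + q) = 0 (componentwise)."""
    for k in range(max_iter):
        y = M @ x + q
        Phi = fb(x, y)
        if np.linalg.norm(Phi) < tol:
            return x, k
        r = np.maximum(np.sqrt(x ** 2 + y ** 2), 1e-14)
        # One element of the Clarke generalized Jacobian of Phi at x:
        da, db = x / r - 1.0, y / r - 1.0
        J = np.diag(da) + np.diag(db) @ M
        x = x - np.linalg.solve(J, Phi)
    return x, max_iter

M = np.array([[2.0, 1.0], [1.0, 2.0]])   # illustrative P-matrix
q = np.array([-1.0, 1.0])
x, iters = semismooth_newton_lcp(M, q, np.ones(2))
print(x, iters)   # converges to [0.5, 0.0] in a handful of iterations
```

Each step uses a single element of the Clarke generalized Jacobian and, under suitable regularity, converges locally superlinearly, which is the behaviour the abstract attributes to the semismooth* variant as well.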

VA & Opt Webinar: Ewa Bednarczuk

Title: On duality for nonconvex minimization problems within the framework of abstract convexity

Speaker: Ewa Bednarczuk (Warsaw University of Technology and Systems Research Institute of the PAS)

Date and Time: April 21st, 2021, 17:00 AEST (Register here for remote connection via Zoom)

VA & Opt Webinar: Roger Behling

Title: Circumcentering projection type methods

Speaker: Roger Behling (Fundação Getúlio Vargas)

Date and Time: April 14th, 2021, 11:00 AEST (Register here for remote connection via Zoom)

Abstract: Enforcing successive projections, averaging compositions of reflections, and barycentering projections are well-established techniques for solving convex feasibility problems. These schemes are known as the method of alternating projections (MAP), the Douglas-Rachford method (DRM), and the Cimmino method (CimM), respectively. Recently, we have developed the circumcentered-reflection method (CRM), whose iterations employ generalized circumcenters that accelerate the aforementioned classical approaches both theoretically and numerically. In this talk, the main results on CRM are presented, and a glimpse of future work is provided as well.
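
As an illustrative sketch (with made-up data, not from the talk), the following Python code computes the circumcenter of the three points x, R_A x, R_B R_A x used by CRM and applies one step to a two-hyperplane feasibility problem:

```python
import numpy as np

def reflect(x, a, b):
    """Reflection of x through the hyperplane {z : <a, z> = b}."""
    return x - 2.0 * (a @ x - b) / (a @ a) * a

def circumcenter(x, y, z):
    """Circumcenter of three affinely independent points: the point of
    aff{x, y, z} equidistant from x, y and z."""
    u, v = y - x, z - x
    G = 2.0 * np.array([[u @ u, u @ v], [u @ v, v @ v]])
    rhs = np.array([u @ u, v @ v])
    alpha, beta = np.linalg.solve(G, rhs)
    return x + alpha * u + beta * v

# Two hyperplanes A, B in R^3 (illustrative data)
aA, bA = np.array([1.0, 2.0, 0.0]), 1.0
aB, bB = np.array([0.0, 1.0, -1.0]), 2.0

x0 = np.array([5.0, -3.0, 2.0])
rA = reflect(x0, aA, bA)
x1 = circumcenter(x0, rA, reflect(rA, aB, bB))
print("residuals:", aA @ x1 - bA, aB @ x1 - bB)  # both ~0: x1 lies in A and B
```

For two hyperplanes a single circumcenter step already lands in the intersection: the points equidistant from x and R_A x form exactly A, and those equidistant from R_A x and R_B R_A x form exactly B. The acceleration results of the talk concern far more general settings.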

VA & Opt Webinar: Russell Luke

Title: Inconsistent Stochastic Feasibility: the Case of Stochastic Tomography

Speaker: Russell Luke (University of Göttingen)

Date and Time: April 7th, 2021, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: In an X-FEL experiment, high-energy x-ray pulses are shot at high repetition rates at a stream of identical single biomolecules, and the scattered photons are recorded on a pixelized detector. These experiments provide a new and unique route to macromolecular structure determination at room temperature, without the need for crystallization, and at low material usage. The main challenges in these experiments are the extremely low signal-to-noise ratio due to the very low expected photon count per scattering image (10-50) and the unknown orientation of the molecules in each scattering image.

Mathematically, this is a stochastic computed tomography problem where the goal is to reconstruct a three-dimensional object from noisy two-dimensional images, obtained through a nonlinear mapping whose orientation relative to the object is both random and unobservable. The idea is to develop a two-step procedure for solving this problem. In the first step, we numerically compute a probability distribution associated with the observed patterns (taken together) as the stationary measure of a Markov chain whose generator is constructed from the individual observations. Correlation in the data and other a priori information are used to further constrain the problem and accelerate convergence to a stationary measure. With the stationary measure in hand, the second step involves solving a phase retrieval problem for the mean electron density relative to a fixed reference orientation.
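
As a generic illustration of the first step (standard numerics, not the authors' estimator), a stationary measure π = πP of an ergodic chain with row-stochastic transition matrix P can be computed by power iteration on the distribution:

```python
import numpy as np

def stationary_distribution(P, tol=1e-12, max_iter=10_000):
    """Stationary measure pi of a row-stochastic matrix P, i.e. pi = pi P,
    computed by power iteration (assumes the chain is ergodic)."""
    pi = np.full(P.shape[0], 1.0 / P.shape[0])   # start from uniform
    for _ in range(max_iter):
        nxt = pi @ P
        if np.linalg.norm(nxt - pi, 1) < tol:
            return nxt
        pi = nxt
    return pi

# Toy 3-state chain (illustrative data; in the application the chain is
# built from the observed scattering patterns)
P = np.array([[0.5, 0.4, 0.1],
              [0.2, 0.6, 0.2],
              [0.1, 0.3, 0.6]])
pi = stationary_distribution(P)
print(pi, pi @ P - pi)   # pi P = pi up to tolerance
```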

The focus of this talk is conceptual and involves re-envisioning projection algorithms as Markov chains. We present some new routes to “old” results, as well as a fundamentally new approach to understanding and accounting for numerical computation on conventional computers.

VA & Opt Webinar: Yboon Garcia Ramos

Title: Characterizing quasiconvexity of the pointwise infimum of a family of arbitrary translations of quasiconvex functions

Speaker: Yboon Garcia Ramos (Universidad del Pacífico)

Date and Time: March 31st, 2021, 11:00 AEDT (Register here for remote connection via Zoom)

Abstract: In this talk we will present some results concerning the problem of preserving quasiconvexity when summing quasiconvex functions, and we will relate it to the problem of preserving quasiconvexity when taking the pointwise infimum of a family of quasiconvex functions. To develop our study, the notion of a quasiconvex family is introduced, and various characterizations of this concept are established.

Joint work with Fabián Flores (Universidad de Concepción) and Nicolas Hadjisavvas (University of the Aegean).

VA & Opt Webinar: Huynh Van Ngai

Title: Generalized Nesterov’s accelerated proximal gradient algorithms with convergence rate of order o(1/k²)

Speaker: Huynh Van Ngai (University of Quy Nhon)

Date and Time: March 24th, 2021, 17:00 AEDT (Register here for remote connection via Zoom)

Abstract: The accelerated gradient method introduced by Nesterov is now recognized as one of the most powerful tools for solving smooth convex optimization problems. It improves the convergence rate of function values from the O(1/k) of the standard gradient method to O(1/k²). In this paper, we present two generalized variants of Nesterov’s accelerated proximal gradient method for solving composite convex optimization problems, in which the objective function is the sum of a smooth convex function and a nonsmooth convex part. We show that, with suitable choices of the parameter sequences, the convergence rate of the function values for the proposed method is in fact of order o(1/k²). In particular, when the objective function is p-uniformly convex for p > 2, the convergence rate is of order O(ln k / k^{2p/(p-2)}), and the convergence is linear when the objective function is strongly convex. As a by-product, we derive a forward-backward algorithm generalizing that of Attouch and Peypouquet [SIAM J. Optim., 26(3), 1824-1834, (2016)], which produces a convergent sequence whose function values converge at rate o(1/k²).
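
For context, here is a standard inertial forward-backward sketch with a momentum sequence of the type studied by Attouch and Peypouquet, β_k = (k − 1)/(k + α − 1) with α > 3, applied to an illustrative lasso problem (the data and parameter choices are assumptions of this sketch, not the paper's generalized algorithms):

```python
import numpy as np

def soft_threshold(v, lam):
    """Proximal operator of lam * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def inertial_proximal_gradient(A, b, lam, alpha=4.0, iters=500):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by forward-backward splitting
    with momentum beta_k = (k - 1)/(k + alpha - 1); alpha > 3 is the regime
    in which the o(1/k^2) rate for function values holds."""
    L = np.linalg.norm(A, 2) ** 2   # Lipschitz constant of the smooth gradient
    x = x_prev = np.zeros(A.shape[1])
    for k in range(1, iters + 1):
        beta = (k - 1.0) / (k + alpha - 1.0)
        y = x + beta * (x - x_prev)                   # inertial step
        grad = A.T @ (A @ y - b)                      # forward (gradient) step
        x_prev, x = x, soft_threshold(y - grad / L, lam / L)  # backward step
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0
b = A @ x_true
x = inertial_proximal_gradient(A, b, lam=0.1)
print("recovered support:", np.flatnonzero(np.abs(x) > 1e-3))
```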

VA & Opt Webinar: David Bartl

Title: Every compact convex subset of matrices is the Clarke Jacobian of some Lipschitzian mapping

Speaker: David Bartl (Silesian University in Opava)

Date and Time: March 17th, 2021, 17:00 AEDT (Register here for remote connection via Zoom)

Abstract: Given a non-empty compact convex subset P of m×n matrices, we show constructively that there exists a Lipschitzian mapping g : Rⁿ → Rᵐ such that its Clarke Jacobian satisfies ∂g(0) = P.
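
A hedged one-dimensional illustration of the statement (an example chosen here, not taken from the talk): for m = n = 1 and P = [α, β], the Lipschitzian function g(x) = max(αx, βx) realizes P as its Clarke Jacobian (here, Clarke subdifferential) at 0:

```latex
% m = n = 1, \; P = [\alpha, \beta] \subset \mathbb{R}^{1 \times 1}, \; \alpha \le \beta
g(x) = \max(\alpha x, \beta x), \qquad
g'(x) = \begin{cases} \alpha, & x < 0,\\ \beta, & x > 0, \end{cases} \qquad
\partial g(0) = \operatorname{conv}\{\alpha, \beta\} = [\alpha, \beta] = P.
```

The content of the talk is the constructive extension of this picture to arbitrary non-empty compact convex P in higher dimensions.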

VA & Opt Webinar: Alexander Kruger

Title: Error bounds revisited

Speaker: Alexander Kruger (Federation University Australia)

Date and Time: March 10th, 2021, 17:00 AEDT (Register here for remote connection via Zoom)

Abstract: We propose a unifying general framework of quantitative primal and dual sufficient error bound conditions covering linear and nonlinear, local and global settings. We expose the roles of the assumptions involved in the error bound assertions, in particular those on the underlying space: general metric, Banach, or Asplund. Employing special collections of slope operators, we introduce a succinct form of sufficient error bound conditions, which allows one to combine in a single statement several different assertions: nonlocal and local primal space conditions in complete metric spaces, and subdifferential conditions in Banach and Asplund spaces. In the nonlinear setting, we cover both the conventional and the ‘alternative’ error bound conditions.

This is joint work with Nguyen Duy Cuong (Federation University). The talk is based on the paper: N. D. Cuong and A. Y. Kruger, Error bounds revisited, arXiv:2012.03941 (2020).
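
For orientation, the conventional local linear error bound condition that this framework generalizes can be stated as follows (standard textbook form; the conditions of the paper are considerably more general):

```latex
% f : X \to \mathbb{R} \cup \{+\infty\} on a metric space, \; S := \{x : f(x) \le 0\}, \; \bar{x} \in S.
% f admits a local (linear) error bound at \bar{x} if there exist c > 0 and a
% neighbourhood U of \bar{x} such that
d(x, S) \le c \, [f(x)]_+ \quad \text{for all } x \in U,
\qquad [t]_+ := \max(t, 0).
```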

VA & Opt Webinar: Javier Peña

Title: The condition number of a function relative to a set

Speaker: Javier Peña (Carnegie Mellon University)

Date and Time: March 3rd, 2021, 11:00 AEDT (Register here for remote connection via Zoom)

Abstract: The condition number of a differentiable convex function, namely the ratio of its smoothness to strong convexity constants, is closely tied to fundamental properties of the function. In particular, the condition number of a quadratic convex function is the square of the aspect ratio of a canonical ellipsoid associated to the function. Furthermore, the condition number of a function bounds the linear rate of convergence of the gradient descent algorithm for unconstrained convex minimization.
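
A quick numerical check of that last claim (an illustrative quadratic chosen here, not from the talk): for f(x) = ½xᵀAx, gradient descent with step size 1/L contracts the error linearly at a rate governed by κ = L/μ.

```python
import numpy as np

# f(x) = 0.5 * x' A x with eigenvalues mu = 1 and L = 100, so kappa = 100
A = np.diag([1.0, 100.0])
mu, L = 1.0, 100.0

x = np.array([1.0, 1.0])
for k in range(200):
    x = x - (1.0 / L) * (A @ x)   # gradient step with step size 1/L

# The slow eigendirection decays by the factor (1 - mu/L) = 1 - 1/kappa
# per step, so after 200 steps the error matches (1 - 1/kappa)^200.
print(np.linalg.norm(x), (1.0 - mu / L) ** 200)
```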

We propose a condition number of a differentiable convex function relative to a reference set and distance function pair. This relative condition number is defined as the ratio of a relative smoothness constant to a relative strong convexity constant. We show that the relative condition number extends the main properties of the traditional condition number, both in terms of its geometric insight and in terms of its role in characterizing the linear convergence of first-order methods for constrained convex minimization.

This is joint work with David H. Gutman at Texas Tech University.
