VA & Opt Webinar: Scott Lindstrom

Title: A primal/dual computable approach to improving spiraling algorithms, based on minimizing spherical surrogates for Lyapunov functions

Speaker: Scott Lindstrom (Curtin University)

Date and Time: June 2nd, 2021, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: Optimization problems are frequently tackled by iterative application of an operator whose fixed points allow for fast recovery of locally optimal solutions. Under lightweight assumptions, stability is equivalent to the existence of a function—called a Lyapunov function—that encodes structural information about both the problem and the operator. Lyapunov functions are usually hard to find, but if a practitioner had a priori knowledge—or a reasonable guess—about the structure of one, they could equivalently tackle the problem by seeking to minimize the Lyapunov function directly. We introduce a class of methods that does exactly this. Interestingly, for certain feasibility problems, the circumcentered-reflection method (CRM) turns out to be an existing member of this class. However, CRM may not lend itself well to primal/dual adaptation, for reasons we show. Motivated by the discovery of this new class, we experimentally demonstrate the success of one of its other members, implemented in a primal/dual framework.

VA & Opt Webinar: Guoyin Li

Title: Proximal methods for nonsmooth and nonconvex fractional programs: when sparse optimization meets fractional programs

Speaker: Guoyin Li (UNSW)

Date and Time: May 26th, 2021, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: Nonsmooth and nonconvex fractional programs are ubiquitous and highly challenging. This class includes the composite optimization problems studied extensively in recent years, and encompasses many important modern optimization problems arising from diverse areas, such as the recently proposed scale-invariant sparse signal reconstruction problem in signal processing, robust Sharpe ratio optimization problems in finance, and the sparse generalized eigenvalue problem in discriminant analysis. In this talk, we will introduce extrapolated proximal methods for solving nonsmooth and nonconvex fractional programs and analyse their convergence behaviour. Interestingly, we will show that the proposed algorithm exhibits linear convergence for the sparse generalized eigenvalue problem with either cardinality regularization or sparsity constraints. This is achieved by identifying the explicit desingularization function of the Kurdyka-Łojasiewicz inequality for the merit function of the fractional optimization models. Finally, if time permits, we will present some preliminary and encouraging numerical results for the proposed methods on sparse signal reconstruction and sparse Fisher discriminant analysis.
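
For readers less familiar with the "extrapolated proximal" building block mentioned above, the Python/numpy sketch below shows a generic inertial (extrapolated) forward-backward step for a composite problem min f(x) + h(x). The step size `step`, the extrapolation weight `beta`, and the LASSO-style example are illustrative choices only; the algorithms of the talk additionally handle the fractional (ratio) structure and nonconvexity.

```python
import numpy as np

def extrapolated_prox_grad(grad_f, prox_h, x0, step=0.1, beta=0.5, max_iter=300):
    """Generic extrapolated proximal (forward-backward) iteration for
    min f(x) + h(x): an inertial step followed by a proximal-gradient step.
    Illustrative only; not the fractional-program method from the talk."""
    x_prev = np.asarray(x0, dtype=float)
    x = x_prev.copy()
    for _ in range(max_iter):
        y = x + beta * (x - x_prev)                               # extrapolation (inertia)
        x_prev, x = x, prox_h(y - step * grad_f(y), step)         # forward-backward step
    return x

# Hypothetical example: min 0.5*||A x - b||^2 + lam*||x||_1 (soft-thresholding prox)
A, b, lam = np.eye(3), np.array([1.0, -0.2, 0.05]), 0.3
grad_f = lambda x: A.T @ (A @ x - b)
prox_h = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - lam * t, 0.0)
print(extrapolated_prox_grad(grad_f, prox_h, np.zeros(3)))        # approx (0.7, 0, 0)
```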

Postdoctoral Fellow – Mathematical Optimization

The Postdoctoral Fellow will undertake collaborative and self-directed research on an ARC-funded Discovery Project titled “Data-driven multistage robust optimization”. The primary research goals are to make a major contribution to the understanding of optimization in the face of data uncertainty; to develop mathematical principles for broad classes of multi-stage robust optimization problems; to design associated data-driven numerical methods for solving these problems; and to provide an advanced optimization framework for solving a wide range of real-life models of multi-stage technical decision-making under uncertainty.

For more information, please refer to

https://external-careers.jobs.unsw.edu.au/cw/en/job/501990/postdoctoral-fellow-mathematical-optimization

VA & Opt Webinar: Yura Malitsky

Title: Adaptive gradient descent without descent

Speaker: Yura Malitsky (Linköping University)

Date and Time: May 19th, 2021, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: In this talk I will present some recent results for the most classical optimization method — gradient descent. We will show that a simple, zero-cost step-size rule is sufficient to completely automate gradient descent. The method adapts to the local geometry, with convergence guarantees depending only on the smoothness in a neighborhood of a solution. The presentation is based on joint work with K. Mishchenko; see arxiv.org/abs/1910.09529.
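
As background, here is a minimal Python/numpy reconstruction of the adaptive step-size rule reported in arXiv:1910.09529: each step size is capped by a local estimate of the inverse Lipschitz constant of the gradient and by a controlled growth factor. Parameter names, defaults, and the stopping test are our own choices, so treat this as an illustration rather than a reference implementation.

```python
import numpy as np

def adgd(grad_f, x0, lam0=1e-6, max_iter=1000, tol=1e-8):
    """Adaptive gradient descent: the step size is estimated on the fly from
    the two most recent iterates and gradients (no line search, no global
    Lipschitz constant). Sketch based on arXiv:1910.09529."""
    x_prev, g_prev = np.asarray(x0, dtype=float), grad_f(x0)
    lam_prev, theta = lam0, np.inf
    x = x_prev - lam_prev * g_prev                           # first plain gradient step
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) <= tol:
            break
        den = np.linalg.norm(g - g_prev)
        # local estimate of 1/(2L), with L a local Lipschitz constant of grad f
        lam = 0.5 * np.linalg.norm(x - x_prev) / den if den > 0 else lam_prev
        if np.isfinite(theta):
            lam = min(lam, np.sqrt(1.0 + theta) * lam_prev)  # limit step growth
        x_prev, g_prev = x, g
        x = x - lam * g                                      # gradient step, adaptive size
        theta, lam_prev = lam / lam_prev, lam
    return x

# Hypothetical usage on a simple ill-conditioned quadratic
A = np.diag([1.0, 10.0, 100.0])
print(adgd(lambda x: A @ x, np.array([1.0, 1.0, 1.0])))
```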

VA & Opt Webinar: Hung Phan

Title: Adaptive splitting algorithms for the sum of operators

Speaker: Hung Phan (University of Massachusetts Lowell)

Date and Time: May 12th, 2021, 11:00 AEST (Register here for remote connection via Zoom)

Abstract: A general optimization problem can often be reduced to finding a zero of a sum of multiple (maximally) monotone operators, which creates challenging computational tasks as a whole. This motivates the development of splitting algorithms, which simplify the computations by dealing with each operator separately, hence the name “splitting”. Some of the most successful splitting algorithms in applications are the forward-backward algorithm, the Douglas-Rachford algorithm, and the alternating direction method of multipliers (ADMM). In this talk, we discuss some adaptive splitting algorithms for finding a zero of the sum of operators. The main idea is to adapt the algorithm parameters to the generalized monotonicity of the operators so that the generated sequence converges to a fixed point.
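
For context, the following Python sketch spells out the classical (non-adaptive) Douglas-Rachford splitting iteration, written with resolvent/proximal oracles `prox_f` and `prox_g`. The adaptive parameter choices for generalized monotone operators discussed in the talk are not reproduced here, and the two-set feasibility example at the end is purely illustrative.

```python
import numpy as np

def douglas_rachford(prox_f, prox_g, z0, gamma=1.0, max_iter=500):
    """Classical Douglas-Rachford splitting for a zero of the sum of two
    maximally monotone operators, stated via their resolvents (prox oracles)."""
    z = np.asarray(z0, dtype=float)
    for _ in range(max_iter):
        x = prox_f(z, gamma)             # resolvent of the first operator
        y = prox_g(2 * x - z, gamma)     # resolvent of the second, at the reflected point
        z = z + (y - x)                  # governing-sequence update
    return prox_f(z, gamma)              # shadow sequence approaches a solution

# Hypothetical example: find a point in [0,1]^2 intersected with {x : x1 + 2*x2 = 3}
a, b = np.array([1.0, 2.0]), 3.0
proj_box = lambda v, g: np.clip(v, 0.0, 1.0)
proj_hyp = lambda v, g: v - (a @ v - b) / (a @ a) * a
print(douglas_rachford(proj_box, proj_hyp, np.zeros(2)))   # approx (1, 1)
```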

VA & Opt Webinar: Lyudmila Polyakova

Title: Smooth approximations of D.C. functions

Speaker: Lyudmila Polyakova (Saint-Petersburg State University)

Date and Time: May 5th, 2021, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: The investigation of the properties of differences of convex (d.c.) functions rests on the basic facts and theorems of convex analysis, since the class of convex functions is one of the most thoroughly studied classes of nonsmooth functions. For an arbitrary convex function, a family of continuously differentiable approximations is constructed using the infimal convolution operation. If the domain of the function under consideration is compact, then these smooth convex approximations are uniform in the Chebyshev metric. Using this technique, a smooth approximation is constructed for d.c. functions, and the optimization properties of these approximations are studied.
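
A standard instance of this infimal-convolution smoothing is the Moreau envelope, the infimal convolution of a convex function with a scaled squared norm. The short Python snippet below illustrates it for the absolute-value function, whose envelope is the Huber function and approximates |x| uniformly to within mu/2; the precise construction used in the talk for d.c. functions may differ, so treat this only as an illustration of the operation.

```python
import numpy as np

# Moreau envelope: e_mu(f)(x) = inf_y { f(y) + ||x - y||^2 / (2*mu) },
# a continuously differentiable approximation of a convex f. For f(x) = |x|
# the envelope has the closed (Huber) form below, and
# sup_x | e_mu(f)(x) - |x| | = mu/2, i.e. the approximation is uniform.
def moreau_envelope_abs(x, mu):
    return np.where(np.abs(x) <= mu, x**2 / (2 * mu), np.abs(x) - mu / 2)

xs = np.linspace(-2.0, 2.0, 5)
print(moreau_envelope_abs(xs, mu=0.5))   # smooth surrogate values for |x|
```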

VA & Opt Webinar: Jiri Outrata

Title: On the solution of static contact problems with Coulomb friction via the semismooth* Newton method

Speaker: Jiri Outrata (Institute of Information Theory and Automation of the Czech Academy of Sciences)

Date and Time: April 28th, 2021, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: The lecture deals with the application of a new Newton-type method to the numerical solution of discrete 3D contact problems with Coulomb friction. The method is well suited to the solution of inclusions, and the resulting conceptual algorithm exhibits, under appropriate conditions, local superlinear convergence. After a description of the method, a new model for the considered contact problem, amenable to the application of the new method, will be presented. The second part of the talk is then devoted to an efficient implementation of the general algorithm and to numerical tests. Throughout the lecture, various tools of modern variational analysis will be employed.

Postdoc position and PhD position: Optimization for Machine Learning (Linköping Univ)

Postdoc position Optimization for Machine Learning (Linköping Univ)
The Department of Mathematics at Linköping University seeks applications for a Postdoc position in the area of mathematical optimization theory, with the aim of developing new techniques for machine learning. The position is for 2 years.

The postdoctoral fellow will be supervised by Dr. Oleg Burdakov and Dr. Yura Malitsky.

The application deadline is May 21, 2021.  

The full job ad: https://liu.se/en/work-at-liu/vacancies?rmpage=job&rmjob=16145&rmlang=UK

PhD student in Optimization for Machine Learning (Linköping Univ)
The Department of Mathematics at Linköping University seeks applications for a PhD student position in the area of mathematical optimization theory, with the aim of developing new techniques for machine learning. The position is for 4 years without teaching or 5 years with teaching.

The doctoral student will be supervised by Dr. Oleg Burdakov and Dr. Yura Malitsky.

The application deadline is May 7, 2021. 

VA & Opt Webinar: Ewa Bednarczuk

Title: On duality for nonconvex minimization problems within the framework of abstract convexity

Speaker: Ewa Bednarczuk (Warsaw University of Technology and Systems Research Institute of the PAS)

Date and Time: April 21st, 2021, 17:00 AEST (Register here for remote connection via Zoom)

VA & Opt Webinar: Roger Behling

Title: Circumcentering projection type methods

Speaker: Roger Behling (Fundação Getúlio Vargas)

Date and Time: April 14th, 2021, 11:00 AEST (Register here for remote connection via Zoom)

Abstract: Successively enforcing projections, averaging compositions of reflections, and taking barycenters of projections are well-established techniques for solving convex feasibility problems. These schemes are known as the method of alternating projections (MAP), the Douglas-Rachford method (DRM), and the Cimmino method (CimM), respectively. Recently, we have developed the circumcentered-reflection method (CRM), whose iterations employ generalized circumcenters that are able to accelerate the aforementioned classical approaches both theoretically and numerically. In this talk, the main results on CRM are presented, and a glimpse of future work is provided as well.
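
As a concrete illustration of the circumcentering idea, the Python sketch below computes the circumcenter of three points (the point of their affine hull equidistant from all of them) and uses it in one CRM step for a two-set feasibility problem given projection oracles. The two-lines example is hypothetical and only meant to show the mechanics; the assumptions under which CRM converges and accelerates MAP and DRM are the subject of the talk.

```python
import numpy as np

def circumcenter(p0, p1, p2):
    """Point of the affine hull of p0, p1, p2 equidistant from all three
    (the least-squares solve gives a crude treatment of degenerate cases)."""
    V = np.stack([p1 - p0, p2 - p0])                    # spanning directions
    rhs = 0.5 * np.array([V[0] @ V[0], V[1] @ V[1]])
    coef, *_ = np.linalg.lstsq(V @ V.T, rhs, rcond=None)
    return p0 + coef @ V

def crm_step(x, proj_A, proj_B):
    """One circumcentered-reflection step for 'find a point in A ∩ B'."""
    rA = 2 * proj_A(x) - x                              # reflect through A
    rBA = 2 * proj_B(rA) - rA                           # then through B
    return circumcenter(x, rA, rBA)

# Hypothetical example: two lines through the origin in the plane
proj_A = lambda v: np.array([v[0], 0.0])                # x-axis
d = np.array([1.0, 1.0]) / np.sqrt(2.0)
proj_B = lambda v: (v @ d) * d                          # line y = x
x = np.array([3.0, 2.0])
for _ in range(3):
    x = crm_step(x, proj_A, proj_B)
print(x)                                                # near the intersection (origin)
```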
