VA & Opt Webinar: Nguyen Duy Cuong

Title: Necessary conditions for transversality properties

Speaker: Nguyen Duy Cuong (Federation University)

Date and Time: February 24th, 2021, 17:00 AEDT (Register here for remote connection via Zoom)

Abstract: Transversality properties of collections of sets play an important role in optimization and variational analysis, e.g., as constraint qualifications, qualification conditions in subdifferential, normal cone and coderivative calculus, and convergence analysis of computational algorithms. In this talk, we present some new results on primal (geometric, metric, slope) and dual (subdifferential, normal cone) necessary (in some cases also sufficient) conditions for transversality properties in both linear and nonlinear settings. Quantitative relations between transversality properties and the corresponding regularity properties of set-valued mappings are also discussed.

VA & Opt Webinar: Alexander J. Zaslavski

Title: Subgradient Projection Algorithm with Computational Errors

Speaker: Alexander J. Zaslavski (The Technion – Israel Institute of Technology)

Date and Time: February 17th, 2021, 17:00 AEDT (Register here for remote connection via Zoom)

Abstract: We study the subgradient projection algorithm for the minimization of convex nonsmooth functions in the presence of computational errors. We show that the algorithm generates a good approximate solution, provided the computational errors are bounded from above by a small positive constant. Moreover, for a known level of computational error, we determine what approximate solution can be obtained and how many iterations are needed to obtain it.
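As a toy illustration of the setting (not the speaker's algorithm or error model; the objective, step-size rule and error bound below are assumptions made for the sketch), a projected subgradient method whose subgradients carry a bounded perturbation can be written as:

```python
import math
import random

def projected_subgradient(f, subgrad, project, x0, n_iters, error=0.0, seed=0):
    """Projected subgradient method for a convex nonsmooth f, with each
    subgradient perturbed by a bounded computational error."""
    rng = random.Random(seed)
    x = project(x0)
    best_x, best_f = x, f(x)
    for k in range(1, n_iters + 1):
        g = subgrad(x) + rng.uniform(-error, error)  # inexact subgradient
        x = project(x - g / math.sqrt(k))            # diminishing steps 1/sqrt(k)
        if f(x) < best_f:
            best_x, best_f = x, f(x)
    return best_x, best_f

# minimise f(x) = |x - 1| over the interval [-2, 2]; the minimiser is x = 1
f = lambda x: abs(x - 1.0)
subgrad = lambda x: 1.0 if x >= 1.0 else -1.0
project = lambda x: max(-2.0, min(2.0, x))

x_best, f_best = projected_subgradient(f, subgrad, project,
                                       x0=-2.0, n_iters=2000, error=0.05)
```

With the error bound small, the best iterate ends up within a comparable distance of the true minimum, which is the qualitative behaviour the abstract describes.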

VA & Opt Webinar: Nam Ho-Nguyen

Title: Coordinate Descent Without Coordinates: Tangent Subspace Descent on Riemannian Manifolds

Speaker: Nam Ho-Nguyen (University of Sydney)

Date and Time: February 10th, 2021, 11:00 AEDT (Register here for remote connection via Zoom)

Abstract: We consider an extension of the coordinate descent algorithm to manifold domains, and provide convergence analyses for geodesically convex and non-convex smooth objective functions. Our key insight is to draw an analogy between coordinate blocks in Euclidean space and tangent subspaces of a manifold. Hence, our method is called tangent subspace descent (TSD). The core principle behind ensuring convergence of TSD is the appropriate choice of subspace at each iteration. To this end, we propose two novel conditions: the gap ensuring and C-randomized norm conditions on deterministic and randomized modes of subspace selection respectively. These ensure convergence for smooth functions, and are satisfied in practical contexts. We propose two subspace selection rules of particular practical interest that satisfy these conditions: a deterministic one for the manifold of square orthogonal matrices, and a randomized one for the more general Stiefel manifold. (This is joint work with David Huckleberry Gutman, Texas Tech University.)
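A heavily simplified sketch of the tangent-subspace idea (assuming a sphere manifold, a one-dimensional random coordinate-based subspace rule, and renormalisation as the retraction; this is not the paper's gap ensuring or C-randomized norm machinery, nor its Stiefel-manifold rules):

```python
import math
import random

def normalize(x):
    n = math.sqrt(sum(t * t for t in x))
    return [t / n for t in x]

def tsd_sphere(grad, x0, n_iters=5000, step=0.1, seed=0):
    """Randomised tangent subspace descent sketch on the unit sphere:
    each iteration descends only along the projection of a single random
    coordinate direction onto the tangent space T_x, then retracts back
    to the sphere by renormalisation."""
    rng = random.Random(seed)
    x = normalize(x0)
    n = len(x)
    for _ in range(n_iters):
        i = rng.randrange(n)           # pick a random coordinate "block"
        d = [-x[i] * t for t in x]
        d[i] += 1.0                    # d = (I - x x^T) e_i, a tangent vector
        coef = sum(g * t for g, t in zip(grad(x), d))  # gradient along d
        x = normalize([t - step * coef * u for t, u in zip(x, d)])
    return x

# Rayleigh-quotient toy problem: minimise f(x) = sum(c_j x_j^2) on the sphere;
# the minimum value is min(c) = 1, attained on the corresponding axis.
c = [3.0, 1.0, 2.0]
grad = lambda x: [2.0 * cj * xj for cj, xj in zip(c, x)]
x = tsd_sphere(grad, [1.0, 1.0, 1.0])
f_val = sum(cj * xj * xj for cj, xj in zip(c, x))
```

The analogy to Euclidean coordinate descent is visible in the update: only the component of the gradient lying in the chosen tangent subspace is ever used.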

VA & Opt Webinar: Ernest Ryu

Title: Scaled Relative Graph: Nonexpansive operators via 2D Euclidean Geometry

Speaker: Ernest Ryu (Seoul National University)

Date and Time: November 25th, 2020, 17:00 AEDT (Register here for remote connection via Zoom)

Abstract: Many iterative methods in applied mathematics can be thought of as fixed-point iterations, and such algorithms are usually analyzed analytically, with inequalities. In this work, we present a geometric approach to analyzing contractive and nonexpansive fixed point iterations with a new tool called the scaled relative graph (SRG). The SRG provides a rigorous correspondence between nonlinear operators and subsets of the 2D plane. Under this framework, a geometric argument in the 2D plane becomes a rigorous proof of contractiveness of the corresponding operator.
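The operator-to-plane correspondence can be probed numerically (a sketch: the operator, step size and sampling scheme below are invented for illustration). For pairs (x, y), the SRG point z of an operator T satisfies Re z = ⟨Tx − Ty, x − y⟩ / ‖x − y‖² and |z| = ‖Tx − Ty‖ / ‖x − y‖, so a contraction factor shows up as the sampled points staying inside a disk of radius below 1:

```python
import math
import random

def srg_samples(op, dim, n_pairs, seed=0):
    """Sample scaled-relative-graph points of an operator: for each pair
    (x, y), return (Re z, |Im z|) with |z| = |Tx-Ty|/|x-y| and
    Re z = <Tx-Ty, x-y>/|x-y|^2."""
    rng = random.Random(seed)
    pts = []
    for _ in range(n_pairs):
        x = [rng.gauss(0, 1) for _ in range(dim)]
        y = [rng.gauss(0, 1) for _ in range(dim)]
        w = [a - b for a, b in zip(x, y)]
        d = [a - b for a, b in zip(op(x), op(y))]
        nw2 = sum(t * t for t in w)
        re = sum(a * b for a, b in zip(d, w)) / nw2
        r2 = sum(t * t for t in d) / nw2
        pts.append((re, math.sqrt(max(r2 - re * re, 0.0))))
    return pts

# gradient-step operator T(x) = x - 0.1 * grad f(x), f(x) = 0.5 * sum(lam_j x_j^2);
# here T scales coordinate j by 1 - 0.1 * lam_j, so |z| <= 0.9 for every pair
lam = [1.0, 5.0, 9.0]
T = lambda x: [(1.0 - 0.1 * l) * t for l, t in zip(lam, x)]
pts = srg_samples(T, dim=3, n_pairs=500)
radius = max(math.hypot(re, im) for re, im in pts)
```

Every sampled point lies in the disk of radius 0.9 about the origin, a geometric certificate that this T is a 0.9-contraction.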

VA & Opt Webinar: Aram Arutyunov & S.E. Zhukovskiy

Title: Local and Global Inverse and Implicit Function Theorems

Speaker: Aram Arutyunov (Moscow State University) & S.E. Zhukovskiy (V. A. Trapeznikov Institute of Control Sciences of RAS)

Date and Time: November 18th, 2020, 17:00 AEDT (Register here for remote connection via Zoom)

Abstract: In the talk, we present a local inverse function theorem on a cone in a neighbourhood of abnormal point. We present a global inverse function theorem in the form of theorem on trivial bundle, guaranteeing that if a smooth mapping of finite-dimensional spaces is uniformly nonsingular, then it has a smooth right inverse satisfying a priori estimate. We also present a global implicit function theorem guaranteeing the existence and continuity of a global implicit function under the condition that the mappings in question are uniformly nonsingular. The generalization of these results to the case of mappings of Hilbert spaces and Banach spaces are discussed.

VA & Opt Webinar: Vinesha Peiris

Title: The extension of linear inequality method for generalised rational Chebyshev approximation

Speaker: Vinesha Peiris (Swinburne)

Date and Time: November 11th, 2020, 17:00 AEDT (Register here for remote connection via Zoom)

Abstract: In this talk we will demonstrate the correspondence between the linear inequality method developed for rational Chebyshev approximation and the bisection method used in quasiconvex optimisation. It naturally connects rational and generalised rational Chebyshev approximation problems with modern developments in the area of quasiconvex functions. Moreover, the linear inequality method can be extended to a broader class of Chebyshev approximation problems, where the corresponding objective functions remain quasiconvex.
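The bisection scheme from quasiconvex optimisation can be sketched on a toy problem (the objective and its feasibility test below are illustrative assumptions, not the linear inequality method itself; in the Chebyshev approximation setting the sublevel test becomes a system of linear inequalities in the numerator and denominator coefficients):

```python
def bisection_quasiconvex(sublevel_feasible, t_lo, t_hi, tol=1e-8):
    """Bisection on the optimal value t of a quasiconvex problem:
    sublevel_feasible(t) decides whether the sublevel set {x : f(x) <= t}
    meets the feasible region.  Invariant: t_lo infeasible, t_hi feasible."""
    while t_hi - t_lo > tol:
        t = 0.5 * (t_lo + t_hi)
        if sublevel_feasible(t):
            t_hi = t
        else:
            t_lo = t
    return t_hi

# toy quasiconvex objective f(x) = |x - 3| / (1 + |x - 3|), minimised over [5, 10];
# {f <= t} = {|x - 3| <= t/(1-t)} for t < 1, so feasibility is explicit,
# and the optimal value is f(5) = 2/3
def feasible(t, a=5.0, b=10.0):
    if t >= 1.0:
        return True
    r = t / (1.0 - t)
    return (3.0 - r) <= b and (3.0 + r) >= a   # does [3-r, 3+r] meet [a, b]?

t_star = bisection_quasiconvex(feasible, t_lo=0.0, t_hi=1.0)
```

Each bisection step halves the uncertainty in the optimal value, exactly as in the linear inequality method, where the halving is driven by solving one linear feasibility problem per step.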

VA & Opt Webinar: Chayne Planiden (UoW)

Title: New Gradient and Hessian Approximation Methods for Derivative-free Optimisation

Speaker: Chayne Planiden (UoW)

Date and Time: November 4th, 2020, 17:00 AEDT (Register here for remote connection via Zoom)

Abstract: In general, derivative-free optimisation (DFO) uses approximations of first- and second-order information in minimisation algorithms. DFO is found in direct-search, model-based, trust-region and other mainstream optimisation techniques and has been gaining popularity in recent years. This work discusses previous results on some particular uses of DFO: the proximal bundle method and the VU-algorithm, and then presents improvements made this year to the gradient and Hessian approximation techniques. These improvements can be inserted into any routine that requires such estimations.
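For context, the classical central-difference estimates that such DFO routines build on look like the following (this is textbook material, not the new approximation techniques of the talk):

```python
def fd_gradient(f, x, h=1e-6):
    """Central-difference gradient estimate, a standard DFO building block."""
    g = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        g.append((f(xp) - f(xm)) / (2.0 * h))
    return g

def fd_hessian(f, x, h=1e-4):
    """Central-difference Hessian estimate from function values only."""
    n = len(x)
    H = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            xpp, xpm, xmp, xmm = list(x), list(x), list(x), list(x)
            xpp[i] += h; xpp[j] += h
            xpm[i] += h; xpm[j] -= h
            xmp[i] -= h; xmp[j] += h
            xmm[i] -= h; xmm[j] -= h
            H[i][j] = (f(xpp) - f(xpm) - f(xmp) + f(xmm)) / (4.0 * h * h)
    return H

# f(x, y) = x^2 + 3xy has gradient (2x + 3y, 3x) and Hessian [[2, 3], [3, 0]]
f = lambda v: v[0] ** 2 + 3.0 * v[0] * v[1]
g = fd_gradient(f, [1.0, 2.0])   # approximately [8, 3]
H = fd_hessian(f, [1.0, 2.0])    # approximately [[2, 3], [3, 0]]
```

Each Hessian entry costs four function evaluations here; reducing that cost and improving accuracy is precisely the kind of question the abstract's improved approximation techniques address.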

VA & Opt Webinar: Radek Cibulka (University of West Bohemia)

Title: Continuous selections for inverse mappings in Banach spaces

Speaker: Radek Cibulka (University of West Bohemia)

Date and Time: October 28th, 2020, 17:00 AEDT (Register here for remote connection via Zoom)

Abstract: Influenced by recent work of A. V. Arutyunov, A. F. Izmailov, and S. E. Zhukovskiy, we establish a general Ioffe-type criterion guaranteeing the existence of a continuous and calm selection for the inverse of a single-valued uniformly continuous mapping between Banach spaces with a closed domain. We show that the general statement yields elegant proofs following the same pattern as in the case of the usual openness with a linear rate, by considering mappings instead of points. As in the case of Ioffe's criterion for linear openness around the reference point, this allows us to avoid iteration, that is, the construction of a sequence of continuous functions whose limit is the desired continuous selection for the inverse mapping; this is illustrated by the proof of the Bartle-Graves theorem. We then formulate sufficient conditions based on approximations given by positively homogeneous mappings and bunches of linear operators. The talk is based on joint work with Marián Fabian.

VA & Opt Webinar: Wilfredo Sosa (UCB)

Title: On diametrically maximal sets, maximal premonotone maps and premonotone bifunctions

Speaker: Wilfredo Sosa (UCB)

Date and Time: October 21st, 2020, 17:00 AEDT (Register here for remote connection via Zoom)

Abstract: First, we study diametrically maximal sets in Euclidean space (those not properly contained in a set with the same diameter), establishing their main properties. Then, we use these sets to exhibit an explicit family of maximal premonotone operators. We also establish some relevant properties of maximal premonotone operators, such as their local boundedness. Finally, we introduce the notion of premonotone bifunctions and present a canonical relation between premonotone operators and bifunctions that extends the well-known one holding in the monotone case.

VA & Opt Webinar: Björn Rüffer (UoN)

Title: A Lyapunov perspective to projection algorithms

Speaker: Björn Rüffer (UoN)

Date and Time: October 14th, 2020, 17:00 AEDT (Register here for remote connection via Zoom)

Abstract: The operator theoretic point of view has been very successful in the study of iterative splitting methods under a unified framework. These algorithms include the Method of Alternating Projections as well as the Douglas-Rachford Algorithm, which is dual to the Alternating Direction Method of Multipliers, and they allow nice geometric interpretations. While convergence results for these algorithms have been known for decades when the problems are convex, for non-convex problems progress on convergence results accelerated significantly once arguments based on Lyapunov functions were introduced. In this talk we give an overview of the underlying techniques in Lyapunov's direct method and look at the convergence of iterative projection methods through this lens.
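A minimal sketch of the Method of Alternating Projections together with a Lyapunov-style certificate (the two sets, the starting point, and the choice V(x) = ||x - z||^2 for a fixed z in the intersection are assumptions made for illustration; for convex sets this V is nonincreasing along the iterates by Fejér monotonicity):

```python
import math

def project_line(p):
    """Projection onto the set A = the x-axis {(x, 0)}."""
    return (p[0], 0.0)

def project_disk(p, c=(3.0, 0.5), r=1.0):
    """Projection onto the set B = closed disk of radius r centred at c."""
    dx, dy = p[0] - c[0], p[1] - c[1]
    d = math.hypot(dx, dy)
    if d <= r:
        return p
    return (c[0] + r * dx / d, c[1] + r * dy / d)

def alternating_projections(x0, n_iters=100):
    """Method of Alternating Projections between the line A and disk B."""
    x, traj = x0, [x0]
    for _ in range(n_iters):
        x = project_disk(project_line(x))
        traj.append(x)
    return traj

traj = alternating_projections((-5.0, 4.0))
z = (3.0, 0.0)                       # a point in A ∩ B
V = [(p[0] - z[0]) ** 2 + (p[1] - z[1]) ** 2 for p in traj]
```

The sequence V is monotonically nonincreasing (the Lyapunov-function viewpoint of the talk), and the iterates converge to a point of the intersection, here on the x-axis.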
