Congratulations! Vera Roshchina (UNSW) has been awarded the 2021 Christopher Heyde Medal

Dear MoCaO members,

I would like to share some exciting news: Dr. Vera Roshchina (UNSW) has been awarded the 2021 Christopher Heyde Medal:

https://www.science.org.au/supporting-science/awards-and-opportunities/honorific-awardees/2021-awardees#heyde

The award recognises outstanding research in the mathematical sciences by researchers up to 10 years post-PhD. Vera is an excellent mathematician, an active MoCaO member and a great person to work with.

Congratulations, Vera!

VA & Opt Webinar: David Bartl

Title: Every compact convex subset of matrices is the Clarke Jacobian of some Lipschitzian mapping

Speaker: David Bartl (Silesian University in Opava)

Date and Time: March 17th, 2021, 17:00 AEDT (Register here for remote connection via Zoom)

Abstract: Given a non-empty compact convex subset P of m×n matrices, we show constructively that there exists a Lipschitzian mapping g : R^n → R^m such that its Clarke Jacobian satisfies ∂g(0) = P.
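
As a one-dimensional illustration of the statement (an example of my own, not taken from the talk): for m = n = 1, the absolute value function is Lipschitzian, and its Clarke Jacobian (here, the Clarke subdifferential) at the origin is a compact convex set of 1×1 matrices:

    g(x) = |x|,   ∂g(0) = conv{−1, +1} = [−1, 1].

The theorem asserts the converse direction: every compact convex set of m×n matrices arises in this way from some Lipschitzian mapping.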

LION15: The 15th Learning and Intelligent Optimization Conference, Athens, Greece, June 20-25, 2021

Conference website: https://lion15.sba-research.org/index.html
Submission link: https://easychair.org/conferences/?conf=lion15
Extended submission deadline: March 15, 2021

The large variety of heuristic algorithms for hard optimization problems raises numerous interesting and challenging issues. Practitioners using heuristic algorithms for hard optimization problems are confronted with the burden of selecting the most appropriate method, in many cases through expensive algorithm configuration and parameter tuning. Scientists seek theoretical insights and demand a sound experimental methodology for evaluating algorithms and assessing strengths and weaknesses. This effort requires a clear separation between the algorithm and the experimenter, who, in too many cases, is “in the loop” as a motivated intelligent learning component. LION deals with designing and engineering ways of “learning” about the performance of different techniques, and ways of using past experience about the algorithm behavior to improve performance in the future. Intelligent learning schemes for mining the knowledge obtained online or offline can improve the algorithm design process and simplify the applications of high-performance optimization methods. Combinations of different algorithms can further improve the robustness and performance of the individual components.

This meeting explores the intersections and uncharted territories between machine learning, artificial intelligence, energy, mathematical programming and algorithms for hard optimization problems. The main purpose of the event is to bring together experts from these areas to discuss new ideas and methods, challenges and opportunities in various application areas, general trends and specific developments. We are excited to be bringing the LION conference to Greece for the third time.

VA & Opt Webinar: Alexander Kruger

Title: Error bounds revisited

Speaker: Alexander Kruger (Federation University Australia)

Date and Time: March 10th, 2021, 17:00 AEDT (Register here for remote connection via Zoom)

Abstract: We propose a unifying general framework of quantitative primal and dual sufficient error bound conditions covering linear and nonlinear, local and global settings. We expose the roles of the assumptions involved in the error bound assertions, in particular, on the underlying space: general metric, Banach or Asplund. Employing special collections of slope operators, we introduce a succinct form of sufficient error bound conditions, which allows one to combine in a single statement several different assertions: nonlocal and local primal space conditions in complete metric spaces, and subdifferential conditions in Banach and Asplund spaces. In the nonlinear setting, we cover both the conventional and the ‘alternative’ error bound conditions.

This is joint work with Nguyen Duy Cuong (Federation University). The talk is based on the paper: N. D. Cuong and A. Y. Kruger, Error bounds revisited, arXiv:2012.03941 (2020).
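
For orientation, the conventional local linear error bound condition referred to in the abstract can be stated as follows (standard notation from the error bounds literature, not a formula quoted from the talk): given an extended-real-valued function f on a metric space X with f(x̄) = 0, the condition requires the existence of c > 0 and δ > 0 such that

    d(x, S) ≤ c [f(x)]₊   for all x ∈ B_δ(x̄),   where S := {x ∈ X : f(x) ≤ 0} and [t]₊ := max{t, 0}.

The framework of the talk unifies primal and dual criteria guaranteeing conditions of this type, including their nonlinear analogues.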

VA & Opt Webinar: Javier Peña

Title: The condition number of a function relative to a set

Speaker: Javier Peña (Carnegie Mellon University)

Date and Time: March 3rd, 2021, 11:00 AEDT (Register here for remote connection via Zoom)

Abstract: The condition number of a differentiable convex function, namely the ratio of its smoothness to strong convexity constants, is closely tied to fundamental properties of the function. In particular, the condition number of a quadratic convex function is the square of the aspect ratio of a canonical ellipsoid associated to the function. Furthermore, the condition number of a function bounds the linear rate of convergence of the gradient descent algorithm for unconstrained convex minimization.
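
As a concrete instance of the quadratic case (a standard computation, not taken from the talk): for f(x) = ½ xᵀQx with Q symmetric positive definite, the smoothness and strong convexity constants are λ_max(Q) and λ_min(Q), so the condition number is

    κ(f) = λ_max(Q) / λ_min(Q).

The sublevel set {x : f(x) ≤ 1} is an ellipsoid with semi-axes √(2/λ_i(Q)), whose aspect ratio is √(λ_max(Q)/λ_min(Q)); hence κ(f) is the square of that aspect ratio. Gradient descent with step size 1/λ_max(Q) then contracts the distance to the minimizer by a factor of at most 1 − 1/κ(f) per iteration.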

We propose a condition number of a differentiable convex function relative to a reference set and distance function pair. This relative condition number is defined as the ratio of a relative smoothness constant to a relative strong convexity constant. We show that the relative condition number extends the main properties of the traditional condition number both in terms of its geometric insight and in terms of its role in characterizing the linear convergence of first-order methods for constrained convex minimization.

This is joint work with David H. Gutman at Texas Tech University.

VA & Opt Webinar: Nguyen Duy Cuong

Title: Necessary conditions for transversality properties

Speaker: Nguyen Duy Cuong (Federation University)

Date and Time: February 24th, 2021, 17:00 AEDT (Register here for remote connection via Zoom)

Abstract: Transversality properties of collections of sets play an important role in optimization and variational analysis, e.g., as constraint qualifications, qualification conditions in subdifferential, normal cone and coderivative calculus, and convergence analysis of computational algorithms. In this talk, we present some new results on primal (geometric, metric, slope) and dual (subdifferential, normal cone) necessary (in some cases also sufficient) conditions for transversality properties in both linear and nonlinear settings. Quantitative relations between transversality properties and the corresponding regularity properties of set-valued mappings are also discussed.
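
To fix ideas, here is one of the metric properties in question, stated in standard notation (my paraphrase of the literature, not a formula from the talk): closed sets A and B are subtransversal at a common point x̄ if there exist α > 0 and δ > 0 such that

    α d(x, A ∩ B) ≤ max{ d(x, A), d(x, B) }   for all x ∈ B_δ(x̄).

Transversality strengthens this by requiring the estimate to remain valid under small translations of the two sets.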

VA & Opt Webinar: Alexander J. Zaslavski

Title: Subgradient Projection Algorithm with Computational Errors

Speaker: Alexander J. Zaslavski (The Technion – Israel Institute of Technology)

Date and Time: February 17th, 2021, 17:00 AEDT (Register here for remote connection via Zoom)

Abstract: We study the subgradient projection algorithm for minimization of convex nonsmooth functions in the presence of computational errors. We show that the algorithm generates a good approximate solution if the computational errors are bounded from above by a small positive constant. Moreover, for a known computational error, we determine what approximate solution can be obtained and how many iterations one needs to obtain it.
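
A minimal numerical sketch of the setting (my own toy example, not the algorithm or error model analysed in the talk): projected subgradient steps for a convex nonsmooth objective, with each subgradient perturbed by an error of norm at most delta. The objective ||x||_1, the feasible unit ball and the step sizes 1/√k are all illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def project_ball(x, radius=1.0):
    # Euclidean projection onto the ball of the given radius
    n = np.linalg.norm(x)
    return x if n <= radius else x * (radius / n)

def f(x):
    # convex nonsmooth objective: the l1 norm, minimized at x = 0
    return np.abs(x).sum()

def subgradient(x):
    # one valid subgradient of the l1 norm
    return np.sign(x)

def noisy_subgradient_projection(x0, steps, delta):
    # projected subgradient method with computational errors of norm <= delta
    x, best = x0, f(x0)
    for k in range(1, steps + 1):
        e = rng.normal(size=x.shape)
        e *= delta / max(np.linalg.norm(e), 1e-12)   # error bounded by delta
        x = project_ball(x - (subgradient(x) + e) / np.sqrt(k))
        best = min(best, f(x))
    return best

x0 = rng.normal(size=5)
for delta in (0.0, 0.1, 1.0):
    print(delta, noisy_subgradient_projection(x0.copy(), 2000, delta))

With delta = 0 the best value found approaches the minimum value 0; as delta grows, the quality of the best attainable approximate solution degrades accordingly, which is the phenomenon the talk quantifies.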

Positions Available at UNSW: Applied Maths, Pure Maths & Data Science


Five positions are currently available in the School of Mathematics and Statistics, spanning Applied Mathematics, Pure Mathematics and Data Science.

Please apply for these positions via Jobs@UNSW.

For more information and the position descriptions, please refer to:

https://www.maths.unsw.edu.au/news/2021-02/positions-available

Master’s program in Economic Analysis (IDEA) at the Universitat Autònoma de Barcelona

The Master’s program in Economic Analysis (IDEA) at the Universitat Autònoma de Barcelona has scholarships available for the 2021-2022 academic year from the Severo Ochoa program to fund the most outstanding applicants.

The IDEA master’s is a two-year program with a genuinely international vocation, committed to rigorous, high-quality training. The program is taught entirely in English.

Graduates of the program are highly valued in both national and international labor markets, and a significant number of them continue their training in our Doctorate in Economic Analysis as well as in other prestigious national and international doctoral programs.

Complete information about our program is available on the Master’s degree website: http://idea.uab.es/.

VA & Opt Webinar: Nam Ho-Nguyen

Title: Coordinate Descent Without Coordinates: Tangent Subspace Descent on Riemannian Manifolds

Speaker: Nam Ho-Nguyen (University of Sydney)

Date and Time: February 10th, 2021, 11:00 AEDT (Register here for remote connection via Zoom)

Abstract: We consider an extension of the coordinate descent algorithm to manifold domains, and provide convergence analyses for geodesically convex and non-convex smooth objective functions. Our key insight is to draw an analogy between coordinate blocks in Euclidean space and tangent subspaces of a manifold. Hence, our method is called tangent subspace descent (TSD). The core principle behind ensuring convergence of TSD is the appropriate choice of subspace at each iteration. To this end, we propose two novel conditions: the gap ensuring and C-randomized norm conditions on deterministic and randomized modes of subspace selection respectively. These ensure convergence for smooth functions, and are satisfied in practical contexts. We propose two subspace selection rules of particular practical interest that satisfy these conditions: a deterministic one for the manifold of square orthogonal matrices, and a randomized one for the more general Stiefel manifold. (This is joint work with David Huckleberry Gutman, Texas Tech University.)
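
The abstract’s analogy can be made concrete on the simplest Stiefel manifold St(n, 1), the unit sphere. The sketch below (my own toy construction from the abstract, not the authors’ code) minimizes the Rayleigh quotient xᵀAx by projecting the Riemannian gradient onto a randomly drawn d-dimensional tangent subspace at each step and retracting by normalization; the objective, step size and subspace dimension are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
n, d = 50, 5                        # ambient dimension, tangent subspace dimension

A = rng.normal(size=(n, n))
A = A @ A.T / n                     # random symmetric positive semidefinite matrix

def riemannian_grad(x):
    # Riemannian gradient of f(x) = x^T A x on the unit sphere:
    # Euclidean gradient projected onto the tangent space at x
    g = 2.0 * A @ x
    return g - (g @ x) * x

def tsd_step(x, stepsize=0.05):
    # randomized subspace selection: orthonormal basis of a random
    # d-dimensional subspace of the tangent space at x
    B = rng.normal(size=(n, d))
    B -= np.outer(x, x @ B)         # make the columns tangent at x
    Q, _ = np.linalg.qr(B)
    v = Q @ (Q.T @ riemannian_grad(x))  # gradient component inside the subspace
    y = x - stepsize * v
    return y / np.linalg.norm(y)    # retraction back onto the sphere

x = rng.normal(size=n)
x /= np.linalg.norm(x)
for _ in range(20000):
    x = tsd_step(x)
print(x @ A @ x, np.linalg.eigvalsh(A)[0])  # should roughly agree

Each iteration only needs the gradient restricted to a low-dimensional subspace, which is the manifold analogue of updating one coordinate block at a time.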
