NSW ANZIAM 2021 Mid-year Conference

The New South Wales branch of ANZIAM will hold a one-day virtual conference on Friday 9th July.  A Zoom link will be sent at a later date.

The conference will run from approximately 10am to 5.30pm (depending upon the number of speakers).

This meeting features the 2020 winner of the WIMSIG Maryam Mirzakhani Award, Dr Hoa Bui (Curtin), as the invited speaker. Title: Optimisation Methods for Maintenance Scheduling in the Mining Industry.

There is no registration fee.

VA & Opt Webinar: Adil Bagirov

Title: Nonsmooth DC optimization: recent developments

Speaker: Adil Bagirov (Federation University)

Date and Time: June 23rd, 2021, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: In this talk we consider unconstrained optimization problems where the objective functions are represented as a difference of two convex (DC) functions. Various applications of DC optimization in machine learning are presented. We discuss two different approaches to design methods of nonsmooth DC optimization: an approach based on the extension of bundle methods and an approach based on the DCA (difference of convex algorithm). We also discuss numerical results obtained using these methods.
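
As a concrete illustration (a minimal sketch, not the speaker's code), the classical DCA for f = g - h alternates a subgradient step on h with a convex subproblem on g. On the toy decomposition f(x) = x^4 - 2x^2, with g(x) = x^4 and h(x) = 2x^2, the subproblem has a closed form:

import numpy as np

# Minimal DCA sketch for the toy DC function f(x) = x**4 - 2*x**2,
# written as g - h with g(x) = x**4 (convex) and h(x) = 2*x**2 (convex).
# In general the subproblem argmin_x g(x) - y*x needs a convex solver;
# here it has the closed form x = sign(y) * (|y|/4)**(1/3).
def dca(x0, iters=50):
    x = x0
    for _ in range(iters):
        y = 4.0 * x  # a subgradient of h at x, since h'(x) = 4x
        x = np.sign(y) * (abs(y) / 4.0) ** (1.0 / 3.0)  # solve the convex subproblem
    return x

print(dca(0.3))  # tends to 1.0, a global minimizer of f (f(±1) = -1)

From any nonzero starting point the iterates converge to x = 1 or x = -1; starting at 0 the method stays at the critical point 0.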

VA & Opt Webinar: Bruno F. Lourenço

Title: Error bounds, amenable cones and beyond

Speaker: Bruno F. Lourenço (Institute of Statistical Mathematics)

Date and Time: June 16th, 2021, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: In this talk we present an overview of the theory of amenable cones, facial residual functions and their applications to error bounds for conic linear systems. A feature of our results is that no constraint qualifications are ever assumed, so they are applicable even to some problems with unfavourable theoretical properties. Time allowing, we will discuss some recent findings on the geometry of amenable cones and also some extensions for non-amenable cones.
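
For readers new to the topic, a typical Hölderian error bound (an illustrative special case only; the talk's facial residual functions generalise the fixed exponent below) for the conic linear system with feasible set F = K ∩ (L + a) reads:

% Illustrative only: a Holderian error bound for F = K \cap (L + a),
% where K is a closed convex cone and L + a an affine subspace.
% The talk's results replace the power \gamma by facial residual
% functions and assume no constraint qualification.
\[
  \operatorname{dist}(x,F) \le c\,\bigl(\operatorname{dist}(x,K) + \operatorname{dist}(x,L+a)\bigr)^{\gamma}
  \quad \text{for all } x \in B,
\]
% for some c > 0 and \gamma \in (0,1] that may depend on the bounded set B.

Such bounds quantify how far a point can be from the feasible set in terms of how much it violates each constraint separately.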

OPTIMA Postdoctoral Research Fellows (A or B), fixed-term 3 years (full-time or part-time), at the University of Melbourne

The Research Fellow is expected to conduct world-class research and provide training for research students working in industrial optimisation as a key appointee in a newly established ARC Training Centre in Optimisation Technologies, Integrated Methodologies and Applications (OPTIMA). The research program focuses on model-based and black-box optimisation methodologies of relevance to a broad range of industry partner optimisation challenges. A multidisciplinary approach is expected, drawing on techniques developed in mathematics, computer science, statistics, engineering, and economics.
For more details, please refer to
http://jobs.unimelb.edu.au/caw/en/job/904801/optima-postdoctoral-research-fellow

Postdoctoral Fellow – Computational Mathematics

A position of Postdoctoral Fellow is available at UNSW Sydney. The Postdoctoral Fellow will undertake collaborative and self-directed research on an ARC-funded Discovery Project. For more details and to apply, please visit https://external-careers.jobs.unsw.edu.au/cw/en/job/502136/postdoctoral-fellow-computational-mathematics.

For more information, please contact Thong Le Gia (qlegia@unsw.edu.au) or Ian Sloan (i.sloan@unsw.edu.au).

Applications close July 6th, 2021.

Workshop on the Intersections of Computation and Optimisations

https://www.mocao.org/WICO

MoCaO (Mathematics of Computation and Optimisation) is planning a new workshop for late 2021, sponsored by the ANU, UNSW and AMSI.

This workshop intends to bring together researchers from the areas of computation, optimisation, computing sciences and engineering who are interested in the cross-fertilisation of ideas around the following theme:

Optimisation often faces unique issues when computations must be carried out efficiently; conversely, computational techniques at times employ optimisation within their algorithms. Both areas fundamentally need to understand approximation in all its facets, along with the associated notions of convergence. Indeed, recent research has blurred the boundaries between optimisation (continuous and discrete), computation and areas of computing science. Machine learning has become relevant throughout, and recent research has turned to its use within computational techniques, including the enhancement of optimisation algorithms, continuing the cycle of cross-fertilisation of ideas.

Workshop Format

We intend to run the workshop in a blended format: a face-to-face component held at the ANU mathematics school, together with a simultaneous, parallel online format in which both groups of participants will engage. Some keynote speakers will present in person (streamed online from the ANU), while others will participate entirely online. We encourage local and international participants to take part in the online workshop. In addition to their keynote presentations, keynote speakers will be invited to give a lectorial-discussion session to promote research questions and engage emerging researchers in these areas.

Keynote Speakers:

Prof. Gerlind Plonka-Hoch (University of Goettingen, Germany)

Prof. Frances Kuo (UNSW)

Prof. Stefan Wild (Argonne, USA)

Prof. Stephen Wright (Wisconsin, USA)

Prof. Ian Turner (QUT)

Prof. Claudia Sagastizabal (IMECC-Unicamp and CEMEAI, Brazil)

Prof. Martin Berggren (Umeå University, Sweden)

Important dates:

Registration Opens: 07/06/2021

Workshop Dates: 22/11/2021 to 25/11/2021

Local Organising Committee

Prof. Andrew Eberhard (RMIT) andy.eberhard@rmit.edu.au

Prof. Stephen Roberts (ANU) stephen.roberts@anu.edu.au

Prof. Markus Hegland (ANU) markus.hegland@anu.edu.au

Dr. Matthew Tam (UniMelb) matthew.tam@unimelb.edu.au

Dr. Nadia Sukhorukova (Swinburne) nsukhorukova@swin.edu.au

Future Announcements and Grants:

We intend to follow up with regular announcements regarding workshop accommodation, details on format and software, and funding opportunities for ECRs, PhD students and female participants. We also wish to draw female participants' attention to the possibility of applying for the WIMSIG Cheryl E. Praeger Travel Award (support for attending conferences/visiting collaborators) and/or the WIMSIG Anne Penfold Street Awards (support for caring responsibilities while attending conferences/visiting collaborators).

VA & Opt Webinar: Vuong Phan

Title: The Boosted Difference of Convex Functions Algorithm

Speaker: Vuong Phan (University of Southampton)

Date and Time: June 9th, 2021, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: We introduce a new algorithm for solving Difference of Convex functions (DC) programs, called the Boosted Difference of Convex functions Algorithm (BDCA). BDCA accelerates the convergence of the classical difference of convex functions algorithm (DCA) thanks to an additional line search step. We prove that any limit point of the BDCA iterative sequence is a critical point of the problem under consideration and that the corresponding objective value is monotonically decreasing and convergent. The global convergence and convergence rate of the iterations are obtained under the Kurdyka-Łojasiewicz property. We provide applications and numerical experiments for a hard problem in biochemistry and two challenging problems in machine learning, demonstrating that BDCA outperforms DCA: for the biochemistry problem, BDCA was five times faster than DCA; for the Minimum Sum-of-Squares Clustering problem, BDCA was on average sixteen times faster than DCA; and for the Multidimensional Scaling problem, BDCA was three times faster than DCA.

Joint work with Francisco J. Aragón Artacho (University of Alicante, Spain).
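
For orientation, here is a minimal sketch (illustrative parameters, not the authors' implementation) of the boosted step on the same toy DC function used in the DCA sketch above: one DCA step produces y, then a backtracking line search along d = y - x seeks extra descent:

import numpy as np

def f(x):
    return x**4 - 2.0 * x**2  # toy DC objective, g - h with g = x^4, h = 2x^2

def dca_step(x):
    y = 4.0 * x  # subgradient of h at x
    return np.sign(y) * (abs(y) / 4.0) ** (1.0 / 3.0)  # argmin_x g(x) - y*x

def bdca(x0, iters=20, lam0=1.0, alpha=0.1, beta=0.5):
    x = x0
    for _ in range(iters):
        y = dca_step(x)
        d = y - x  # direction for the boosting line search
        lam = lam0
        # backtrack until the sufficient-decrease condition holds
        while f(y + lam * d) > f(y) - alpha * lam**2 * d**2:
            lam *= beta
            if lam < 1e-12:  # give up boosting; keep the plain DCA point
                lam = 0.0
                break
        x = y + lam * d
    return x

print(bdca(0.3))  # reaches the minimizer x = 1 in fewer iterations than plain DCA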

VA & Opt Webinar: Scott Lindstrom

Title: A primal/dual computable approach to improving spiraling algorithms, based on minimizing spherical surrogates for Lyapunov functions

Speaker: Scott Lindstrom (Curtin University)

Date and Time: June 2nd, 2021, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: Optimization problems are frequently tackled by iterative application of an operator whose fixed points allow for fast recovery of locally optimal solutions. Under light-weight assumptions, stability is equivalent to the existence of a function, called a Lyapunov function, that encodes structural information about both the problem and the operator. Lyapunov functions are usually hard to find, but if a practitioner had a priori knowledge, or a reasonable guess, about one's structure, they could equivalently tackle the problem by seeking to minimize the Lyapunov function directly. We introduce a class of methods that does this. Interestingly, for certain feasibility problems, the circumcentered-reflection method (CRM) is an extant example from this class. However, CRM may not lend itself well to primal/dual adaptation, for reasons we show. Motivated by the discovery of our new class, we experimentally demonstrate the success of one of its other members, implemented in a primal/dual framework.
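
As a miniature of the fixed-point viewpoint in the first sentence of the abstract (purely illustrative; this is classical alternating projections, not the talk's method), one can iterate an operator T whose fixed point solves a two-line feasibility problem, with the squared distance to that fixed point serving as a Lyapunov function that decreases along the iteration:

import numpy as np

def project_line(x, p, v):
    # Orthogonal projection of x onto the line {p + t*v : t real}.
    v = v / np.linalg.norm(v)
    return p + np.dot(x - p, v) * v

p1, v1 = np.array([0.0, 0.0]), np.array([1.0, 0.2])   # line A: y = 0.2 x
p2, v2 = np.array([0.0, 1.0]), np.array([1.0, -0.5])  # line B: y = 1 - 0.5 x

x = np.array([3.0, 4.0])
for _ in range(30):
    x = project_line(project_line(x, p2, v2), p1, v1)  # x <- T(x) = P_A(P_B(x))
print(x)  # approaches the intersection (10/7, 2/7) of the two lines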

VA & Opt Webinar: Guoyin Li

Title: Proximal methods for nonsmooth and nonconvex fractional programs: when sparse optimization meets fractional programs

Speaker: Guoyin Li (UNSW)

Date and Time: May 26th, 2021, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: Nonsmooth and nonconvex fractional programs are ubiquitous and also highly challenging. This class includes the composite optimization problems studied extensively of late, and encompasses many important modern optimization problems arising from diverse areas, such as the recently proposed scale-invariant sparse signal reconstruction problem in signal processing, the robust Sharpe ratio optimization problem in finance and the sparse generalized eigenvalue problem in discriminant analysis. In this talk, we will introduce extrapolated proximal methods for solving nonsmooth and nonconvex fractional programs and analyse their convergence behaviour. Interestingly, we will show that the proposed algorithm exhibits linear convergence for the sparse generalized eigenvalue problem with either cardinality regularization or sparsity constraints. This is achieved by identifying the explicit desingularization function of the Kurdyka-Łojasiewicz inequality for the merit function of the fractional optimization models. Finally, if time permits, we will present some preliminary encouraging numerical results for the proposed methods for sparse signal reconstruction and sparse Fisher discriminant analysis.
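
For background (the extrapolated proximal methods of the talk are not reproduced here), the classical Dinkelbach scheme shows how a fractional program min f(x)/g(x) with g > 0 is often reduced to a sequence of parametric subproblems min_x f(x) - θ g(x); the toy scalar instance below solves the subproblem by grid search purely for clarity:

import numpy as np

def dinkelbach(f, g, xs, iters=20):
    # xs: a grid of candidate points; real methods solve the subproblem properly.
    x = xs[0]
    theta = f(x) / g(x)
    for _ in range(iters):
        x = xs[np.argmin(f(xs) - theta * g(xs))]  # parametric subproblem
        theta = f(x) / g(x)                       # update the ratio
    return x, theta

f = lambda x: (x - 2.0)**2 + 1.0   # toy numerator
g = lambda x: x + 1.0              # toy positive denominator on [0, 10]
xs = np.linspace(0.0, 10.0, 100001)
print(dinkelbach(f, g, xs))  # about (2.162, 0.325): minimizer and optimal ratio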

Postdoctoral Fellow – Mathematical Optimization

The Postdoctoral Fellow will undertake collaborative and self-directed research on an ARC-funded Discovery Project titled “Data-driven multistage robust optimization”. The primary research goals are to make a major contribution to the understanding of optimization in the face of data uncertainty; to develop mathematical principles for broad classes of multi-stage robust optimization problems; to design associated data-driven numerical methods to find solutions to these problems; and to provide an advanced optimization framework for solving a wide range of real-life optimization models of multi-stage technical decision-making under uncertain environments.
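
Schematically (an illustrative two-stage special case; the project concerns broader multi-stage classes and data-driven uncertainty sets), a robust optimization model nests a worst case over the uncertainty set U between the here-and-now decision x and the recourse decision y:

% Illustrative two-stage robust model: x is chosen before the uncertain
% parameter u in U is revealed; y is the recourse chosen afterwards.
\[
  \min_{x \in X} \; \max_{u \in U} \; \min_{y \in Y(x,u)} \; c^{\top} x + d(u)^{\top} y .
\]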

For more information, please refer to

https://external-careers.jobs.unsw.edu.au/cw/en/job/501990/postdoctoral-fellow-mathematical-optimization
