WOMBAT/WICO 2023 (joint Optimisation and Computational Mathematics Workshops)

This year, the regular workshops WOMBAT (the annual Workshop on Optimisation, Metric Bounds, Approximation and Transversality) and WICO (the biennial Workshop on the Intersections of Computation and Optimisation) will be combined. The joint workshop, covering areas of optimisation and computational mathematics, will be held on 11-15 December 2023 at the University of Sydney. The event will be entirely in-person.

Plenary speakers:

  • Andreas Ernst (Monash University)
  • Fatma Kılınç-Karzan (Carnegie Mellon University)
  • Andrea Raith (University of Auckland)
  • Ricardo Ruiz-Baier (Monash University)
  • Georg Stadler (New York University)

Registration is free to all participants and open until 31 October. Some travel support for students is available.

For more details (including the registration form), see the event website: https://wombat.mocao.org/

On behalf of the organising committee:

Mareike Dressler, Nam Ho-Nguyen, Quoc Le Gia, Dmytro Matsypura, Lindon Roberts

PhD Scholarship Opportunity

A PhD Scholarship in Mathematical Optimisation is available, working with Dr. Scott B. Lindstrom at the Centre for Optimisation and Decision Science, Curtin University. Please share this opportunity with any students who may be interested.

Eligibility

You must be one of the following:

  • Australian Citizen
  • Australian Permanent Resident
  • New Zealand Citizen
  • Holder of a Permanent Humanitarian Visa

Deadline: no expressions of interest will be accepted after 25 August 2023. This is a University deadline for all projects in this category and cannot be extended. Prospective applicants are strongly encouraged to submit their expressions of interest well in advance of this deadline.

Overview

The annual scholarship package (stipend and tuition fees) is approximately $60,000 – $70,000 p.a.

Successful HDR (Higher Degree by Research) applicants will receive a 100% fee offset for up to 4 years and a stipend scholarship at the 2023 RTP rate of $32,250 p.a. for up to 3 years, with a possible 6-month completion scholarship. Applicants are selected via a competitive process and will be notified of the scholarship outcome in November 2023.

The official advertisement for the position is here, and more information is below.
For detailed information about RTP Scholarships, visit: Research Training Program (RTP) Scholarships | Curtin University, Perth, Australia.

Description

In data science, machine learning, and engineering, many problems take the form of finding a solution that minimizes a cost, subject to constraints on allowable solutions. Some examples of costs include expected financial losses, model prediction errors, and energy used. Some examples of constraints include resource limitations, minimum requirements on what is produced, and so forth.
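
In symbols, and purely as a generic template (no specific model from the project is assumed here), such a problem takes the form

    \min_{x \in \mathbb{R}^n} f(x) \quad \text{subject to} \quad x \in C,

where the function f measures the cost (expected loss, prediction error, energy used) and the set C encodes the constraints (resource limits, minimum production requirements).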

Many such problems are solved with operator splitting methods, a modern class of nonlinear optimisation algorithms that treat the cost structure and the constraint structure as two components of a single unifying function. These algorithms were independently discovered by mathematicians working on physics and imaging problems, and they have since been developed and improved with the powerful machinery of convex analysis.
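
As a concrete illustration, here is a minimal Python sketch (illustrative only; the two sets and the starting point are made up for this example) of the Douglas-Rachford method, one of the classical operator splitting algorithms, finding a point in the intersection of two convex sets using only their individual projections:

    import numpy as np

    def proj_ball(x, c, r):
        # Project x onto the ball of radius r centred at c.
        d = x - c
        n = np.linalg.norm(d)
        return x if n <= r else c + r * d / n

    def proj_halfspace(x, a, b):
        # Project x onto the halfspace {y : a.y <= b}.
        viol = a @ x - b
        return x if viol <= 0 else x - viol * a / (a @ a)

    def douglas_rachford(x, pA, pB, iters=500):
        # DR update: x <- x - pA(x) + pB(2 pA(x) - x).
        # The "shadow" sequence pA(x) converges to a point in both sets.
        for _ in range(iters):
            y = pA(x)
            x = x - y + pB(2 * y - x)
        return pA(x)

    # Made-up data: the unit ball at the origin and the halfspace x1 + x2 <= -1.
    c, r = np.zeros(2), 1.0
    a, b = np.array([1.0, 1.0]), -1.0
    sol = douglas_rachford(np.array([3.0, 2.0]),
                           lambda x: proj_ball(x, c, r),
                           lambda x: proj_halfspace(x, a, b))
    print(sol)  # approximately a point lying in both sets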

For many important problems, we want these algorithms to run faster, either to find solutions within the maximum allowable time (for example, when balancing power flow in electricity grids) or to make better data science models computationally tractable for large data sets. Researchers have recently turned to studying the dynamical systems associated with operator splitting methods; this line of research is making it possible to prove results in nonconvex settings and to build new algorithms. Dr. Scott Lindstrom recently introduced a meta-algorithm that uses operator dynamics to suggest alternative algorithm updates. The intent of this meta-algorithm is to solve surrogates for a Lyapunov function, an object that describes the dynamics. This meta-algorithm has already become state-of-the-art for finding wavelets with structural constraints (an imaging sciences problem).

Scientific Aims: The scientific aim of this project is to identify classes of problems in data science, machine learning, and engineering for which meta-algorithms such as the one described above may deliver superior performance. The approach will be multi-faceted, combining computational experiment with rigorous proof. The results will be communicated in articles submitted to peer-reviewed publications.

Upskilling Aims: The upskilling aims for the selected candidate are as follows (in no particular order). The candidate will build expertise in the algorithms that make it possible to solve many modern data science models and engineering problems, understanding how the algorithms are designed, how geometry informs model selection, and what the outstanding challenges are. At the project's completion, the candidate will be competent to rigorously conduct both experimental and theoretical mathematics research, and to communicate the results of their discoveries to others in the field.

In the literature review component, you will learn the fundamental theory—convex analysis—of operator splitting and learn how operator splitting algorithms are formulated for solving various classes of problems. Some examples of the types of problems you will study are as follows: (1) least absolute deviations for outlier-resistant linear regression (a data science modelling problem), (2) progressive hedging for maximizing expected earnings (a finance problem), (3) computation of a one-norm centroid (a statistics problem), and (4) phase retrieval (a signal processing problem). 
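
To give a flavour of problem (1), here is a hedged Python sketch of least absolute deviations regression, minimising ||Ax - b||_1, solved by ADMM (a widely used operator splitting method); the updates are the standard textbook ones, not anything specific to this project, and the data are synthetic:

    import numpy as np

    def lad_admm(A, b, rho=1.0, iters=300):
        # Minimise ||Ax - b||_1 via the split: min ||z||_1 subject to z = Ax - b.
        m, n = A.shape
        x, z, u = np.zeros(n), np.zeros(m), np.zeros(m)
        AtA_inv_At = np.linalg.solve(A.T @ A, A.T)  # assumes A has full column rank
        for _ in range(iters):
            x = AtA_inv_At @ (b + z - u)            # x-update: a least-squares solve
            w = A @ x - b + u
            z = np.sign(w) * np.maximum(np.abs(w) - 1.0 / rho, 0.0)  # soft threshold
            u = u + A @ x - b - z                   # (scaled) dual update
        return x

    rng = np.random.default_rng(0)
    A = rng.standard_normal((100, 3))
    x_true = np.array([1.0, -2.0, 0.5])
    b = A @ x_true + 0.1 * rng.standard_normal(100)
    b[:5] += 10.0                # plant gross outliers
    print(lad_admm(A, b))        # should be close to x_true despite the outliers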

In the experimental component, you will apply Lyapunov surrogate methods to solve those problems. You will build performance profiles, which are visualizations that allow researchers to compare the speeds of different algorithms. 
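
A minimal sketch of how such a profile (in the Dolan-Moré style) is computed from a table of solve times; the timings below are invented purely for illustration:

    import numpy as np

    def performance_profile(times, taus):
        # times[p, s]: solve time of solver s on problem p.
        # Returns rho[s, k]: the fraction of problems that solver s
        # solves within a factor taus[k] of the fastest solver.
        ratios = times / times.min(axis=1, keepdims=True)
        return np.array([[np.mean(ratios[:, s] <= tau) for tau in taus]
                         for s in range(times.shape[1])])

    # Made-up timings: 4 problems x 2 solvers.
    times = np.array([[1.0, 2.0],
                      [3.0, 1.5],
                      [2.0, 2.0],
                      [10.0, 4.0]])
    print(performance_profile(times, np.array([1.0, 1.5, 2.0, 3.0])))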

In the theoretical component, you will formally analyse the dynamical systems associated with operator splitting methods when they are applied to these problem classes. Particular emphasis will be placed on the duality of algorithms; duality is a fundamental concept in convex analysis.
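
For reference, the duality in question is the standard Fenchel-Rockafellar pairing from convex analysis (stated here as textbook background, not as project material):

    \min_{x} \, f(x) + g(Ax) \qquad \longleftrightarrow \qquad \max_{y} \, -f^*(A^{\top} y) - g^*(-y),

where f^* and g^* denote the convex conjugates of f and g. A splitting method applied to one side often corresponds to a related method applied to the other; for example, ADMM applied to the primal problem is known to coincide with the Douglas-Rachford method applied to the dual.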

You will document and communicate the findings in written articles. 

Background: In 2015, Australian researchers Jonathan M. Borwein and Brailey Sims seeded the study of dynamical systems for operator splitting methods, and this has rapidly grown into an expansive field in its own right. A Google Scholar search for "Dynamical Systems and ADMM" returns 17,100 results from the last 8 years, with applications including distributed DC optimal power flow. The principal investigator of this project, Dr. Scott Lindstrom, worked extensively with Borwein and Sims, and was one of the 41 invited participants at the 2017 Banff International Research Station workshop for the world's leading experts on splitting algorithms. Together with Walaa Moursi (Waterloo) and Matthew K. Tam (Melbourne), he is co-organising the 2025 MATRIX workshop (Australia) on operator splitting.

Dr. Lindstrom's article introducing the Lyapunov surrogate method was published in Computational Optimization and Applications, a top journal in the field of optimization. The meta-algorithm has already been demonstrated to be state-of-the-art for finding structured wavelets, in research by Drs. Lindstrom, Neil Dizon (U. Helsinki), and Jeffrey Hogan (U. Newcastle).

Future context: As described in the overview, faster operator splitting methods will, in the here and now, allow us to obtain better solutions to important problems in data science and energy. On a ten-year horizon, this research advances an emerging paradigm in problem solving, in which artificial intelligence observes an algorithm's performance and suggests on-the-fly parameter adjustments and alternative updates for the iterates. Finally, the project builds fundamental knowledge in the mathematical sciences and equips the selected candidate with a skill set in extremely high contemporary demand.

Internship Opportunities: This project may provide an internship opportunity.

Interested applicants should submit the expression of interest form on this page. Questions may be directed to Dr. Scott B. Lindstrom (scott.lindstrom@curtin.edu.au).

MoCaO Lectures 2023: Polynomial Optimisation – First Announcement

July 3-7, 2023, 5-6pm AEST (GMT+10) each day

This series of lectures will introduce polynomial optimisation and decision problems, what they can model, and how they can be attacked using tools from convex optimisation. The lectures will be illustrated throughout with a range of concrete applications. They are designed to be accessible to newcomers to the field with a mathematics and computational background, such as PhD students, postdocs, and inquisitive academics who wish to better understand recent advances in this dynamic field. The lectures will be given online via Zoom. Please read the notice below regarding registration.

Summary: Optimisation and decision problems where the objective function and the constraints can be formulated using multivariate polynomials can model a very wide range of problems from areas as diverse as dynamical systems and control, probability and statistics, quantum information, and combinatorial optimisation. Such problems, while very expressive, are generally difficult to solve. Despite this, systematic and powerful methods based on tools from convex optimisation and convex geometry have been developed to globally approximate these challenging problems.  
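
As a small worked illustration of this approach (a textbook example, not taken from the lecture material): globally minimising a polynomial p over \mathbb{R}^n can be written as

    \max_{\gamma \in \mathbb{R}} \, \gamma \quad \text{subject to} \quad p(x) - \gamma \ge 0 \ \text{for all } x \in \mathbb{R}^n,

and replacing the intractable nonnegativity constraint with the tractable requirement that p - \gamma be a sum of squares of polynomials yields a semidefinite program whose optimal value lower-bounds the true global minimum.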
MoCaO Lecturers 2023:

  • James Saunderson (Monash University, ARC Discovery Early Career Research Fellow)
  • Georgina Hall (INSEAD)
  • Mareike Dressler (UNSW Sydney)

Biographies:
James Saunderson (MoCaO lecturer 2023) is a Lecturer and ARC DECRA Fellow in the Department of Electrical and Computer Systems Engineering at Monash University. He received a PhD in Electrical Engineering and Computer Science from MIT in 2015, and held postdoctoral positions at Caltech and the University of Washington before joining Monash. In 2020 he was the recipient (with Hamza Fawzi and Pablo Parrilo) of the SIAM Activity Group on Optimization Best Paper Prize.

Georgina Hall is an Assistant Professor at INSEAD in the Decision Sciences area. She received a PhD in Operations Research and Financial Engineering at Princeton University in 2018, where she was also a Gordon Y. S. Wu fellow. Before joining INSEAD, she held a postdoctoral position in the DYOGENE team at INRIA. Georgina was the recipient of the 2016 INFORMS Computing Society Best Student Paper Award, the 2018 INFORMS Optimization Society Young Researchers’ Prize, and the 2020 Information Theory Society Paper Award.

Mareike Dressler is a Lecturer in the School of Mathematics and Statistics at the University of New South Wales (UNSW Sydney). She received a PhD at the Goethe-Universität in Frankfurt am Main in 2018. Before joining UNSW she held postdoctoral positions at Brown University in the Institute for Computational and Experimental Research in Mathematics (ICERM), at the University of California, San Diego (UCSD), and at the Max Planck Institute for Mathematics in the Sciences (MPI MiS) in Leipzig. In 2023, Mareike was awarded a Simons Visiting Professorship by the Simons Foundation.

We encourage participants to register using the Google form at the bottom of the webpage (so that you receive the Zoom details).

If you have any enquiries, please send an email to MoCaO@austms.org.au. Please check the website prior to the lectures for last minute information or announcements.

Registration via Google Forms.

Alternatively, you can copy and paste the URL: https://forms.gle/hdUTzcBZqTuoVHLUA

Positions at the University of Melbourne and corrections to the previous post

Dear colleagues and friends, the positions mentioned in my previous post are advertised by the University of Melbourne (not RMIT). I am very sorry for the inconvenience. The positions are as follows.

  • 0059242 – Associate Professor/Professor in Applied Mathematics (continuing), applications close 16 June 2023
    https://jobs.unimelb.edu.au/en/job/912527/associate-professor-professor-in-applied-mathematics

  • 0059243 – Associate Professor/Professor in Mathematics/Statistics (continuing), applications close 16 June 2023
    https://jobs.unimelb.edu.au/en/job/912530/associate-professor-professor-in-mathematics-statistics

  • 0059245 – Associate Professor/Professor in Statistical Data Science (continuing), applications close 16 June 2023
    https://jobs.unimelb.edu.au/en/job/912531/associate-professor-professor-in-statistical-data-science

The PhD Scholarship in Mathematical Optimization at the University of Sydney

The PhD Scholarship in Mathematical Optimization at the University of Sydney will support an outstanding research student to undertake doctoral studies in mathematical optimization.

Potential research topics will be on mathematical optimization broadly defined, including theory and algorithms for optimization, stochastic programming, robust/distributionally robust optimization and data-driven optimization, with applications to machine learning, statistics, finance and economics.

The scholarship includes a living allowance of $37,207 AUD per annum and tuition for up to 3.5 years, with the possibility of a six-month extension subject to approval.

The scholarship recipient will join the Discipline of Business Analytics at the University of Sydney. The Discipline brings together researchers in the fields of optimization, operations management, statistics, machine learning and econometrics. It is particularly suited for students with interests at the intersection of two or more of these fields.

Applicants are expected to have a Masters by research or Honours undergraduate degree. We invite applicants to send the following material via email to Dr Nam Ho-Nguyen (nam.ho-nguyen@sydney.edu.au):

  • An up-to-date CV/resume, including contact details of two academic referees. We will notify applicants if we decide to contact referees.
  • Copies of academic transcripts.
  • (Optional) A short (2 page maximum) statement of purpose.

Forrest Fellowships

The Forrest Foundation is now accepting applications for Forrest Fellows. These Perth-based postdoctoral positions are highly competitive and prestigious, and fund a fellow to embed in one of Perth's universities. The foundation is particularly interested in candidates whose research has the potential to change the world for the better, and in candidates with outreach experience.

The Centre for Optimisation and Decision Science at Curtin University would be happy to embed such a scholar and, as one of Australia's leading optimisation research groups, offers exceptional alignment. Centre staff include a Forrest PhD Fellow and a finalist for the postdoctoral fellowship, both of whom are happy to answer questions about the process.

Information on this application round is available here.

Interested candidates are encouraged to contact Scott Lindstrom.
