MoCaO elections

Dear MoCaO members,

We need to hold an election for the executive, to be completed before (or soon after) the beginning of next year. For this election Andrew Eberhard and Alex Kruger will serve as returning officers (and consequently will not be seeking re-election).

We are now seeking nominations for the following positions. All people nominated for these positions should be dual members of AustMS and MoCaO and should be nominated by two registered members of MoCaO.

1. Chair (Optimisation)

2. Chair (Computation)

3. Secretary

4. Treasurer

5. Communications-Web manager

6. Two ordinary members of the executive

Please send your nomination by email to MoCaO@austms.org.au

An online system will be set up, and details on how to cast your vote online will be sent in a follow-up email.

The closing date for nominations will be 20 December 2024, after which we will initiate a vote in the following weeks, probably early in the new year.

On Australian Research Council funding over the last 10 years

A .docx copy is available here:

Mathematics of Computation and Optimisation (MoCaO) (www.mocao.org) represents more than 250 Australian mathematicians involved in the development of modern computational techniques for data science, machine learning and physical modelling. These computational techniques are the vital engines that power the software utilized by researchers in those fields (the analogy of mathematical engines powering software ‘vehicles’ will be used throughout this report). MoCaO is concerned that the funding opportunities and the amounts funded for critical fundamental research have been in decline for many decades.
The absolute amount of funding has not risen in inflation-adjusted terms over this period, while there has also been an increase in the number of funded grant streams supported by the government.

The Australian Mathematical Sciences Institute (whose members include the Australian Signals Directorate, the RBA, and 28 universities) and the MATRIX Institute have recently published the report Research Investment and Expenditure into the Mathematical Sciences. This report highlights critical shortfalls in the funding of research in the mathematical sciences. Central to these concerns is that funding for the mathematical sciences (the engines) has decreased as more funding has been diverted to applied research (the projects made possible by the software ‘vehicles’). This is to the direct detriment of fundamental research, and to the broader detriment of all, since the large-scale modern problems faced by Australian industry require ever more powerful mathematical engines. The report also notes that “ARC investment in the schemes most relevant to the Mathematical Sciences … is roughly on par with investment in other STEMM disciplines, except for a noticeable drop in DECRAs.” While this is true in relative terms (i.e. in relation to overall funding in ARC Discovery grants, etc.), MoCaO remains concerned, as a representative body for the computational mathematical sciences, that this assessment hides the reduction of funding of ARC DP grants in absolute terms over the last 10-year period. The decline in DECRA funding is particularly alarming, as it could accelerate the exit of new talent from research. At the AMSI Summer School Careers Day, companies routinely advertise salaries in excess of $250,000 for students with PhDs in computational mathematics.

The table below shows the total amount (in AUD) of ARC/DP grants in all areas of mathematics per year over the last decade. The number in each row is the total value awarded to ARC/DP grants that year with FoR codes 0101 (Pure Mathematics), 0102 (Applied Mathematics) and 0103 (Numerical and Computational Mathematics). We then used the Reserve Bank of Australia’s website to convert the figures to current dollar values.
The data, in today’s dollars, is displayed in Figure 1. In ten years, the funding for ARC/DP grants has essentially halved.
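The conversion described above can be sketched as follows. The CPI index values in this snippet are illustrative placeholders only, not the actual RBA figures behind Figure 1:

```python
# Sketch of the inflation adjustment described above: nominal ARC/DP
# totals are scaled by a CPI ratio to express them in today's dollars.
# The CPI index values below are illustrative placeholders, NOT the
# actual RBA figures used to produce Figure 1.

nominal = {2011: 15_495_628, 2022: 9_929_000}  # AUD, from the table below
cpi = {2011: 99.8, 2022: 130.8}                # placeholder CPI index values
cpi_today = 130.8                              # placeholder "today" index

def to_todays_dollars(amount, year):
    """Scale a nominal amount to today's dollars via the CPI ratio."""
    return amount * cpi_today / cpi[year]

adjusted = {y: round(to_todays_dollars(a, y)) for y, a in nominal.items()}
```

The same ratio-based adjustment, applied year by year with the actual CPI series, produces the real-terms decline shown in Figure 1.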

Year    Amount in AUD
2022        9,929,000
2021       10,462,488
2020       11,593,438
2019       11,178,546
2018       12,240,632
2017        9,063,642
2016       12,816,771
2015       11,162,100
2014       14,582,737
2013       12,386,233
2012       16,461,942
2011       15,495,628

Total amount of ARC/DP grants in pure/applied/computational maths. Data extracted from “Yearly_funding_allocation_Nov2022.xlsx” which is available publicly at https://www.arc.gov.au/funding-research/funding-outcome/grants-dataset

         Fig 1. Total ARC/DP funding for mathematics in today’s dollars.

The executive summary of the AMSI-MATRIX report “Research Investment and Expenditure into the Mathematical Sciences” states: “Basic research should be properly funded for Australia’s long term prosperity. In addition, it is essential that research facilities in the Mathematical Sciences are well supported as National Research Infrastructure.” MoCaO emphasizes that the reduced funding opportunities in ARC DP and DECRA grants are having a particularly detrimental impact on the career opportunities of younger emerging researchers in the mathematical sciences in Australia. This has the potential to inflict longer-term damage on Australia’s international standing in the Mathematical Sciences.

MoCaO Lectures: July 18

Dear MoCaO Lectures participants,

All the videos will be available on our YouTube channel later; there will be an additional announcement.

Our next speaker (Thursday, July 18, 11AM AEST) is Dr Isabelle Shankar.

Title: The dual of a convex body

Abstract: Duality in convex geometry arises naturally in optimization by asking a simple question: given a maximizer to a convex optimization problem, how can we certify that it is indeed optimal? This leads to the definition of normal cone and quickly after to the dual of a convex body, which is itself a convex body. We’ll discuss examples including polytopes and extend the theory to conic duality.
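For reference, the polar dual mentioned in the abstract is usually defined as follows (a standard definition, not taken from the talk itself):

```latex
K^{\circ} = \{\, y \in \mathbb{R}^n : \langle x, y \rangle \le 1 \ \text{for all } x \in K \,\}
```

For a convex body $K$ containing the origin in its interior, $K^{\circ}$ is again a convex body and $(K^{\circ})^{\circ} = K$.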

Zoom:

https://unsw.zoom.us/j/89239495027?pwd=zMmvAkAmFZVQ4SzBhQkz6CuH4nuxwu.1

Registration for urgent updates: https://forms.gle/CGhNt3bssmqLMXcj6

Looking forward to seeing you tomorrow.

Nadia

Lecturer in Data Science – University of Sydney

The School of Mathematics and Statistics at the University of Sydney is recruiting a Lecturer in Data Science on a fixed-term contract until 2027. Applicants with an interest in mathematical optimisation, data science and/or applied mathematics are encouraged to apply.

Interviews will be held in late June. For more information and to apply, please visit this page: https://usyd.wd3.myworkdayjobs.com/en-US/USYD_EXTERNAL_CAREER_SITE/job/Lecturer-in-Data-Science_0117330-2

Computational Techniques and Applications Conference (CTAC) 2024

The 2024 iteration of the biennial Computational Techniques and Applications Conference (CTAC) will be held at Monash University from 19 to 22 November 2024. The CTAC meetings are the main event in numerical mathematics and scientific computing in Australia, and have been taking place biennially since 1981, the most recent being held in 2022 at QUT. 

As organisers of the numerical optimisation stream, we would like to invite you to submit an abstract for a contributed talk in this stream. We are interested in talks related to continuous optimisation, including topics with potential links to other CTAC streams (e.g. machine learning, inverse problems, uncertainty quantification, numerical PDEs). Please feel free to extend this invitation to others who may be interested in participating, including students/postdocs and visitors. There is a prize for best student talk.

Early bird registration closes on 1 June and abstract submission closes on 1 October.

The conference webpage is here:

https://www.monash.edu/science/schools/school-of-mathematics/ctac2024

CTAC meeting 2024

The CTAC meetings are the main event in numerical mathematics and scientific computing in Australia, and have been taking place biennially since 1981, the most recent being held in 2022 at QUT. This year’s event features a combination of plenary/keynote and contributed talks.

Confirmed plenary/keynote speakers:

Santiago Badia, Monash U

Fleurianne Bertrand, TU Chemnitz

Victor Calo, Curtin U

Carsten Carstensen, Humboldt-U Berlin

Vivien Challis, QUT

Nilima Nigam, Simon Fraser U

Vijay Rajagopal, U Melbourne

Dingxuan Zhou, U Sydney

Early bird registration closes on 1 June 2024 and abstract submission closes on 1 October 2024. All necessary information about the conference and the submission process can be accessed from

https://www.monash.edu/science/schools/school-of-mathematics/ctac2024

Further inquiries can be directed to the organising committee at ctac2024@monash.edu.

SigmaOpt, the optimisation special interest group of ANZIAM, is holding a one-day workshop in
Adelaide City on the day after the 2024 ANZIAM Conference (in Adelaide Hills). The workshop
will feature talks from five invited speakers as well as the Winner of the Student Best Paper Prize.
Important Information:

  • When: Friday February 16, 2024.
  • Where: Room RR5-09 at the City West Campus of UniSA.
  • Invited Speakers:
    – Kate Helmstedt (QUT)
    – Yalcin Kaya (UniSA)
    – Vicky Mak (Deakin)
    – Lindon Roberts (USyd)
    – Golbon Zakeri (UMass Amherst, USA)
    – The Winner of the Student Best Paper Prize (TBA) – see call below
  • Registration: $50 (includes catering) using the link below.

SigmaOpt/MoCaO Student Best Paper Prize: SigmaOpt and MoCaO call for nominations for the joint SigmaOpt/MoCaO Student Best Paper Prize, for an exceptional paper in the field of mathematical optimisation, optimal control, operations research or a related field, published in the last 18 months. The winner of this prize will be awarded $300 and invited to present the paper at the one-day workshop. For information on submitting a nomination, visit:
https://www.anziam.org.au/SIGMAOPT

For more information, please refer to the link above.

YouTube Channel update

The link to our YouTube channel was updated (main menu).

The channel contains the following Playlists:

MoCaO lecture series (2023 and 2022)

WOMBAT (2020 and 2021)

VAOpt Webinar (Seasons 1-4).

PhD Scholarship Opportunity

A PhD scholarship in Mathematical Optimisation is available, working with Dr. Scott B. Lindstrom at the Centre for Optimisation and Decision Science, Curtin University. Please share this opportunity with any students who may be interested.

Eligibility

You must be one of the following:

  • Australian Citizen
  • Australian Permanent Resident
  • New Zealand Citizen
  • Permanent Humanitarian Visa holder

Deadline: no expressions of interest will be accepted after 25th August, 2023. This is a University deadline for all projects in this category, and it cannot be extended. Would-be applicants are strongly encouraged to submit their expressions of interest well in advance of this deadline.

Overview

The annual scholarship package (stipend and tuition fees) is approx. $60,000 – $70,000 p.a.

Successful HDR applicants for admission will receive a 100% fee offset for up to 4 years, stipend scholarships at the 2023 RTP rate valued at $32,250 p.a. for up to a maximum of 3 years, with a possible 6-month completion scholarship. Applicants are determined via a competitive selection process and will be notified of the scholarship outcome in November 2023. 

The official advertisement for the position is here, and more information is below.
For detailed information about RTP Scholarships, visit: Research Training Program (RTP) Scholarships | Curtin University, Perth, Australia.

Description

In data science, machine learning, and engineering, many problems take the form of finding a solution that minimizes a cost, subject to constraints on allowable solutions. Some examples of costs include expected financial losses, model prediction errors, and energy used. Some examples of constraints include resource limitations, minimum requirements on what is produced, and so forth.

These problems can be solved with operator splitting methods, a modern class of non-linear optimisation algorithms that allows the constraint structure and the cost structure to be treated as two components of a single unifying function. These algorithms were independently discovered by mathematicians working on physics and imaging problems, and they have been developed and improved with the powerful machinery of convex analysis.
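As a concrete illustration of the splitting idea, here is a minimal one-dimensional sketch of one such method (proximal gradient). It is a generic textbook example chosen for illustration, not the meta-algorithm studied in this project:

```python
# A minimal sketch of operator splitting (proximal gradient method):
# minimise f(x) + g(x) by treating the smooth cost f(x) = 0.5*(x - 3)**2
# with a gradient step, and the nonsmooth term g(x) = |x| with its
# proximal operator (soft-thresholding). Illustrative example only,
# not the project's meta-algorithm.

def soft_threshold(v, tau):
    """Proximal operator of tau * |x|."""
    if v > tau:
        return v - tau
    if v < -tau:
        return v + tau
    return 0.0

def proximal_gradient(x0, step=1.0, iters=50):
    """Alternate a gradient step on f with the prox of g."""
    x = x0
    for _ in range(iters):
        grad = x - 3.0                             # gradient of f
        x = soft_threshold(x - step * grad, step)  # prox of step * |x|
    return x

x_star = proximal_gradient(0.0)  # minimiser of 0.5*(x-3)^2 + |x| is x = 2
```

The point of the splitting is that the nonsmooth term is never differentiated: it is handled entirely through its proximal operator, while the smooth term is handled by an ordinary gradient step.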

For many important problems, we desire to make these algorithms go faster, either to find solutions within the maximum time allowable (for example: balancing power flow in electricity grids) or to make better data science models computationally tractable for large data sets. Researchers have recently turned to studying the dynamical systems associated with operator splitting methods. This research is allowing us to prove results in nonconvex settings and build new algorithms. Dr. Scott Lindstrom recently introduced a meta-algorithm that uses operator dynamics to suggest alternative algorithm updates. The intent of this meta-algorithm is to solve surrogates for something called a Lyapunov function, which is an object that describes the dynamics. This meta-algorithm has already become state-of-the-art for finding wavelets with structural constraints (an imaging sciences problem). 

Scientific Aims: The scientific aim of this project is to identify classes of problems in data science, machine learning, and engineering for which meta-algorithms—such as the one described above—may deliver superior performance. The approach will be multi-faceted, combining computational experiment and rigorous proof. The results will be communicated in articles submitted to peer-reviewed publications.

Upskilling Aims: The upskilling aims for the selected candidate are as follows (in no particular order). The candidate will build expertise in the algorithms that make it possible to solve many modern data science models and engineering problems, understanding how the algorithms are designed, how geometry informs model selection, and what the outstanding challenges are. At the project’s completion, the candidate will be competent to rigorously conduct both experimental and theoretical mathematics research, and to communicate the results of their discoveries to others in the field.

In the literature review component, you will learn the fundamental theory—convex analysis—of operator splitting and learn how operator splitting algorithms are formulated for solving various classes of problems. Some examples of the types of problems you will study are as follows: (1) least absolute deviations for outlier-resistant linear regression (a data science modelling problem), (2) progressive hedging for maximizing expected earnings (a finance problem), (3) computation of a one-norm centroid (a statistics problem), and (4) phase retrieval (a signal processing problem). 

In the experimental component, you will apply Lyapunov surrogate methods to solve those problems. You will build performance profiles, which are visualizations that allow researchers to compare the speeds of different algorithms. 

In the theoretical component, you will formally analyse the dynamical systems associated with operator splitting methods when they are applied to these problem classes. Particular emphasis will be placed on the duality of algorithms; duality is a fundamental concept in convex analysis.

You will document and communicate the findings in written articles. 

Background: In 2015, Australian researchers Jonathan M. Borwein and Brailey Sims seeded the study of dynamical systems for operator splitting methods. This has rapidly grown into an expansive field in its own right: a Google Scholar search for “Dynamical Systems and ADMM” returns 17,100 results from the last 8 years, with applications including distributed DC optimal power flow. The principal supervisor of this project, Dr. Scott Lindstrom, worked extensively with Borwein and Sims, and was one of the 41 invited participants at the 2017 Banff International Research Station workshop for the world’s leading experts on splitting algorithms. Together with Walaa Moursi (Waterloo) and Matthew K. Tam (Melbourne), he is co-organizing the 2025 MATRIX workshop on operator splitting in Australia.

Dr. Lindstrom’s article introducing the Lyapunov surrogate method was published in Computational Optimization and Applications, a top journal in the field of optimization. The meta-algorithm has already been demonstrated to be state-of-the-art for finding structured wavelets, in research by Dr. Lindstrom with Dr. Neil Dizon (U. Helsinki) and Dr. Jeffrey Hogan (U. Newcastle).

Future context: As described in the overview, in the immediate term faster operator splitting methods will allow us to obtain better solutions to important problems in data science and energy. On a ten-year horizon, this research advances an emerging paradigm in problem solving, in which artificial intelligence will observe an algorithm’s performance and suggest on-the-fly parameter adjustments and alternative updates for the iterates. Finally, the project builds fundamental knowledge in the mathematical sciences and equips the selected candidate with a skill set in extremely high contemporary demand.

Internship Opportunities: This project may provide an internship opportunity.

Interested applicants should submit the expression of interest form on this page. Questions may be directed to Dr. Scott B. Lindstrom ( scott.lindstrom@curtin.edu.au ).
