PhD Scholarship Opportunity

A PhD scholarship in mathematical optimisation is available, working with Dr. Scott B. Lindstrom at the Centre for Optimisation and Decision Science, Curtin University. Please share this opportunity with any students who may be interested.

Eligibility

You must be one of the following:

  • an Australian Citizen
  • an Australian Permanent Resident
  • a New Zealand Citizen
  • a Permanent Humanitarian Visa holder

Deadline: no expressions of interest will be accepted after 25 August 2023. This is a University deadline for all projects in this category, and it cannot be extended. Prospective applicants are strongly encouraged to submit their expressions of interest well in advance of this deadline.

Overview

The annual scholarship package (stipend and tuition fees) is approx. $60,000 – $70,000 p.a.

Successful HDR applicants will receive a 100% fee offset for up to 4 years and a stipend scholarship at the 2023 RTP rate, valued at $32,250 p.a., for up to 3 years, with a possible 6-month completion scholarship. Scholarships are awarded via a competitive selection process, and applicants will be notified of the outcome in November 2023.

The official advertisement for the position is here, and more information is below.
For detailed information about RTP Scholarships, visit: Research Training Program (RTP) Scholarships | Curtin University, Perth, Australia.

Description

In data science, machine learning, and engineering, many problems take the form of finding a solution that minimizes a cost, subject to constraints on allowable solutions. Some examples of costs include expected financial losses, model prediction errors, and energy used. Some examples of constraints include resource limitations, minimum requirements on what is produced, and so forth.

Many such problems can be solved with operator splitting methods, a modern class of non-linear optimisation algorithms that treat the constraint structure and the cost structure as two components of a single unifying function. These algorithms were independently discovered by mathematicians working on physics and imaging problems, and they have been developed and improved with the powerful machinery of convex analysis.
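For readers who want a concrete feel for these methods, here is a minimal sketch (an illustration of my own, not material from the project) of the classic Douglas–Rachford splitting algorithm applied to a toy two-set feasibility problem: finding a point in both the unit ball and a line. The sets, starting point, and iteration count are all assumptions chosen for the example.

```python
import numpy as np

def proj_ball(x, r=1.0):
    # Projection onto the Euclidean ball of radius r centred at the origin
    n = np.linalg.norm(x)
    return x if n <= r else (r / n) * x

def proj_line(x):
    # Projection onto the line {(t, t)}: average the two coordinates
    t = (x[0] + x[1]) / 2.0
    return np.array([t, t])

def douglas_rachford(x, n_iter=100):
    # Governing iteration: x <- x + P_B(2 P_A(x) - x) - P_A(x)
    for _ in range(n_iter):
        pa = proj_ball(x)
        x = x + proj_line(2 * pa - x) - pa
    # The "shadow" sequence P_A(x) converges to a point in the intersection
    return proj_ball(x)

sol = douglas_rachford(np.array([3.0, -1.0]))  # a point in both sets
```

Note the splitting: each step touches the two sets only through their individual projections, never through the (possibly complicated) intersection directly.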

For many important problems, we desire to make these algorithms go faster, either to find solutions within the maximum time allowable (for example: balancing power flow in electricity grids) or to make better data science models computationally tractable for large data sets. Researchers have recently turned to studying the dynamical systems associated with operator splitting methods. This research is allowing us to prove results in nonconvex settings and build new algorithms. Dr. Scott Lindstrom recently introduced a meta-algorithm that uses operator dynamics to suggest alternative algorithm updates. The intent of this meta-algorithm is to solve surrogates for something called a Lyapunov function, which is an object that describes the dynamics. This meta-algorithm has already become state-of-the-art for finding wavelets with structural constraints (an imaging sciences problem). 

Scientific Aims: The scientific aim of this project is to identify classes of problems in data science, machine learning, and engineering for which meta-algorithms, such as the one described above, may deliver superior performance. The approach will be multi-faceted, combining computational experiment with rigorous proof. The results will be communicated in articles submitted to peer-reviewed publications.

Upskilling Aims: The upskilling aims for the selected candidate are as follows (in no particular order). The candidate will build expertise in the algorithms that make it possible to solve many modern data science models and engineering problems, understanding how the algorithms are designed, how geometry informs model selection, and what the outstanding challenges are. At the project’s completion, the candidate will be competent to rigorously conduct both experimental and theoretical mathematics research, and to communicate the results of their discoveries to others in the field.

In the literature review component, you will learn the fundamental theory—convex analysis—of operator splitting and learn how operator splitting algorithms are formulated for solving various classes of problems. Some examples of the types of problems you will study are as follows: (1) least absolute deviations for outlier-resistant linear regression (a data science modelling problem), (2) progressive hedging for maximizing expected earnings (a finance problem), (3) computation of a one-norm centroid (a statistics problem), and (4) phase retrieval (a signal processing problem). 
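As a hedged illustration of problem type (1): least absolute deviations regression minimises ||Ax − b||₁, and one standard operator splitting method for it is ADMM with the splitting z = Ax − b. The step size rho, the iteration count, and the synthetic data below are assumptions chosen for this sketch, not the project's prescribed approach.

```python
import numpy as np

def soft_threshold(v, k):
    # Proximal operator of the one-norm, applied componentwise
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def lad_admm(A, b, rho=1.0, n_iter=1000):
    # Minimise ||A x - b||_1 via ADMM with the splitting z = A x - b
    m, n = A.shape
    x, z, u = np.zeros(n), np.zeros(m), np.zeros(m)
    # Cache the least-squares factor (A^T A)^{-1} A^T used by the x-update
    lstsq_factor = np.linalg.solve(A.T @ A, A.T)
    for _ in range(n_iter):
        x = lstsq_factor @ (b + z - u)      # x-update: a least-squares solve
        Ax = A @ x
        z = soft_threshold(Ax - b + u, 1.0 / rho)  # z-update: prox of one-norm
        u = u + Ax - b - z                  # scaled dual update
    return x

# Points on the line y = 2t, with one gross outlier; unlike least squares,
# least absolute deviations should essentially ignore the outlier.
t = np.linspace(0, 1, 30)
A = np.column_stack([t, np.ones_like(t)])
b = 2 * t
b[5] += 10.0  # outlier
x = lad_admm(A, b)  # fitted (slope, intercept), expected near (2, 0)
```

This is exactly the outlier-resistance mentioned above: the one-norm penalises the single large residual linearly rather than quadratically, so the fit is not dragged toward the outlier.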

In the experimental component, you will apply Lyapunov surrogate methods to solve those problems. You will build performance profiles, which are visualizations that allow researchers to compare the speeds of different algorithms. 
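A performance profile (in the sense of Dolan and Moré) can be computed directly from a table of run times: each time is divided by the best time on that problem, and the profile for a solver reports what fraction of problems it solves within a factor tau of the best. The timing data below are made up for the sketch, with np.inf marking a solver failure.

```python
import numpy as np

def performance_profile(times, taus):
    # times: (n_problems, n_solvers) array of run times; np.inf = failure
    # Ratio of each solver's time to the best time on that problem
    ratios = times / times.min(axis=1, keepdims=True)
    # rho_s(tau): fraction of problems solver s solves within factor tau of best
    return np.array([[np.mean(ratios[:, s] <= tau)
                      for s in range(times.shape[1])]
                     for tau in taus])

times = np.array([[1.0, 2.0],
                  [3.0, 1.5],
                  [2.0, np.inf]])  # solver 2 fails on the third problem
profile = performance_profile(times, taus=[1.0, 2.0, 4.0])
```

Plotting each column of `profile` against tau gives the familiar staircase curves: height at tau = 1 shows how often a solver is fastest, and the height as tau grows shows its robustness.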

In the theoretical component, you will formally analyse the dynamical systems associated with operator splitting methods when they are applied to these problem classes. Particular emphasis will be placed on the duality of algorithms; duality is a fundamental concept in convex analysis.

You will document and communicate the findings in written articles. 

Background: In 2015, Australian researchers Jonathan M. Borwein and Brailey Sims seeded the study of dynamical systems for operator splitting methods. This has rapidly grown into an expansive field in its own right: a Google Scholar search for “Dynamical Systems and ADMM” returns 17,100 results in the last 8 years, with applications including distributed DC optimal power flow. The principal investigator of this project, Dr. Scott Lindstrom, worked extensively with Borwein and Sims, and was one of the 41 invited participants at the 2017 Banff International Research Station workshop for the world’s leading experts on splitting algorithms. Together with Walaa Moursi (Waterloo) and Matthew K. Tam (Melbourne), he is co-organizing the 2025 Australian MATRIX workshop on operator splitting.

Dr. Lindstrom’s article introducing the Lyapunov surrogate method was published in Computational Optimization and Applications, a top journal in the field of optimization. The meta-algorithm has already been demonstrated to be state-of-the-art for finding structured wavelets, in joint research by Drs. Lindstrom, Neil Dizon (U. Helsinki), and Jeffrey Hogan (U. Newcastle).

Future context: As described in the overview, in the near term, faster operator splitting methods will allow us to obtain better solutions to important problems in data science and energy. On a ten-year horizon, this research advances an emerging paradigm in problem solving, in which artificial intelligence observes an algorithm’s performance and suggests on-the-fly parameter adjustments and alternative updates for the iterates. Finally, the project builds fundamental knowledge in the mathematical sciences and equips the selected candidate with a skill set in extremely high contemporary demand.

Internship Opportunities: This project may provide an internship opportunity.

Interested applicants should submit the expression of interest form on this page. Questions may be directed to Dr. Scott B. Lindstrom (scott.lindstrom@curtin.edu.au).

Forrest Fellowships

The Forrest Foundation is now accepting applications for Forrest Fellows. These Perth-based postdoctoral positions are highly competitive and prestigious, and they fund a fellow to embed in one of Perth’s universities. The foundation is particularly interested in candidates whose research has the potential to change the world for the better, and in candidates with outreach experience.

The Centre for Optimisation and Decision Science at Curtin University would be happy to embed such a scholar and, as one of Australia’s leading optimisation research groups, offers exceptional alignment. Research Centre staff members include a Forrest PhD Fellow and a finalist for the postdoctoral fellowship, both of whom are happy to answer questions about the process.

Information on this application round is available here.

Interested candidates are encouraged to contact Scott Lindstrom.

MoCaO 2023 Election Results

Dear MoCaO members,

We now have the final results of the elections of the group’s executive members. The elected outcomes are as follows:

Webmaster: Nadia Sukhorukova 

Ordinary Members: Lindon Roberts and Guoyin Li

Because there was only one candidate for each of the remaining positions, they were automatically filled. They are as follows:

Optimisation Co-chair: Andrew Eberhard

Computation Co-chair: Quoc Thong Le Gia

Secretary: Scott Lindstrom       

Treasurer: Alex Kruger

A warm thank you to everyone who volunteered to serve.

–Scott B. Lindstrom (secretary)

WOMBAT Deadlines Closing Soon

Registration is closing soon for the next Workshop on Optimisation, Metric Bounds, Approximation and Transversality (WoMBaT2022). The event will be held 13–15 (+16) December 2022 in beautiful Perth, Western Australia. Support for students and carers is available, and applications for both are also closing soon.

Note the following deadlines:

28 October: Deadline for student support scheme applications. Abstracts submitted before this date will be considered before those submitted later.
12 November: Final deadline to submit a talk.
26 November: Deadline to register if you are attending only and not giving a talk.

This iteration of WoMBaT will be hosted at Curtin University by the Curtin Centre for Optimisation and Decision Science. The event is expected to take place entirely in person.

In addition, this year WOMBAT, in collaboration with the ARC Training Centre for Transforming Maintenance through Data Science, will host a special day (16 December 2022) focused on applications of optimisation to planning and scheduling maintenance in the resource industry. The schedule is now up.

Register now! We look forward to welcoming you to Perth!

On behalf of the local organizing committee

Dr. Hoa T. Bui, Curtin Centre for Optimisation and Decision Science
Dr. Scott B. Lindstrom, Curtin Centre for Optimisation and Decision Science

WOMBAT 2022 Registration now open

Registration is now open for the next Workshop on Optimisation, Metric Bounds, Approximation and Transversality (WoMBaT2022). The event will be held 13–15 (+16) December 2022 in beautiful Perth, Western Australia. Support for students and carers is available, and applications for it are open as well.

This iteration of WoMBaT will be hosted at Curtin University by the Curtin Centre for Optimisation and Decision Science. The event is expected to take place entirely in person.

In addition, this year WOMBAT, in collaboration with the ARC Training Centre for Transforming Maintenance through Data Science, will host a special day (16 December 2022) focused on applications of optimisation to planning and scheduling maintenance in the resource industry. We are very excited for this special event!

Register now! We look forward to welcoming you to Perth!

On behalf of the local organizing committee

Dr. Hoa T. Bui, Curtin Centre for Optimisation and Decision Science
Dr. Scott B. Lindstrom, Curtin Centre for Optimisation and Decision Science

Save the date: WOMBAT 2022

Save the dates! The next Workshop on Optimisation, Metric Bounds, Approximation and Transversality (WoMBaT2022) will be held 13–16 December 2022 in beautiful Perth, Western Australia. This iteration of WoMBaT will be hosted at Curtin University by the Curtin Centre for Optimisation and Decision Science. The event is expected to take place entirely in person. Registration will open and plenaries will be announced soon.

In addition, this year WOMBAT, in collaboration with the ARC Training Centre for Transforming Maintenance through Data Science, will host a special day (16 December 2022) focused on applications of optimisation to planning and scheduling maintenance in the resource industry. We are very excited for this special event!

Save the dates! We look forward to welcoming you to Perth!

On behalf of the local organizing committee

Dr. Hoa T. Bui, Curtin Centre for Optimisation and Decision Science
Dr. Scott B. Lindstrom, Curtin Centre for Optimisation and Decision Science

AMSI ACE Optimisation Course: Theory and Methods of Modern Optimisation

The 2022 AMSI ACE network Optimisation course will launch on February 28, 2022. This year’s theme is “Theory and Methods of Modern Optimisation,” and the instructors are Prof. Andrew Eberhard (RMIT University) and Dr. Scott B. Lindstrom (Curtin Centre for Optimisation and Decision Science). Details and the pre-enrolment quiz are currently available on the AMSI website. Please share with any students who might benefit from this opportunity.

Alex Rubinov Memorial Oration 2021: please register


Alzheimer’s Disease:
New Approach for Early Indication by Voxel-(C)MARS –
Optimization and Operational Research in Big-Data of us Humans
Professor Gerhard-Wilhelm Weber

7:00 pm Thursday 9 December
via MS Teams
Each year a public oration is held to commemorate the life of Professor Alexander Rubinov and to celebrate his contribution to Federation University as founding Director of the Centre for Informatics and Applied Optimisation.


Click here for more details and registration instructions.

Please note that registration is required.

LION15: The 15th Learning and Intelligent Optimization Conference, Athens, Greece, June 20-25, 2021

Conference website https://lion15.sba-research.org/index.html
Submission link https://easychair.org/conferences/?conf=lion15
Extended submission deadline: March 15, 2021

The large variety of heuristic algorithms for hard optimization problems raises numerous interesting and challenging issues. Practitioners using heuristic algorithms for hard optimization problems are confronted with the burden of selecting the most appropriate method, in many cases through expensive algorithm configuration and parameter tuning. Scientists seek theoretical insights and demand a sound experimental methodology for evaluating algorithms and assessing strengths and weaknesses. This effort requires a clear separation between the algorithm and the experimenter, who, in too many cases, is “in the loop” as a motivated intelligent learning component. LION deals with designing and engineering ways of “learning” about the performance of different techniques, and ways of using past experience about the algorithm behavior to improve performance in the future. Intelligent learning schemes for mining the knowledge obtained online or offline can improve the algorithm design process and simplify the applications of high-performance optimization methods. Combinations of different algorithms can further improve the robustness and performance of the individual components.

This meeting explores the intersections and uncharted territories between machine learning, artificial intelligence, energy, mathematical programming and algorithms for hard optimization problems. The main purpose of the event is to bring together experts from these areas to discuss new ideas and methods, challenges and opportunities in various application areas, general trends and specific developments. We are excited to be bringing the LION conference to Greece for the third time.
