PhD Scholarship: Convergence Speed of Optimisation Algorithms


SCHOOL OF MATHEMATICS AND APPLIED STATISTICS (SMAS),
UNIVERSITY OF WOLLONGONG, AUSTRALIA


An exciting PhD scholarship is available in the School of Mathematics and Applied Statistics (SMAS) at the University of Wollongong, South Western Sydney campus, in the area of Optimisation. The title of the project is Determining the Convergence Speed of Derivative-free Optimisation Algorithms. The UOW scholarship is AUD$28,092 tax-free per year for three years full-time, and tuition fees will be waived for up to four years. The successful applicant will have the opportunity to work with both Australian and international collaborators, and extra funding may be available for conference travel. Applications are invited from domestic and international students who are able to commence PhD studies at the University of Wollongong in 2021. Applicants should hold, or be close to completing, an Honours 1 undergraduate degree or a Master's degree in Applied Mathematics, Computational Mathematics or a closely related field.

HOW TO APPLY
If you are interested in applying for this scholarship, please contact Dr Chayne Planiden via email: chayne@uow.edu.au.

Applications must include a CV detailing previous education and experience, together with academic transcripts. The successful applicant is expected to commence the scholarship by 31 October 2021. Applications close 30 November 2020.

MORE INFORMATION
Dr Chayne Planiden, Lecturer
School of Mathematics & Applied Statistics, University of Wollongong, NSW, Australia
Email: chayne@uow.edu.au

VA & Opt Webinar: Yalçın Kaya (UniSA)

Title: Constraint Splitting and Projection Methods for Optimal Control

Speaker: Yalçın Kaya (UniSA)

Date and Time: September 30th, 2020, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: We consider a class of optimal control problems with constrained control variable. We split the ODE constraint and the control constraint of the problem so as to obtain two optimal control subproblems, solutions to each of which can be written down simply. Employing these simpler solutions as projections, we find numerical solutions to the original problem by applying four different projection-type methods: (i) Dykstra's algorithm, (ii) the Douglas–Rachford (DR) method, (iii) the Aragón Artacho–Campoy (AAC) algorithm and (iv) the fast iterative shrinkage-thresholding algorithm (FISTA). The problem we study is posed in infinite-dimensional Hilbert spaces. The behaviour of the DR and AAC algorithms is explored via numerical experiments with respect to their parameters. An error analysis is also carried out numerically for a particular instance of the problem for each of the algorithms. This is joint work with Heinz Bauschke and Regina Burachik.
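
For readers new to these methods, here is a minimal finite-dimensional sketch of the Douglas–Rachford iteration for two sets with easy projections. The box and affine set below are illustrative stand-ins (not the ODE and control constraints of the talk), and the projections are the standard closed-form ones.

import numpy as np

# Minimal Douglas-Rachford sketch for finding a point in A ∩ B, where
# A is a box (a simple control-type constraint) and B is an affine
# subspace {x : Mx = c}. Illustrative stand-ins only.

def proj_box(x, lo=-1.0, hi=1.0):
    # Projection onto the box [lo, hi]^n.
    return np.clip(x, lo, hi)

def make_proj_affine(M, c):
    # Projection onto {x : Mx = c} via the pseudoinverse.
    M_pinv = np.linalg.pinv(M)
    return lambda x: x - M_pinv @ (M @ x - c)

def douglas_rachford(proj_A, proj_B, x0, iters=500):
    # DR iteration: x_{k+1} = x_k + P_B(2 P_A(x_k) - x_k) - P_A(x_k).
    # For closed convex sets with nonempty intersection, the shadow
    # sequence P_A(x_k) converges to a point of A ∩ B.
    x = x0.copy()
    for _ in range(iters):
        a = proj_A(x)
        b = proj_B(2.0 * a - x)
        x = x + b - a
    return proj_A(x)

rng = np.random.default_rng(0)
M = rng.standard_normal((2, 5))
c = M @ np.full(5, 0.3)          # makes the intersection nonempty
sol = douglas_rachford(proj_box, make_proj_affine(M, c), rng.standard_normal(5))
print(np.allclose(M @ sol, c), np.all(np.abs(sol) <= 1 + 1e-9))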

VA & Opt Webinar: Regina Burachik (UniSA)

Title: A Primal–Dual Penalty Method via Rounded Weighted-ℓ1 Lagrangian Duality

Speaker: Regina Burachik (UniSA)

Date and Time: September 23rd, 2020, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: We propose a new duality scheme based on a sequence of smooth minorants of the weighted-ℓ1 penalty function, interpreted as a parametrized sequence of augmented Lagrangians, to solve nonconvex constrained optimization problems. For the induced sequence of dual problems, we establish strong asymptotic duality properties. Namely, we show that (i) the sequence of dual problems is convex and (ii) the dual values monotonically increase to the optimal primal value. We use these properties to devise a subgradient-based primal–dual method, and show that the generated primal sequence accumulates at a solution of the original problem. We illustrate the performance of the new method with three different types of test problems: a polynomial nonconvex problem, large-scale instances of the celebrated kissing number problem, and the Markov–Dubins problem. Our numerical experiments demonstrate that, when compared with the traditional implementation of a well-known smooth solver, our new method (using the same solver for its subproblems) can find better-quality solutions, i.e., “deeper” local minima, or solutions closer to the global minimum. Moreover, our method appears to be more time-efficient, especially when the problem has a large number of constraints.

This is joint work with C. Y. Kaya (UniSA) and C. J. Price (University of Canterbury, Christchurch, New Zealand).
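
The "rounded" minorants themselves are defined in the paper; purely as a hedged illustration of the underlying idea, one smooth minorant of t ↦ max(0, t) is h_μ(t) = (t + √(t² + μ²))/2 − μ/2, which satisfies h_μ ≤ max(0, ·) everywhere and converges to it as μ → 0. A weighted-ℓ1-type penalty can then be smoothly minorized as in the sketch below; the function names and construction are illustrative, not the paper's.

import numpy as np

# Illustrative sketch only: a smooth minorant of the weighted-l1
# penalty for constraints g_i(x) <= 0. The actual "rounded" minorants
# and the primal-dual updates are those defined in the paper.

def h(t, mu):
    # Smooth minorant of max(0, t): h(t, mu) <= max(0, t) for all t,
    # with equality in the limit mu -> 0.
    return 0.5 * (t + np.sqrt(t * t + mu * mu)) - 0.5 * mu

def smoothed_penalty(f, g, w, mu):
    # Smooth minorant of  f(x) + sum_i w_i * max(0, g_i(x)).
    return lambda x: f(x) + np.sum(w * h(g(x), mu))

At a high level, driving μ → 0 while updating the weights w is the pattern a primal–dual penalty method of this kind follows; the precise updates and convergence theory are in the paper.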

VA & Opt Webinar: Christopher Price (University of Canterbury)

Title: A direct search method for constrained optimization via the rounded ℓ1 penalty function

Speaker: Christopher Price (University of Canterbury)

Date and Time: September 16th, 2020, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: This talk looks at the constrained optimization problem in which the objective and constraints are Lipschitz-continuous black-box functions. The approach uses a sequence of smoothed and offset ℓ1 penalty functions. The method generates an approximate minimizer of each penalty function, and then adjusts the offsets and other parameters. The smoothing is steadily reduced, ultimately revealing the exact ℓ1 penalty function. The method preferentially uses a discrete quasi-Newton step, backed up by a global direction search. Theoretical convergence results are given for the smooth and non-smooth cases under relevant conditions. Numerical results are presented on a variety of problems with non-smooth objective or constraint functions, showing that the method is effective in practice.
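
As a schematic of the outer loop just described (approximately minimize a smoothed penalty, then tighten the smoothing), consider the following sketch. The generic BFGS inner solver and the smoothing √(t² + μ²) are placeholders for the talk's discrete quasi-Newton step, direction search, and offset machinery, all of which are omitted here.

import numpy as np
from scipy.optimize import minimize

# Schematic outer loop: approximately minimize a smoothed l1 penalty,
# then reduce the smoothing, steadily revealing the exact l1 penalty.
#   s_mu(t) = sqrt(t^2 + mu^2)   (-> |t| as mu -> 0)

def penalty(f, g, sigma, mu):
    return lambda x: f(x) + sigma * np.sum(np.sqrt(g(x) ** 2 + mu ** 2))

def smoothed_l1_method(f, g, x0, sigma=10.0, mu=1.0, rounds=8):
    x = np.asarray(x0, dtype=float)
    for _ in range(rounds):
        x = minimize(penalty(f, g, sigma, mu), x, method="BFGS").x
        mu *= 0.1          # tighten the smoothing
    return x

# Toy example: minimize x1 + x2 subject to x1^2 + x2^2 - 1 = 0;
# the minimizer is approximately (-1/sqrt(2), -1/sqrt(2)).
f = lambda x: x[0] + x[1]
g = lambda x: np.array([x[0] ** 2 + x[1] ** 2 - 1.0])
print(smoothed_l1_method(f, g, [2.0, 0.0]))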

VA & Opt Webinar: Christiane Tammer (MLU)

Title: Subdifferentials and Lipschitz properties of translation invariant functionals and applications

Speaker: Christiane Tammer (MLU)

Date and Time: September 9th, 2020, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: In this talk, we deal with translation invariant functionals and their application in deriving necessary conditions for minimal solutions of constrained and unconstrained optimization problems with respect to general domination sets.

Translation invariant functionals are a natural and powerful tool for the separation of not necessarily convex sets and scalarization. There are many applications of translation invariant functionals in nonlinear functional analysis, vector optimization, set optimization, optimization under uncertainty, mathematical finance as well as consumer and production theory.
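
A central instance, stated here for orientation in one common form: given a closed set A ⊆ Y and a direction k ∈ Y with A + [0, ∞)k ⊆ A, the Tammer–Weidner (Gerstewitz) functional is

\[
  \varphi_{A,k}(y) \;=\; \inf\{\, t \in \mathbb{R} \;:\; y \in tk - A \,\}.
\]

It is translation invariant along k, i.e. \varphi_{A,k}(y + sk) = \varphi_{A,k}(y) + s for all s ∈ ℝ (immediate from the definition, since y + sk ∈ tk − A iff y ∈ (t − s)k − A), and it separates sets without any convexity assumption on A.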

The primary objective of this talk is to establish formulas for basic and singular subdifferentials of translation invariant functionals and to study important properties such as monotonicity, the PSNC property, the Lipschitz behavior, etc. of these nonlinear functionals without assuming that the shifted set involved in the definition of the functional is convex. The second objective is to propose a new way to scalarize a set-valued optimization problem. It allows us to study necessary conditions for minimal solutions in a very broad setting in which the domination set is not necessarily convex or solid or conical. The third objective is to apply our results to vector-valued approximation problems.

This is a joint work with T.Q. Bao (Northern Michigan University).

VA & Opt Webinar: Gerd Wachsmuth (BTU)

Title: New Constraint Qualifications for Optimization Problems in Banach Spaces based on Asymptotic KKT Conditions

Speaker: Gerd Wachsmuth (BTU)

Date and Time: September 2nd, 2020, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: Optimization theory in Banach spaces suffers from a lack of available constraint qualifications. Not only do very few constraint qualifications exist, but those that do are often violated even in simple applications. This is in stark contrast to finite-dimensional nonlinear programming, where a large number of constraint qualifications is known. Since these constraint qualifications are usually defined via the set of active inequality constraints, it is difficult to extend them to the infinite-dimensional setting. One exception is a recently introduced sequential constraint qualification based on asymptotic KKT conditions. This talk shows that this so-called asymptotic KKT regularity admits suitable extensions to the Banach space setting, yielding new constraint qualifications. The relation of these new constraint qualifications to existing ones is discussed in detail. Their usefulness is also shown by several examples as well as an algorithmic application to the class of augmented Lagrangian methods.
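
For orientation, in the finite-dimensional case one common form of the asymptotic KKT conditions for min f(x) subject to g(x) ≤ 0 reads: there exist sequences x^k → x̄ and λ^k ≥ 0 such that

\[
  \nabla f(x^k) + \sum_{i} \lambda_i^k \nabla g_i(x^k) \to 0
  \qquad\text{and}\qquad
  \min\{-g_i(x^k),\, \lambda_i^k\} \to 0 \ \text{for all } i,
\]

so stationarity holds only asymptotically along a sequence. Loosely speaking, the constraint qualifications discussed in the talk are conditions under which such asymptotic multipliers give rise to genuine KKT multipliers, now in the Banach space setting.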

This is a joint work with Christian Kanzow (Würzburg) and Patrick Mehlitz (Cottbus).

VA & Opt Webinar: Jein-Shan Chen (NTNU)

Title: Two approaches for absolute value equation by using smoothing functions

Speaker: Jein-Shan Chen (NTNU)

Date and Time: August 26th, 2020, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: In this talk, we present two approaches for solving the absolute value equation. These two approaches are based on using smoothing functions; in particular, there are several systematic ways of constructing such smoothing functions. Numerical experiments with comparisons are reported, which suggest what kinds of smoothing functions work well with the proposed approaches.
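
As a toy illustration of the smoothing idea (the choice below, |t| ≈ √(t² + μ²), is just one standard smoothing function, not necessarily those of the talk), one can solve the absolute value equation Ax − |x| = b by Newton's method on the smoothed residual:

import numpy as np

# Toy sketch: solve the absolute value equation  A x - |x| = b  by
# smoothing |t| with  phi_mu(t) = sqrt(t^2 + mu^2)  and applying
# Newton's method to  F(x) = A x - phi_mu(x) - b.

def ave_smoothing_newton(A, b, mu=1e-2, iters=50, tol=1e-10):
    x = np.zeros(len(b))
    for _ in range(iters):
        phi = np.sqrt(x * x + mu * mu)
        F = A @ x - phi - b
        if np.linalg.norm(F) < tol:
            break
        J = A - np.diag(x / phi)       # Jacobian of the smoothed residual
        x = x - np.linalg.solve(J, F)
        mu *= 0.5                      # tighten the smoothing
    return x

# Example where the singular values of A exceed 1, which guarantees a
# unique solution of the AVE for every b (Mangasarian-Meyer).
rng = np.random.default_rng(1)
A = np.diag(rng.uniform(2.0, 3.0, 4))
x_true = rng.standard_normal(4)
b = A @ x_true - np.abs(x_true)
print(np.allclose(ave_smoothing_newton(A, b), x_true, atol=1e-6))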

Invitation for Submissions to the Topical Collection “Mathematics of Computation and Optimisation” (MoCaO) in Advances in Computational Mathematics (ACOM)

Objective:
In an effort to promote interaction between researchers in Computational Mathematics and Optimisation, we plan a collection of articles in ACOM whose topics span both research fields.

Submissions should be made via the ACOM submission system:

https://www.springer.com/journal/10444/updates/18256258

Guest Editors:
Jerome Droniou (Monash University, Melbourne, Australia)
Andrew Eberhard (RMIT, Melbourne, Australia)
Guoyin Li (University of New South Wales, Sydney, Australia)
Russell Luke (University of Goettingen, Germany)
Thanh Tran (University of New South Wales, Sydney, Australia)

The board is chaired by Thanh Tran.

Submission deadline: July 31, 2021.

VA & Opt Webinar: Hieu Thao Nguyen (TU Delft)

Title:  Projection Algorithms for Phase Retrieval with High Numerical Aperture

Speaker: Hieu Thao Nguyen (TU Delft)

Date and Time: August 19th, 2020, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: We develop the mathematical framework in which the class of projection algorithms can be applied to high numerical aperture (NA) phase retrieval. Within this framework we first analyze the basic steps of solving this problem by projection algorithms and establish the closed forms of all the relevant prox-operators. We then study the geometry of the high-NA phase retrieval problem and the obtained results are subsequently used to establish convergence criteria of projection algorithms. Making use of the vectorial point-spread-function (PSF) is, on the one hand, the key difference between this work and the literature of phase retrieval mathematics which mostly deals with the scalar PSF. The results of this paper, on the other hand, can be viewed as extensions of those concerning projection methods for low-NA phase retrieval. Importantly, the improved performance of projection methods over the other classes of phase retrieval algorithms in the low-NA setting now also becomes applicable to the high-NA case. This is demonstrated by the accompanying numerical results which show that all available solution approaches for high-NA phase retrieval are outperformed by projection methods.
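
For contrast with the high-NA setting, the following is a minimal scalar (low-NA style) sketch of the projection viewpoint: alternating projections (error reduction) between the measured Fourier-amplitude set and a support constraint. The vectorial high-NA prox-operators of the talk are far more involved than this toy.

import numpy as np

# Minimal scalar phase-retrieval sketch: recover x from |Fx| = m and a
# known support S by alternating projections (error reduction).

def proj_amplitude(x, m):
    # Project onto {x : |Fx| = m}: keep the Fourier phase, set modulus m.
    X = np.fft.fft(x)
    return np.fft.ifft(m * np.exp(1j * np.angle(X)))

def proj_support(x, S):
    # Project onto {x : x = 0 outside S}.
    return np.where(S, x, 0.0)

def error_reduction(m, S, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(len(S)) * S
    for _ in range(iters):
        x = proj_support(proj_amplitude(x, m), S)
    return x

# Tiny demo: random signal on a known support.
rng = np.random.default_rng(2)
S = np.arange(64) < 16                      # support: first 16 samples
x_true = rng.standard_normal(64) * S
m = np.abs(np.fft.fft(x_true))
x_rec = error_reduction(m, S)
# Amplitude residual; it decreases monotonically, but error reduction
# can stagnate, which motivates the stronger projection methods above.
print(np.linalg.norm(np.abs(np.fft.fft(x_rec)) - m))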

VA & Opt Webinar: Xiaoqi Yang (Hong Kong PolyU)

Title:  On error bound moduli for locally Lipschitz and regular functions

Speaker: Xiaoqi Yang (Hong Kong PolyU)

Date and Time: August 12th, 2020, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: We first introduce for a closed and convex set two classes of subsets: the near and far ends relative to a point, and give some full characterizations for these end sets by virtue of the face theory of closed and convex sets. We provide some connections between closedness of the far (near) end and the relative continuity of the gauge (cogauge) for closed and convex sets. We illustrate that the distance from 0 to the outer limiting subdifferential of the support function of the subdifferential set, which is essentially the distance from 0 to the end set of the subdifferential set, is an upper estimate of the local error bound modulus. This upper estimate becomes tight for a convex function under some regularity conditions. We show that the distance from 0 to the outer limiting subdifferential set of a lower C^1 function is equal to the local error bound modulus.
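
For reference, the local error bound property in question, in one standard form: for f with f(x̄) = 0 and S := {x : f(x) ≤ 0}, f admits a local error bound at x̄ if there exist c > 0 and a neighbourhood U of x̄ with

\[
  \operatorname{dist}(x, S) \;\le\; c\,[f(x)]_+ \qquad \text{for all } x \in U,
\]

and the local error bound modulus at x̄ is the infimum of all such constants c.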


References:
Li, M.H., Meng, K.W. and Yang, X.Q., On far and near ends of closed and convex sets, Journal of Convex Analysis 27 (2020) 407–421.
Li, M.H., Meng, K.W. and Yang, X.Q., On error bound moduli for locally Lipschitz and regular functions, Math. Program. 171 (2018) 463–487.
