Call for Postdocs 2020

The Euler International Mathematical Institute in St. Petersburg is seeking postdocs in all areas of Mathematics, Theoretical Computer Science, Mathematical and Theoretical Physics.

Applicants should send their applications to euler.postdoc@gmail.com

The applications should include:

  • CV,
  • List of publications (including preprints, if applicable),
  • Description of research interests, ideally mentioning a possible host or other research contacts in St. Petersburg,
  • Names, affiliations and contact details of 2-3 people willing to send recommendation letters if asked by the committee,
  • Any special requirements with respect to the dates, etc.

Basic conditions:

  • Competitive salary of 126,314 RUB per month (taxed at 13% for both residents and foreigners); this is double the average salary in St. Petersburg,
  • Housing allowance enough to cover all or most of the rent (in addition to the salary),
  • Partial coverage of travel expenses to St. Petersburg: up to 300 Euro for postdocs coming from Europe and up to 600 Euro for postdocs coming from outside Europe,
  • Some funds for covering participation in conferences that cannot be covered from other sources,
  • Appointments of 1 or 2 years, extendable for another year,
  • Small teaching load,
  • Flexibility with respect to the starting date, length and specific calendar requirements (such as a leave in the middle).

St. Petersburg is the most beautiful city in the world and is home to several mathematical institutions, including the Steklov Institute of Mathematics http://www.pdmi.ras.ru/pdmi/en/laboratories and the newly created Department of Mathematics and Computer Science at St. Petersburg State University http://math-cs.spbu.ru/en/people/ (the links also serve as a “menu” of possible hosts).

Preference is given to applications completed before November 30, 2020. The preferred starting date is September 1, 2021.

If you have questions, please do not hesitate to ask them by email.

VA & Opt Webinar: Chayne Planiden (UoW)

Title: New Gradient and Hessian Approximation Methods for Derivative-free Optimisation

Speaker: Chayne Planiden (UoW)

Date and Time: November 4th, 2020, 17:00 AEDT (Register here for remote connection via Zoom)

Abstract: In general, derivative-free optimisation (DFO) uses approximations of first- and second-order information in minimisation algorithms. DFO is found in direct-search, model-based, trust-region and other mainstream optimisation techniques and has been gaining popularity in recent years. This work discusses previous results on some particular uses of DFO, the proximal bundle method and the VU-algorithm, and then presents improvements made this year to the gradient and Hessian approximation techniques. These improvements can be inserted into any routine that requires such estimations.
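As a small illustration of the kind of first-order information that DFO methods approximate (a generic forward-difference sketch, not the new techniques presented in the talk; the function f, point x and step h below are hypothetical):

    import numpy as np

    def fd_gradient(f, x, h=1e-6):
        # Forward-difference estimate of the gradient of f at x,
        # using n extra function evaluations and no derivatives of f.
        x = np.asarray(x, dtype=float)
        fx = f(x)
        g = np.zeros_like(x)
        for i in range(x.size):
            e = np.zeros_like(x)
            e[i] = h
            g[i] = (f(x + e) - fx) / h
        return g

    # Example: the gradient of z1^2 + 3*z2^2 at (1, 2) is approximately (2, 12).
    print(fd_gradient(lambda z: z[0]**2 + 3*z[1]**2, [1.0, 2.0]))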

VA & Opt Webinar: Radek Cibulka (University of West Bohemia)

Title: Continuous selections for inverse mappings in Banach spaces

Speaker: Radek Cibulka (University of West Bohemia)

Date and Time: October 28th, 2020, 17:00 AEDT (Register here for remote connection via Zoom)

Abstract: Influenced by a recent work by A. V. Arutyunov, A. F. Izmailov, and S. E. Zhukovskiy, we establish a general Ioffe-type criterion guaranteeing the existence of a continuous and calm selection for the inverse of a single-valued uniformly continuous mapping between Banach spaces with a closed domain. We show that the general statement yields elegant proofs following the same pattern as in the case of the usual openness with a linear rate by considering mappings instead of points. As in the case of Ioffe’s criterion for linear openness around the reference point, this allows us to avoid the iteration, that is, the construction of a sequence of continuous functions whose limit is the desired continuous selection for the inverse mapping; this is illustrated by the proof of the Bartle-Graves theorem. Then we formulate sufficient conditions based on approximations given by positively homogeneous mappings and bunches of linear operators. The talk is based on joint work with Marián Fabian.
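For context only (the standard formulation of the classical result mentioned in the abstract, stated here as a reminder rather than as part of the talk):

    \textbf{Theorem (Bartle--Graves).} Let $X$ and $Y$ be Banach spaces and let
    $T \colon X \to Y$ be a continuous linear surjection. Then there exists a
    continuous (in general nonlinear) mapping $s \colon Y \to X$ such that
    $T(s(y)) = y$ for every $y \in Y$.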

VA & Opt Webinar: Wilfredo Sosa (UCB)

Title: On diametrically maximal sets, maximal premonotone maps and premonotone bifunctions

Speaker: Wilfredo Sosa (UCB)

Date and Time: October 21st, 2020, 17:00 AEDT (Register here for remote connection via Zoom)

Abstract: First, we study diametrically maximal sets in the Euclidean space (those which are not properly contained in a set with the same diameter), establishing their main properties. Then, we use these sets to exhibit an explicit family of maximal premonotone operators. We also establish some relevant properties of maximal premonotone operators, such as their local boundedness, and finally we introduce the notion of premonotone bifunctions, presenting a canonical relation between premonotone operators and bifunctions that extends the well-known one which holds in the monotone case.
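Making the parenthetical definition above explicit (this formalisation is mine, not the speaker's wording): a bounded set $C \subset \mathbb{R}^n$ is diametrically maximal if

    \operatorname{diam}(C \cup \{x\}) > \operatorname{diam}(C) \quad \text{for every } x \notin C,

that is, $C$ is not properly contained in any set having the same diameter.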

VA & Opt Webinar: Björn Rüffer (UoN)

Title: A Lyapunov perspective to projection algorithms

Speaker: Björn Rüffer (UoN)

Date and Time: October 14th, 2020, 17:00 AEDT (Register here for remote connection via Zoom)

Abstract: The operator-theoretic point of view has been very successful in the study of iterative splitting methods under a unified framework. These algorithms include the Method of Alternating Projections as well as the Douglas-Rachford Algorithm, which is dual to the Alternating Direction Method of Multipliers, and they allow nice geometric interpretations. While convergence results for these algorithms have been known for decades in the convex case, for non-convex problems progress on convergence results accelerated significantly once arguments based on Lyapunov functions were used. In this talk we give an overview of the underlying techniques in Lyapunov’s direct method and look at the convergence of iterative projection methods through this lens.
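As a concrete finite-dimensional illustration of the simplest of these methods (a sketch only; the two sets, a line and a disc in the plane, are chosen here purely for the example and are not taken from the talk):

    import numpy as np

    def project_line(x, a, b):
        # Projection onto the hyperplane {z : <a, z> = b}.
        a = np.asarray(a, dtype=float)
        return x - (a @ x - b) / (a @ a) * a

    def project_ball(x, centre, radius):
        # Projection onto the closed ball with the given centre and radius.
        d = x - centre
        n = np.linalg.norm(d)
        return x if n <= radius else centre + radius * d / n

    # Method of Alternating Projections: x_{k+1} = P_A(P_B(x_k)).
    x = np.array([5.0, 5.0])
    for _ in range(50):
        x = project_line(project_ball(x, np.array([0.0, 0.0]), 1.0), [1.0, 1.0], 1.0)
    print(x)  # converges to a point of the intersection when it is nonempty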

VA & Opt Webinar: Reinier Diaz Millan (Deakin)

Title: An algorithm for pseudo-monotone operators with application to rational approximation

Speaker: Reinier Diaz Millan (Deakin)

Date and Time: October 7th, 2020, 17:00 AEDT (Register here for remote connection via Zoom)

Abstract: The motivation of this paper is the development of an optimisation method for solving optimisation problems appearing in Chebyshev rational and generalised rational approximation, where the approximations are constructed as ratios of linear forms (linear combinations of basis functions). The coefficients of the linear forms are subject to optimisation and the basis functions are continuous functions. It is known that the objective functions in generalised rational approximation problems are quasi-convex. In this paper we also prove a stronger result: the objective functions are pseudo-convex. We then develop numerical methods that are efficient for a wide range of pseudo-convex functions and test them on generalised rational approximation problems.
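As a reminder of the shape of these problems (standard notation chosen here for illustration, not taken from the abstract): given a continuous function $f$ on a compact set $\Omega$ and fixed continuous basis functions $g_1,\dots,g_n$ and $h_1,\dots,h_m$, the generalised rational Chebyshev approximation problem is

    \min_{a \in \mathbb{R}^n,\ b \in \mathbb{R}^m} \ \sup_{t \in \Omega} \left| f(t) - \frac{\sum_{i=1}^{n} a_i g_i(t)}{\sum_{j=1}^{m} b_j h_j(t)} \right|,

where the denominator is required to be positive on $\Omega$; the coefficients $a$ and $b$ are the optimisation variables.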

Dynamic Control and Optimization conference, 3-5 February 2021, Aveiro, Portugal

The Dynamic Control and Optimization International Conference 2021 (DCO 2021) will take place at the University of Aveiro, Campus of Santiago, from 3 to 5 February 2021 (Wednesday to Friday). The conference is dedicated to the 65th birthday of Andrey V. Sarychev, Professor at the University of Aveiro until 2002.

For more information please visit the conference site https://sites.google.com/view/dco2021/.

The conference is organized by the Department of Mathematics of the University of Aveiro, the Center for Research & Development in Mathematics and Applications (CIDMA, University of Aveiro) and the Center for Applied Mathematics and Economics (CEMAPRE, University of Lisbon).

Aims and Scope

  • nonlinear dynamical control systems,
  • control of evolution PDE,
  • optimal control,
  • sub-Riemannian geometry,
  • ordinary differential equations,
  • calculus of variations,
  • differential equations,
  • propagation of acoustic waves in elastic media.

The conference will consist of invited plenary talks (40 minutes + 5 minutes of Q&A) and contributed paper presentations (20 minutes + 5 minutes of Q&A).

Selected full papers of this conference will be published in an international journal.

Due to the COVID-19 situation and a possible lockdown, the conference is planned to be held in two simultaneous modes:

  • in person at the University of Aveiro, Aveiro, Portugal;
  • virtually via Zoom.

To ensure that all conference attendees have access to all sessions, it is planned to transmit the sessions online or provide access to their recordings.

Regardless of how we handle the conference, we want to assure everyone that there will be a forum for contributors to share their work and participants to learn and network.

(submitted by Vera Roshchina on behalf of conference organisers)

PhD Scholarship: Convergence Speed of Optimisation Algorithms


SCHOOL OF MATHEMATICS AND APPLIED STATISTICS (SMAS),
UNIVERSITY OF WOLLONGONG, AUSTRALIA


An exciting PhD scholarship is available in the School of Mathematics and Applied Statistics (SMAS) at the University of Wollongong, South Western Sydney campus, in the area of Optimisation. The title of the project is Determining the Convergence Speed of Derivative-free Optimisation Algorithms. The UOW scholarship is $28,092 AUD tax-free per year for three years full-time, and tuition fees (for up to 4 years) will be waived. The successful applicant will have the opportunity to work with both Australian and international collaborators, and extra funding may be available for conference travel. Applications are invited from domestic and international students who are able to commence PhD studies at the University of Wollongong in 2021. Applicants should hold, or be close to completing, an Honours 1 undergraduate degree or a Master’s degree in Applied Mathematics, Computational Mathematics or a closely related field.

HOW TO APPLY
If you are interested in applying for this scholarship, please contact Dr Chayne Planiden via email: chayne@uow.edu.au.

Applications must include a CV detailing previous education and experience, and academic transcripts. It is expected that the successful applicant will be available to commence this scholarship by 31 October 2021. Applications close 30 November 2020.
MORE INFORMATION
Dr Chayne Planiden, Lecturer
School of Mathematics & Applied Statistics, University of Wollongong, NSW, Australia
Email: chayne@uow.edu.au

VA & Opt Webinar: Yalçın Kaya (UniSA)

Title: Constraint Splitting and Projection Methods for Optimal Control

Speaker: Yalçın Kaya (UniSA)

Date and Time: September 30th, 2020, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: We consider a class of optimal control problems with a constrained control variable. We split the ODE constraint and the control constraint of the problem so as to obtain two optimal control subproblems, for each of which solutions can be written down simply. Employing these simpler solutions as projections, we find numerical solutions to the original problem by applying four different projection-type methods: (i) Dykstra’s algorithm, (ii) the Douglas–Rachford (DR) method, (iii) the Aragón Artacho–Campoy (AAC) algorithm and (iv) the fast iterative shrinkage-thresholding algorithm (FISTA). The problem we study is posed in infinite-dimensional Hilbert spaces. The behaviour of the DR and AAC algorithms is explored via numerical experiments with respect to their parameters. An error analysis is also carried out numerically for a particular instance of the problem for each of the algorithms. This is joint work with Heinz Bauschke and Regina Burachik.
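To give a flavour of one of these methods in the simplest setting (a generic finite-dimensional Douglas–Rachford sketch for a two-set feasibility problem, not the infinite-dimensional optimal control formulation of the talk; the two sets below are illustrative only):

    import numpy as np

    def douglas_rachford(proj_A, proj_B, x0, iters=100):
        # DR iteration for finding a point in the intersection of A and B:
        # x_{k+1} = x_k + P_B(2 P_A(x_k) - x_k) - P_A(x_k),
        # and the monitored ("shadow") point is P_A(x_k).
        x = np.asarray(x0, dtype=float)
        for _ in range(iters):
            pa = proj_A(x)
            x = x + proj_B(2 * pa - x) - pa
        return proj_A(x)

    # Illustrative sets: A is the unit ball, B is the half-space {z : z[0] >= 0.5}.
    proj_A = lambda z: z / max(1.0, np.linalg.norm(z))
    proj_B = lambda z: np.array([max(z[0], 0.5), *z[1:]])
    print(douglas_rachford(proj_A, proj_B, np.array([-2.0, 3.0])))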

VA & Opt Webinar: Regina Burachik (UniSA)

Title: A Primal–Dual Penalty Method via Rounded Weighted-L1 Lagrangian Duality

Speaker: Regina Burachik (UniSA)

Date and Time: September 23rd, 2020, 17:00 AEST (Register here for remote connection via Zoom)

Abstract: We propose a new duality scheme based on a sequence of smooth minorants of the weighted-ℓ1 penalty function, interpreted as a parametrized sequence of augmented Lagrangians, to solve nonconvex constrained optimization problems. For the induced sequence of dual problems, we establish strong asymptotic duality properties. Namely, we show that (i) the sequence of dual problems is convex and (ii) the dual values monotonically increase to the optimal primal value. We use these properties to devise a subgradient-based primal–dual method, and show that the generated primal sequence accumulates at a solution of the original problem. We illustrate the performance of the new method with three different types of test problems: a polynomial nonconvex problem, large-scale instances of the celebrated kissing number problem, and the Markov–Dubins problem. Our numerical experiments demonstrate that, when compared with the traditional implementation of a well-known smooth solver, our new method (using the same well-known solver in its subproblem) can find better quality solutions, i.e., “deeper” local minima, or solutions closer to the global minimum. Moreover, our method seems to be more time efficient, especially when the problem has a large number of constraints.

This is joint work with C. Y. Kaya (UniSA) and C. J. Price (University of Canterbury, Christchurch, New Zealand).
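For background (standard notation introduced here for illustration, not taken from the paper): for a problem of minimising $f(x)$ subject to $g_i(x) \le 0$, $i = 1, \dots, m$, a weighted-ℓ1 penalty function has the form

    F_\sigma(x) = f(x) + \sum_{i=1}^{m} \sigma_i \max\{0, g_i(x)\},

which, under suitable assumptions and for sufficiently large weights $\sigma_i > 0$, is an exact penalty but is nonsmooth; the scheme described in the abstract works instead with a sequence of smooth minorants of such a function, interpreted as augmented Lagrangians.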
