Tuesday, 20 December 2019, 4 p.m. (sharp),

Espen Sande, Roma Tor Vergata

in the conference room (aula Beltrami) of the Dipartimento di Matematica "F. Casorati" in Pavia, will give a lecture titled:


Refreshments will be served at the end.


Abstract. In adaptive precision optimization, the objective function
can only be computed with some stochastic error whose standard
deviation is controllable. A common strategy is to monotonically
decrease this standard deviation during the optimization process,
driving it asymptotically to zero so that the noise vanishes and
convergence of the algorithm is guaranteed. The algorithm MpMads
presented in this work follows this approach. A second algorithm,
called DpMads, is introduced to explore another strategy, which does
not force the standard deviation to decrease monotonically. Although
these strategies are proved to be theoretically equivalent, some tests
show practical differences. A derivative-free optimization framework
is used (no assumption is made about the differentiability of the
objective function), as MpMads and DpMads generalise the deterministic
Mads algorithm designed for such a framework.
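The strategy of monotonically shrinking the noise during a derivative-free search can be illustrated with a minimal sketch. This is not the MpMads algorithm itself, only a toy one-dimensional direct search on a hypothetical noisy objective `noisy_f`, where the caller controls the standard deviation `sigma` and decreases it at each iteration:

```python
import random

def noisy_f(x, sigma):
    # Hypothetical stochastic objective: true value (x - 3)^2 plus
    # Gaussian noise whose standard deviation sigma the caller controls.
    return (x - 3.0) ** 2 + random.gauss(0.0, sigma)

def direct_search(x0, iters=200, step=1.0, sigma0=1.0):
    # Toy 1D direct search: poll x - step and x + step, move to a
    # better point, otherwise shrink the step. The noise standard
    # deviation decreases monotonically to zero, in the spirit of the
    # first strategy described above.
    x = x0
    for k in range(iters):
        sigma = sigma0 / (k + 1)  # monotone decrease, tends to zero
        candidates = [x - step, x, x + step]
        values = [noisy_f(c, sigma) for c in candidates]
        best = candidates[values.index(min(values))]
        if best == x:
            step *= 0.5           # poll failed: refine the step size
        else:
            x = best
    return x

random.seed(0)
print(direct_search(0.0))  # converges near the true minimizer x = 3
```

Because the noise is driven to zero, late iterations behave almost deterministically, which is what makes a convergence argument possible; the alternative strategy explored by DpMads drops the monotonicity requirement on `sigma`.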
This is joint work with Prof. Charles Audet and Sébastien Le Digabel
from Polytechnique Montréal, and Stéphane Alarie from the Hydro-Québec
research centre.