Tuesday, 10 December 2019, at 4:00 pm, in the Beltrami lecture hall of the
Department of Mathematics "F. Casorati",

Pierre-Yves Bouchet, Polytechnique Montréal

will give a seminar entitled:


within the Applied Mathematics Seminar series (IMATI-CNR and the
Department of Mathematics, Pavia).


Abstract. In adaptive precision optimization, the objective function
can only be computed with a stochastic error of controllable
standard deviation. A common strategy is to make this standard
deviation decrease monotonically during the optimization process,
driving it asymptotically to zero so that the noise vanishes and
convergence of the algorithm is guaranteed. The algorithm MpMads
presented in this work follows this approach. A second algorithm,
DpMads, is then introduced to explore an alternative strategy that
does not force the standard deviation to decrease monotonically.
Although the two strategies are proved to be theoretically
equivalent, some tests show practical differences. A derivative-free
optimization framework is used (no assumption is made about the
differentiability of the objective function), as MpMads and DpMads
generalise the deterministic Mads algorithm designed for such a
framework.
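To illustrate the monotone-precision idea described above, here is a minimal
toy sketch in Python, not the actual MpMads algorithm: a simple compass
(coordinate) search queries a hypothetical noisy oracle whose noise level
`sigma` is halved together with the step size, so the standard deviation
decreases monotonically to zero as the search refines. The objective,
the oracle, and the update schedule are all illustrative assumptions.

```python
import random

def noisy_oracle(x, sigma):
    """Hypothetical oracle: a placeholder smooth objective (sum of
    squares) plus Gaussian noise of controllable standard deviation
    sigma -- the 'adaptive precision' of the abstract."""
    f_true = sum(xi * xi for xi in x)
    return f_true + random.gauss(0.0, sigma)

def monotone_precision_search(x0, step=1.0, sigma=1.0, iters=200):
    """Toy compass search: whenever no trial point improves on the
    incumbent, halve both the step size AND the noise level, so that
    sigma -> 0 monotonically (the strategy the abstract attributes
    to MpMads; this sketch is NOT the real algorithm)."""
    x = list(x0)
    for _ in range(iters):
        # Re-estimate the incumbent at the current precision level.
        fx = noisy_oracle(x, sigma)
        improved = False
        for i in range(len(x)):
            for s in (+1.0, -1.0):
                y = list(x)
                y[i] += s * step
                fy = noisy_oracle(y, sigma)
                if fy < fx:
                    x, fx, improved = y, fy, True
        if not improved:
            step *= 0.5    # refine the mesh
            sigma *= 0.5   # tighten the precision monotonically
    return x

x_final = monotone_precision_search([3.0, -2.0])
```

A non-monotone (DpMads-style) schedule would instead be free to raise
`sigma` again at some iterations; the abstract notes that the two
strategies are theoretically equivalent but can differ in practice.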
This is joint work with Prof. Charles Audet and Sébastien Le Digabel
of Polytechnique Montréal, and Stéphane Alarie of the Hydro-Québec
research centre.