Machine Learning for Extremes: Michaël Allouche
Name: Michaël Allouche
Talk Title: On the approximation of extreme quantiles with ReLU neural networks
Abstract: Feedforward neural networks based on Rectified Linear Units (ReLU) cannot efficiently approximate quantile functions that are unbounded, as is the case for distributions in the Fréchet Maximum Domain of Attraction. We therefore propose a new parametrization for the generator of a Generative Adversarial Network (GAN) adapted to this framework of heavy-tailed distributions. We provide an analysis of the uniform error between the extreme quantile and its GAN approximation. It appears that the rate of convergence of the error is mainly driven by the second-order parameter of the data distribution. These results are illustrated on simulated data and real financial data.
This is joint work with Stéphane Girard (Inria) and Emmanuel Gobet (Ecole Polytechnique).
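To give a concrete sense of the difficulty the abstract describes, the sketch below (a hypothetical illustration, not code from the talk) uses the Pareto distribution, a canonical member of the Fréchet maximum domain of attraction. Its quantile function diverges as the level approaches 1, which is the regime where plain ReLU networks struggle; on the log scale, however, the same quantile is affine in log(1 − p), hinting at why a transformed parametrization of the generator can help. The tail-index value `gamma` is chosen purely for illustration.

```python
import numpy as np

# Pareto distribution with tail index gamma: F(x) = 1 - x**(-1/gamma), x >= 1.
# Its quantile function q(p) = (1 - p)**(-gamma) is unbounded as p -> 1.
gamma = 0.5  # illustrative tail index, not a value from the talk

def pareto_quantile(p):
    """Quantile function of the Pareto(gamma) distribution."""
    return (1.0 - p) ** (-gamma)

# Quantiles blow up near p = 1 (the extreme-quantile regime):
levels = np.array([0.9, 0.99, 0.999])
q = pareto_quantile(levels)

# On the log scale the quantile is affine in log(1 - p):
#   log q(p) = -gamma * log(1 - p),
# a bounded-slope target that is far easier for a ReLU network
# to approximate than the unbounded q(p) itself.
log_q = np.log(q)
affine = -gamma * np.log(1.0 - levels)
assert np.allclose(log_q, affine)
```

The takeaway of the sketch is the gap between the raw quantile, which grows without bound, and its log transform, which is linear in log(1 − p); a generator parametrized to produce the latter sidesteps the unboundedness obstruction mentioned in the abstract.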
This talk is a contributed talk at EVA 2021.