Name: Michael Allouche
Talk Title: On the approximation of extreme quantiles with ReLU neural networks
Abstract: Feedforward neural networks based on Rectified Linear Units (ReLU) cannot efficiently approximate the unbounded quantile functions arising in the Fréchet maximum domain of attraction, i.e. for heavy-tailed distributions. We thus propose a new parametrization for the generator of a Generative Adversarial Network (GAN) adapted to this heavy-tailed framework. We analyse the uniform error between the extreme quantile and its GAN approximation, and show that the rate of convergence is mainly driven by the second-order parameter of the data distribution. These results are illustrated on simulated data and real financial data.
This is joint work with Stéphane Girard (Inria) and Emmanuel Gobet (Ecole Polytechnique).
This talk is a contributed talk at EVA 2021.
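The difficulty the abstract points to can be made concrete with a toy example. This is a hedged illustration only, not the authors' construction: for a Pareto-type distribution in the Fréchet maximum domain of attraction with tail index `gamma` (an assumed symbol here), the quantile function q(u) = (1-u)^(-gamma) blows up as u → 1, so any network with bounded outputs incurs an arbitrarily large sup-norm error in the tail. Factoring out the known power-law term leaves a bounded function that a ReLU network could plausibly approximate, which is the general spirit of tail-adapted generator parametrizations.

```python
import numpy as np

# Tail index of a toy Pareto-type distribution (assumed value for illustration).
gamma = 0.5

def q(u):
    """Pareto(gamma) quantile function: unbounded as u -> 1."""
    return (1.0 - u) ** (-gamma)

u = np.linspace(0.0, 0.999, 1000)

# The quantile blows up near u = 1: q(0.999) = 0.001**(-0.5) ~ 31.6,
# so a network bounded by M has sup-error at least q(u_max) - M.
tail_value = q(0.999)

# Sketch of the tail-factorisation idea: write q(u) = (1-u)^(-gamma) * s(u),
# where s is bounded (identically 1 for the pure Pareto case), and let the
# network approximate only the bounded factor s.
s = q(u) * (1.0 - u) ** gamma
```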