Machine Learning for extremes: Chen Zhou
From Belle Taylor
Name: Chen Zhou
Talk Title: Distributed Inference for Tail Empirical and Quantile Processes
Abstract: The availability of massive data sets allows extreme value statistics to be conducted using more observations drawn from the tail of an underlying distribution. However, if such data sets are stored on multiple machines and cannot be combined into one oracle sample for privacy reasons, this poses a computational challenge: an oracle estimator based on the full oracle sample cannot be computed. This situation is known as the distributed inference setup. To overcome this problem, distributed inference often relies on a divide-and-conquer (DC) algorithm: first compute the estimate using the observations on each machine, then transmit the estimates from all machines to a central machine, and finally average the estimates on the central machine. If the resulting distributed estimator possesses the same asymptotic behavior as the hypothetical oracle estimator based on the oracle sample, it is said to satisfy the oracle property.
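To make the DC algorithm concrete, here is a minimal sketch in Python. It uses the Hill estimator of the extreme value index as the per-machine estimator purely for illustration (the talk itself works with the probability weighted moment estimator); the function names, the choice of `k` largest order statistics, and the simulated Pareto data are all assumptions of this sketch, not part of the talk.

```python
import numpy as np

def hill_estimator(sample, k):
    # Hill estimator of the extreme value index, based on the
    # k largest order statistics of one machine's sample.
    order = np.sort(sample)[::-1]  # descending order statistics
    return np.mean(np.log(order[:k] / order[k]))

def dc_estimate(samples, k):
    # Divide-and-conquer: estimate on each machine separately,
    # then average the estimates on the central machine.
    return np.mean([hill_estimator(s, k) for s in samples])

# Illustrative data: 20 "machines", each holding 10,000 i.i.d.
# draws from a Pareto distribution with tail index alpha = 2,
# so the true extreme value index is gamma = 1/alpha = 0.5.
rng = np.random.default_rng(0)
machines = [rng.pareto(2.0, size=10_000) + 1.0 for _ in range(20)]

print(dc_estimate(machines, k=200))  # should be close to 0.5
```

The oracle property asks whether this average of per-machine estimates behaves asymptotically like the (here uncomputable) estimator applied to all 200,000 observations pooled together; the talk develops tail empirical and quantile process tools to verify exactly that.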
Estimators in extreme value statistics do not automatically satisfy the oracle property. In this paper, we introduce a set of tools characterizing the asymptotic behavior of the tail empirical and quantile processes under the distributed inference setup. Using these tools, one may establish the oracle property for most extreme value estimators under suitable sufficient conditions. As an example, we show the oracle property for the probability weighted moment estimator.
This talk is a contributed talk at EVA 2021.