The next Data Science seminar will take place on Wednesday, 16 November, at 15:00 in Room 038, Building FST01 (ΘΕΕ01). The speaker will be Professor Yannis Yatracos (Yau Mathematical Sciences Center, Tsinghua University, Beijing) and the title of his talk will be “Limitations of the Wasserstein MDE for Univariate Data”.
Abstract: Minimum Kolmogorov and Wasserstein distance estimates, θMKD and θMWD respectively, of the model parameter θ (∈ Θ) are compared empirically, obtained under the assumption that the model is intractable. For the Cauchy and Lognormal models, simulations indicate that both estimates have expected values nearly equal to θ, but in all repetitions of the experiments θMKD has smaller SD than θMWD, and θMKD’s relative efficiency with respect to θMWD improves as the sample size, n, increases. The minimum expected Kolmogorov distance estimate, θMEKD, eventually has both smaller bias and smaller SD than the corresponding Wasserstein estimate, θMEWD, and θMEKD’s relative efficiency improves as n increases. These results also hold for stable models with stability index α = .5 and α = 1.1. For the Uniform and the Normal models, the estimates have similar performance. The disturbing empirical findings for θMWD are due to the unboundedness and non-robustness of the Wasserstein distance and to the heavy tails of the underlying univariate models. Theoretical confirmation is provided for stable models with 1 < α < 2, which have finite first moment. Similar results are expected to hold for multivariate heavy-tailed models. Combined with existing results in the literature, the findings do not support the use of the Wasserstein distance in statistical inference, especially for intractable and Black Box models with unverifiable heavy tails.
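To make the two estimators concrete, the following is a minimal simulation sketch (not the speaker's code) of minimum-distance estimation of a Cauchy location parameter: θMKD minimizes the Kolmogorov (sup-norm) distance between the empirical and model CDFs, while θMWD minimizes an empirical 1-Wasserstein distance computed from matched order statistics against a model draw. The grid search, sample sizes, and common-random-numbers trick are illustrative choices, not part of the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
theta_true = 0.0

# Simulated Cauchy(theta_true, 1) sample via the inverse-CDF transform
u = rng.uniform(size=n)
sample = theta_true + np.tan(np.pi * (u - 0.5))
xs = np.sort(sample)

def cauchy_cdf(x, theta):
    # CDF of the Cauchy distribution with location theta, scale 1
    return 0.5 + np.arctan(x - theta) / np.pi

def kolmogorov_dist(theta):
    # Sup-norm distance between empirical and model CDFs,
    # evaluated on both sides of each jump of the empirical CDF
    F = cauchy_cdf(xs, theta)
    i = np.arange(1, n + 1)
    return max(np.max(np.abs(F - i / n)), np.max(np.abs(F - (i - 1) / n)))

# A fixed set of uniforms for the model draw keeps the Wasserstein
# objective smooth in theta (a common-random-numbers choice of ours)
u_model = np.sort(np.random.default_rng(1).uniform(size=n))

def wasserstein_dist(theta):
    # Empirical 1-Wasserstein distance between two equal-size samples:
    # mean absolute difference of matched order statistics
    model = theta + np.tan(np.pi * (u_model - 0.5))
    return np.mean(np.abs(xs - model))

# Grid search over candidate locations for both minimum-distance estimates
grid = np.linspace(-2, 2, 401)
theta_mkd = grid[np.argmin([kolmogorov_dist(t) for t in grid])]
theta_mwd = grid[np.argmin([wasserstein_dist(t) for t in grid])]
print(f"MKD estimate: {theta_mkd:.3f}, MWD estimate: {theta_mwd:.3f}")
```

Because the Cauchy model has no finite first moment, the Wasserstein objective here is driven by the heaviest sample and model draws, which is exactly the instability the abstract attributes to θMWD under heavy tails; the Kolmogorov objective is bounded by construction and unaffected by tail magnitudes.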