Title:

Challenges and Opportunities in Assumption-free and Robust Inference

Abstract: 

With the growing application of data science to complex, high-stakes tasks, ensuring the reliability of statistical inference methods has become increasingly critical. This talk considers two key challenges to achieving this goal: model misspecification and data corruption, highlighting the difficulties they pose and potential solutions. In the first part, we investigate the problem of distribution-free evaluation of algorithmic risk, uncovering fundamental limitations on answering such questions with limited amounts of data. To navigate this challenge, we will also discuss how incorporating an assumption of algorithmic stability can help. The second part focuses on constructing robust confidence intervals in the presence of arbitrary data contamination. We show that when the proportion of contamination is unknown, uncertainty quantification incurs a substantial cost: optimal robust confidence intervals must be significantly wider.

Short bio:

Yuetian Luo is a postdoctoral scholar in the Data Science Institute at the University of Chicago, advised by Professor Rina Foygel Barber. He received his Ph.D. in Statistics from the University of Wisconsin-Madison under the supervision of Professor Anru Zhang. His research interests lie broadly in distribution-free inference, the computational complexity of statistical inference, tensor learning, robust statistics, and non-convex optimization.