Doubly-robust inference and optimality in structure-agnostic models with smoothness
Posted: 2024-06-07
  • Speaker: Matteo Bonvini (Rutgers University)
  • Time: July 2, 2024, 10:00 (Beijing time)
  • Venue: Lecture Room 1417, Administration Building, Zijingang Campus, Zhejiang University
  • Host: Center for Data Science, Zhejiang University

Abstract: We study the problem of constructing an estimator of the average treatment effect (ATE) that exhibits doubly-robust asymptotic linearity (DRAL). This is a stronger requirement than doubly-robust consistency: a DRAL estimator can yield asymptotically valid Wald-type confidence intervals even when the propensity score or the outcome model is inconsistently estimated. By contrast, the celebrated doubly-robust, augmented-IPW (AIPW) estimator generally requires consistent estimation of both nuisance functions for standard root-n inference. We make three main contributions. First, we propose a new hybrid class of distributions obtained by augmenting the structure-agnostic class introduced in Balakrishnan et al. (2023) with additional smoothness constraints. While DRAL is generally not possible in the pure structure-agnostic class, we show that it can be attained in the new hybrid one. Second, we calculate minimax lower bounds for estimating the ATE in the new class, as well as in the pure structure-agnostic one. Third, building upon the literature on doubly-robust inference (van der Laan, 2014; Benkeser et al., 2017; Dukes et al., 2021), we propose a new estimator of the ATE that enjoys DRAL. Under certain conditions, we show that its rate of convergence in the new class can be much faster than that achieved by the AIPW estimator and, in particular, matches the minimax lower bound rate, thereby establishing its optimality. Finally, we clarify the connection between DRAL estimators and those based on higher-order influence functions (Robins et al., 2017) and complement our theoretical findings with simulations.
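For readers unfamiliar with the AIPW estimator that the abstract takes as its baseline, the following is a minimal numpy sketch. The function names, the toy data-generating process, and the use of true (rather than estimated) nuisance functions are all illustrative assumptions, not part of the talk; the point is only to show the form of the AIPW influence-function average on which Wald-type intervals are based.

```python
import numpy as np

def aipw_ate(y, a, pscore, mu0, mu1):
    """Augmented IPW (doubly-robust) point estimate and standard error
    of the ATE.

    y      : observed outcomes
    a      : binary treatment indicator
    pscore : (estimated) propensity scores P(A=1 | X)
    mu0    : (estimated) outcome regression E[Y | A=0, X]
    mu1    : (estimated) outcome regression E[Y | A=1, X]
    """
    # Uncentered efficient influence function of the ATE
    psi = (mu1 - mu0
           + a * (y - mu1) / pscore
           - (1 - a) * (y - mu0) / (1 - pscore))
    return psi.mean(), psi.std(ddof=1) / np.sqrt(len(y))

# Toy simulation with true ATE = 2.0 (illustrative, not from the talk)
rng = np.random.default_rng(0)
n = 20_000
x = rng.normal(size=n)
p = 1.0 / (1.0 + np.exp(-x))        # true propensity score
a = rng.binomial(1, p)
y = 2.0 * a + x + rng.normal(size=n)

# Here we plug in the true nuisances; in practice both would be
# estimated, which is where double robustness matters.
est, se = aipw_ate(y, a, p, mu0=x, mu1=x + 2.0)
```

With both nuisances correct, `est ± 1.96 * se` gives the usual root-n Wald interval; the talk concerns what happens to such intervals when one nuisance estimate is inconsistent.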


Bio: Matteo Bonvini is an Assistant Professor in the Department of Statistics at Rutgers, the State University of New Jersey. He obtained a Ph.D. in Statistics from Carnegie Mellon University in 2023, advised by Professor Edward H. Kennedy, and his thesis was awarded the department's Umesh K. Gavaskar Best Dissertation Award. Before starting the Ph.D., he worked as an Analyst at Cornerstone Research and graduated with a B.A. in Statistics from Harvard University in 2016. His current research focuses on developing methodology at the intersection of nonparametric statistics and causal inference.