Kato, M. (2024). Debiased Regression for Root-N-Consistent Conditional Mean Estimation (preprint). arXiv:2411.11748v1 [stat.ML]
This paper develops a debiased estimator for regression functions that achieves √n-consistency and asymptotic normality, even in high-dimensional and nonparametric settings where conventional estimators typically converge at slower rates and do not support standard large-sample inference.
The author proposes a debiasing technique that adds a bias-correction term to an initial regression estimator. The correction term estimates the conditional expected residual of the initial estimator, that is, the conditional mean of the outcome minus the initial fit given the covariates; adding this estimate back removes the first-order bias of the initial fit. The paper considers kernel regression and series regression for estimating the conditional expected residual. The theoretical analysis draws on semiparametric theory, in particular efficient influence functions, and uses the Donsker condition and sample splitting to control the empirical-process terms.
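As a rough illustration of this two-stage construction, the sketch below fits an initial regression on one half of the data, regresses its residuals on the covariates with a simple Nadaraya-Watson kernel smoother on the other half, and adds the estimated conditional expected residual back at the evaluation point. The choice of learners, the two-fold split, and the fixed bandwidth are assumptions made here for illustration, not the paper's prescribed implementation.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split


def kernel_residual_correction(X_train, resid_train, x_eval, bandwidth=0.5):
    """Nadaraya-Watson estimate of E[residual | X = x_eval] (illustrative choice)."""
    sq_dist = np.sum((X_train - x_eval) ** 2, axis=1)
    weights = np.exp(-sq_dist / (2.0 * bandwidth ** 2))
    return np.sum(weights * resid_train) / np.sum(weights)


def debiased_prediction(X, y, x_eval, bandwidth=0.5, seed=0):
    """Two-stage debiased estimate of E[Y | X = x_eval] with sample splitting."""
    # Split so the initial fit and the residual regression use separate samples.
    X1, X2, y1, y2 = train_test_split(X, y, test_size=0.5, random_state=seed)
    # Stage 1: initial (possibly biased) regression estimator.
    f_hat = GradientBoostingRegressor().fit(X1, y1)
    # Stage 2: estimate the conditional expected residual at x_eval from the
    # held-out fold, then add it back to the initial fit.
    resid2 = y2 - f_hat.predict(X2)
    correction = kernel_residual_correction(X2, resid2, x_eval, bandwidth)
    return f_hat.predict(x_eval.reshape(1, -1))[0] + correction


# Toy usage on synthetic data.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(500, 3))
y = np.sin(3.0 * X[:, 0]) + 0.5 * X[:, 1] + rng.normal(scale=0.3, size=500)
print(debiased_prediction(X, y, x_eval=np.array([0.2, -0.1, 0.4])))
```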
The proposed debiased estimator achieves √n-consistency and asymptotic normality under mild convergence-rate conditions on both the initial estimator and the conditional expected residual estimator. The estimator is also doubly robust: it remains consistent as long as at least one of the two components (the initial estimator or the bias-correction estimator) is consistent. In addition, the paper shows that the debiased estimator is semiparametrically efficient, in the sense that its asymptotic variance attains the theoretical lower bound.
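Schematically, the construction and the type of limit result described above can be written as follows; the notation is introduced here for illustration, and the exact rate conditions and variance expression are those given in the paper.

```latex
% Debiased estimator: initial fit plus estimated conditional expected residual.
\[
  \widehat{f}^{\mathrm{deb}}(x) \;=\; \widehat{f}(x) + \widehat{r}(x),
  \qquad
  \widehat{r}(x) \ \text{estimating} \ \mathbb{E}\bigl[\,Y - \widehat{f}(X) \mid X = x\,\bigr].
\]
% When the estimation errors of \widehat{f} and \widehat{r} shrink fast enough
% (roughly, their product is o(n^{-1/2})), a central limit theorem of the form
\[
  \sqrt{n}\,\bigl(\widehat{f}^{\mathrm{deb}}(x) - f_0(x)\bigr)
  \;\xrightarrow{d}\; \mathcal{N}\bigl(0,\, V(x)\bigr)
\]
% holds, with V(x) matching the semiparametric efficiency bound.
```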
The proposed debiasing method improves both the accuracy of regression estimators and the quality of the statistical inference they support, particularly in high-dimensional and nonparametric settings. Because the debiased estimator is √n-consistent and asymptotically normal, standard Wald-type confidence intervals and hypothesis tests become available, which slower-converging nonparametric estimators do not directly provide.
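For concreteness, asymptotic normality licenses intervals of the generic Wald form sketched below; the point estimate, variance estimate, and sample size are placeholders, and the paper's own variance estimator would be plugged in place of them.

```python
import math
from scipy.stats import norm


def wald_confidence_interval(estimate, var_hat, n, level=0.95):
    """Generic Wald-type interval justified by root-n consistency and
    asymptotic normality: estimate +/- z * sqrt(var_hat / n)."""
    z = norm.ppf(0.5 + level / 2.0)
    half_width = z * math.sqrt(var_hat / n)
    return estimate - half_width, estimate + half_width


# Placeholder numbers purely for illustration.
print(wald_confidence_interval(estimate=1.23, var_hat=0.8, n=500))
```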
This research significantly contributes to the field of statistical learning by providing a practical and theoretically sound method for obtaining √n-consistent estimators in challenging regression scenarios. This has important implications for various applications, including causal inference and regression discontinuity designs.
The paper primarily focuses on nonparametric regression for clarity, although the method is applicable to high-dimensional settings. Future research could explore specific implementations and applications of the debiasing technique in high-dimensional regression problems. Additionally, investigating the performance of different methods for estimating the conditional expected residual (e.g., series regression, random forests) could further enhance the estimator's practical utility.