Uniform Convergence Rate of the Kernel Regression Estimator Adaptive to Intrinsic Dimension in Presence of Censored Data

Published in Journal of Nonparametric Statistics, 2020

S. Bouzebda and T. El-hadjali

The focus of the present paper is the uniform-in-bandwidth consistency of kernel-type estimators of the regression function \(\mathbb{E}(\Psi(\mathbf{Y})\mid \mathbf{X}=\mathbf{x})\), derived by means of modern empirical process theory under weaker conditions on the kernel than previously used in the literature. Our theorems allow data-driven local bandwidths for these statistics. We extend existing uniform bounds on the kernel regression estimator and make it adaptive to the dimension of the underlying distribution of \(\mathbf{X}\), which is characterized by the so-called intrinsic dimension. Moreover, we show, in the same context, the uniform-in-bandwidth consistency of nonparametric inverse probability of censoring weighted (I.P.C.W.) estimators of the regression function under random censorship. Statistical applications to kernel-type estimators (density, regression, conditional distribution, derivative functions, entropy, mode and additive models) are given to motivate these results.
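To fix ideas, here is a minimal illustrative sketch of the I.P.C.W. idea mentioned above: a Nadaraya-Watson-type estimate of \(\mathbb{E}(\Psi(\mathbf{Y})\mid \mathbf{X}=\mathbf{x})\) in which uncensored observations are re-weighted by the inverse of a Kaplan-Meier estimate of the censoring survival function. The Gaussian kernel, the fixed scalar bandwidth `h`, and all function names are illustrative choices, not the estimators or conditions studied in the paper.

```python
import numpy as np

def km_censoring_survival(T, delta):
    """Kaplan-Meier estimate of the censoring survival G(t) = P(C > t),
    evaluated at each observed time T_i = min(Y_i, C_i).
    delta_i = 1 means the response was observed, delta_i = 0 means censored.
    Ties are ignored for simplicity (sketch only)."""
    n = len(T)
    order = np.argsort(T)
    G = np.ones(n)
    surv = 1.0
    for rank, idx in enumerate(order):
        G[idx] = surv                      # value just before T_i (no censoring tie assumed)
        if delta[idx] == 0:                # a censoring "event" makes G drop
            surv *= 1.0 - 1.0 / (n - rank)
    return G

def ipcw_kernel_regression(x, X, T, delta, h, psi=lambda y: y):
    """Sketch of an I.P.C.W. kernel regression estimate of E[psi(Y) | X = x]
    under right censoring, with a Gaussian kernel and bandwidth h."""
    G = km_censoring_survival(T, delta)
    w = delta / np.maximum(G, 1e-10)       # inverse-probability-of-censoring weights
    u = (x - X) / h
    K = np.exp(-0.5 * u ** 2) / np.sqrt(2.0 * np.pi)
    den = np.sum(w * K)
    return np.sum(w * K * psi(T)) / den if den > 0 else np.nan
```

A data-driven or local bandwidth, as allowed by the theorems, would replace the fixed `h` above with a value selected from the data (e.g. by cross-validation) possibly depending on the point `x`.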

Download paper here