Abstract

Consider a $p$-times differentiable unknown regression function $\theta$ of a $d$-dimensional measurement variable. Let $T(\theta)$ denote a derivative of $\theta$ of order $m$ and set $r = (p - m)/(2p + d)$. Let $\hat{T}_n$ denote an estimator of $T(\theta)$ based on a training sample of size $n$, and let $\|\hat{T}_n - T(\theta)\|_q$ be the usual $L^q$ norm of the restriction of $\hat{T}_n - T(\theta)$ to a fixed compact set. Under appropriate regularity conditions, it is shown that the optimal rate of convergence for $\|\hat{T}_n - T(\theta)\|_q$ is $n^{-r}$ if $0 < q < \infty$, while $(n^{-1} \log n)^r$ is the optimal rate if $q = \infty$.
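For concreteness, here is a worked instance of the rate formula (the specific values $p = 2$, $m = 0$, $d = 1$ are chosen for illustration and are not taken from the abstract): estimating the regression function itself means $m = 0$, and for a twice-differentiable $\theta$ of a one-dimensional variable,

$$ r = \frac{p - m}{2p + d} = \frac{2 - 0}{2 \cdot 2 + 1} = \frac{2}{5}, $$

so the optimal rates are $n^{-2/5}$ in $L^q$ for $0 < q < \infty$ and $(n^{-1} \log n)^{2/5}$ in the sup norm. Estimating a first derivative instead ($m = 1$) lowers the exponent to $r = 1/5$, so derivative estimation converges strictly more slowly.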

Keywords

Mathematics, Differentiable function, Combinatorics, Estimator, Rate of convergence, Nonparametric regression, Order (exchange), Regression function, Mathematical analysis, Statistics

Related Publications

Multivariate Smoothing Spline Functions

Given data $z_i = g(t_i) + \varepsilon_i$, $1 \leqq i \leqq n$, where $g$ is the unknown function, the $t_i$ are known $d$-dimensional variables in a domain $\Omega$, and the $\v...

1984 · SIAM Journal on Numerical Analysis · 109 citations
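For context, a standard way to define a multivariate smoothing spline in this setting (a textbook formulation, not quoted from the truncated abstract above) is as the minimizer, over suitably smooth $g$, of the penalized least-squares criterion

$$ \frac{1}{n} \sum_{i=1}^{n} \bigl(z_i - g(t_i)\bigr)^2 + \lambda J_m(g), \qquad J_m(g) = \sum_{|\alpha| = m} \frac{m!}{\alpha!} \int_{\Omega} \bigl(D^{\alpha} g\bigr)^2 \, dt, $$

where $\lambda > 0$ balances fidelity to the data $z_i$ against the thin-plate roughness penalty $J_m$.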

Publication Info

Year: 1982
Type: article
Volume: 10
Issue: 4
Citations: 1477 (OpenAlex)
Access: Closed

Cite This

Charles J. Stone (1982). Optimal Global Rates of Convergence for Nonparametric Regression. The Annals of Statistics, 10(4), 1040–1053. https://doi.org/10.1214/aos/1176345969

Identifiers

DOI
10.1214/aos/1176345969