Abstract
The regression model $\mathbf{y} = g(\mathbf{x}) + \mathbf{\varepsilon}$ and least-squares estimation are studied in a general context. By making use of empirical process theory, it is shown that entropy conditions on the class $\mathscr{G}$ of possible regression functions imply $L^2$-consistency of the least-squares estimator $\hat{\mathbf{g}}_n$ of $g$. This result is applied in parametric and nonparametric regression.
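The setup above can be illustrated numerically: draw data from $\mathbf{y} = g(\mathbf{x}) + \mathbf{\varepsilon}$, compute the least-squares estimator $\hat{\mathbf{g}}_n$ over a class $\mathscr{G}$, and watch the $L^2$ error shrink as $n$ grows. The sketch below is not from the paper; the choice of $g$ (a sine), of $\mathscr{G}$ (polynomials of fixed degree, a class with finite entropy), and of the noise scale are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def true_g(x):
    # The unknown regression function g (illustrative choice).
    return np.sin(2 * np.pi * x)

def lse_over_polynomials(x, y, degree):
    # Least-squares estimator g_hat_n over the class G of polynomials
    # of the given degree: minimizes sum_i (y_i - g(x_i))^2 over G.
    coeffs = np.polynomial.polynomial.polyfit(x, y, degree)
    return lambda t: np.polynomial.polynomial.polyval(t, coeffs)

def l2_error(g_hat, n_grid=2000):
    # Approximate L2 distance between g_hat and the true g on [0, 1].
    t = np.linspace(0.0, 1.0, n_grid)
    return np.sqrt(np.mean((g_hat(t) - true_g(t)) ** 2))

errors = {}
for n in (50, 500, 5000):
    x = rng.uniform(0.0, 1.0, size=n)
    y = true_g(x) + rng.normal(scale=0.3, size=n)
    errors[n] = l2_error(lse_over_polynomials(x, y, degree=7))

print(errors)  # L2 error decreases as the sample size n grows
```

Because the polynomial class here is fixed and finite-dimensional, consistency is elementary; the paper's contribution is that entropy conditions yield the same $L^2$-consistency for much richer, possibly nonparametric, classes $\mathscr{G}$.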
Publication Info
- Year: 1987
- Type: article
- Volume: 15
- Issue: 2
- Citations: 45
- Access: Closed
Identifiers
- DOI: 10.1214/aos/1176350362