Abstract

For the problem of estimating a regression function, $\mu$ say, subject to shape constraints, like monotonicity or convexity, it is argued that the divergence of the maximum likelihood estimator provides a useful measure of the effective dimension of the model. Inequalities are derived for the expected mean squared error of the maximum likelihood estimator and the expected residual sum of squares. These generalize equalities from the case of linear regression. As an application, it is shown that the maximum likelihood estimator of the error variance $\sigma^2$ is asymptotically normal with mean $\sigma^2$ and variance $2\sigma^4/n$. For monotone regression, it is shown that the maximum likelihood estimator of $\mu$ attains the optimal rate of convergence, and a bias correction to the maximum likelihood estimator of $\sigma^2$ is derived.
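For monotone regression, the shape-restricted maximum likelihood estimator under Gaussian errors is the isotonic least-squares fit, computable by the pool-adjacent-violators algorithm (PAVA). A minimal sketch in Python; the function names are illustrative, and the degrees-of-freedom-style variance adjustment shown (dividing by $n$ minus the number of distinct fitted values, the divergence for monotone regression) is meant only to convey the bias-correction idea from the abstract, not the paper's exact formula:

```python
import numpy as np

def pava(y):
    """Pool Adjacent Violators: least-squares fit subject to a
    nondecreasing constraint -- the MLE of mu under Gaussian errors."""
    vals, wts = [], []  # block means and block sizes
    for yi in np.asarray(y, dtype=float):
        vals.append(yi)
        wts.append(1.0)
        # merge adjacent blocks while monotonicity is violated
        while len(vals) > 1 and vals[-2] > vals[-1]:
            w = wts[-2] + wts[-1]
            v = (wts[-2] * vals[-2] + wts[-1] * vals[-1]) / w
            vals[-2:] = [v]
            wts[-2:] = [w]
    # expand block means back to one fitted value per observation
    return np.concatenate([np.full(int(w), v) for v, w in zip(vals, wts)])

def sigma2_hat(y):
    """Naive MLE of sigma^2 (RSS/n) alongside an illustrative
    degrees-of-freedom-style adjustment that divides RSS by n minus
    the number of distinct fitted values (the effective dimension
    of the monotone fit)."""
    y = np.asarray(y, dtype=float)
    fit = pava(y)
    rss = np.sum((y - fit) ** 2)
    n, d = len(y), len(np.unique(fit))
    return rss / n, rss / (n - d)
```

For example, `pava([1, 3, 2])` pools the violating pair into its mean, returning `[1, 2.5, 2.5]`; the number of distinct fitted values (here 2) plays the role of the model's effective dimension.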

Keywords

Mathematics, Estimator, Statistics, Mean squared error, Bias of an estimator, Minimum-variance unbiased estimator, Applied mathematics, Convexity, Restricted maximum likelihood, Consistent estimator, Rate of convergence, Monotonic function, M-estimator, Estimation theory, Mathematical analysis

Publication Info

Year
2000
Type
article
Volume
28
Issue
4
Citations
154
Access
Closed

Citation Metrics

154 (OpenAlex)

Cite This

Mary C. Meyer, Michael Woodroofe (2000). On the degrees of freedom in shape-restricted regression. The Annals of Statistics, 28(4). https://doi.org/10.1214/aos/1015956708

Identifiers

DOI
10.1214/aos/1015956708