Abstract
In the analysis of data it is often assumed that observations y1, y2, …, yn are independently normally distributed with constant variance and with expectations specified by a model linear in a set of parameters θ. In this paper we make the less restrictive assumption that such a normal, homoscedastic, linear model is appropriate after some suitable transformation has been applied to the y's. Inferences about the transformation and about the parameters of the linear model are made by computing the likelihood function and the relevant posterior distribution. The contributions of normality, homoscedasticity and additivity to the transformation are separated. The relation of the present methods to earlier procedures for finding transformations is discussed. The methods are illustrated with examples.
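As a rough illustration of the likelihood approach the abstract describes (not the authors' exact computations), the sketch below evaluates the profile log-likelihood of a power-transformation parameter λ for a linear model fitted by least squares, including the Jacobian term (λ − 1) Σ log yi of the transformation. The helper name boxcox_profile_loglik, the grid of λ values, and the simulated data are assumptions made only for this example.

```python
import numpy as np

def boxcox_profile_loglik(y, X, lam):
    """Profile log-likelihood of the power-transformation parameter lam
    for the linear model z = X @ beta + eps, where z is the transformed
    response. Assumes all y > 0."""
    n = len(y)
    # Power transformation; the lam -> 0 limit is the log transform.
    if abs(lam) < 1e-8:
        z = np.log(y)
    else:
        z = (y**lam - 1.0) / lam
    # Least-squares fit of the linear model to the transformed response.
    beta, *_ = np.linalg.lstsq(X, z, rcond=None)
    resid = z - X @ beta
    sigma2 = resid @ resid / n  # ML estimate of the error variance
    # Profile log-likelihood up to an additive constant, including the
    # Jacobian contribution of the transformation.
    return -0.5 * n * np.log(sigma2) + (lam - 1.0) * np.log(y).sum()

# Hypothetical usage: scan a grid of lambda values and pick the maximizer.
rng = np.random.default_rng(0)
x = rng.uniform(1, 10, size=100)
X = np.column_stack([np.ones_like(x), x])
y = np.exp(0.2 + 0.1 * x + rng.normal(scale=0.1, size=100))  # simulated positive response

grid = np.linspace(-1, 1, 81)
ll = [boxcox_profile_loglik(y, X, lam) for lam in grid]
print("lambda maximizing the profile log-likelihood:", grid[int(np.argmax(ll))])
```

In this simulated setting the response is linear on the log scale, so the maximizing λ should fall near 0, corresponding to the log transformation.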
Publication Info
- Year: 1964
- Type: article
- Volume: 26
- Issue: 2
- Pages: 211-243
- Citations: 14698
- Access: Closed
Identifiers
- DOI: 10.1111/j.2517-6161.1964.tb00553.x