Abstract
In many cases an optimum or computationally convenient test of a simple hypothesis $H_0$ against a simple alternative $H_1$ may be given in the following form. Reject $H_0$ if $S_n = \sum^n_{j=1} X_j \leqq k,$ where $X_1, X_2, \cdots, X_n$ are $n$ independent observations of a chance variable $X$ whose distribution depends on the true hypothesis and where $k$ is some appropriate number. In particular the likelihood ratio test for fixed sample size can be reduced to this form. It is shown that with each test of the above form there is associated an index $\rho$. If $\rho_1$ and $\rho_2$ are the indices corresponding to two alternative tests, $e = \log \rho_1/\log \rho_2$ measures the relative efficiency of these tests in the following sense. For large samples, a sample of size $n$ with the first test will give about the same probabilities of error as a sample of size $en$ with the second test. To obtain the above result, use is made of the fact that $P(S_n \leqq na)$ behaves roughly like $m^n$ where $m$ is the minimum value assumed by the moment generating function of $X - a$. It is shown that if $H_0$ and $H_1$ specify probability distributions of $X$ which are very close to each other, one may approximate $\rho$ by assuming that $X$ is normally distributed.
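The central fact quoted above, that $P(S_n \leqq na)$ behaves roughly like $m^n$ where $m$ is the minimum of the moment generating function of $X - a$, can be checked numerically. The sketch below is ours, not the paper's: it takes $X$ to be Bernoulli($p$) purely for illustration (the minimization over $t$ then has the closed form $m = (p/a)^a((1-p)/(1-a))^{1-a}$), and compares $(1/n)\log P(S_n \leqq na)$, computed from the exact binomial lower tail, with $\log m$:

```python
import math

def chernoff_m(p, a):
    """m = min over t of E[exp(t(X - a))] for X ~ Bernoulli(p), with a < p.
    Setting the derivative in t to zero gives the closed form
    m = (p/a)^a * ((1-p)/(1-a))^(1-a), i.e. m = exp(-KL(a || p))."""
    return (p / a) ** a * ((1 - p) / (1 - a)) ** (1 - a)

def log_tail(n, p, a):
    """(1/n) * log P(S_n <= n*a) for S_n ~ Binomial(n, p), computed exactly
    from the binomial pmf via a log-sum-exp over the lower tail."""
    k = int(n * a + 1e-9)  # largest integer <= n*a (epsilon guards float rounding)
    logs = [math.lgamma(n + 1) - math.lgamma(j + 1) - math.lgamma(n - j + 1)
            + j * math.log(p) + (n - j) * math.log(1 - p)
            for j in range(k + 1)]
    mx = max(logs)
    return (mx + math.log(sum(math.exp(L - mx) for L in logs))) / n

p, a = 0.5, 0.3
print(math.log(chernoff_m(p, a)))   # log m, about -0.0823 for these values
for n in (50, 200, 800):
    print(n, log_tail(n, p, a))     # rises toward log m as n grows
```

As $n$ grows, the normalized log tail probability approaches $\log m$ from below; the gap is of order $(\log n)/n$, which is the sense in which $P(S_n \leqq na)$ "behaves roughly like" $m^n$.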
Related Publications
The Probability Function of the Product of Two Normally Distributed Variables
Let $x$ and $y$ follow a normal bivariate probability function with means $\bar X, \bar Y$, standard deviations $\sigma_1, \sigma_2$, respectively, $r$ the coefficient of co...
Robust Estimation of a Location Parameter
This paper contains a new approach toward a theory of robust estimation; it treats in detail the asymptotic theory of estimating a location parameter for contaminated normal dis...
On a Test of Whether one of Two Random Variables is Stochastically Larger than the Other
Let $x$ and $y$ be two random variables with continuous cumulative distribution functions $f$ and $g$. A statistic $U$ depending on the relative ranks of the $x$'s and $y$'s is ...
Modified Randomization Tests for Nonparametric Hypotheses
Suppose $X_1, \cdots, X_m, Y_1, \cdots, Y_n$ are $m + n = N$ independent random variables, the $X$'s identically distributed and the $Y$'s identically distributed, each with a...
Limit of the Smallest Eigenvalue of a Large Dimensional Sample Covariance Matrix
In this paper, the authors show that the smallest (if $p \leq n$) or the $(p - n + 1)$-th smallest (if $p > n$) eigenvalue of a sample covariance matrix of the form $(1/n)XX...
Publication Info
- Year
- 1952
- Type
- article
- Volume
- 23
- Issue
- 4
- Pages
- 493-507
- Citations
- 3556
- Access
- Closed
Identifiers
- DOI
- 10.1214/aoms/1177729330