Abstract

Information theory provides a constructive criterion for setting up probability distributions on the basis of partial knowledge, and leads to a type of statistical inference which is called the maximum-entropy estimate. It is the least biased estimate possible on the given information; i.e., it is maximally noncommittal with regard to missing information. If one considers statistical mechanics as a form of statistical inference rather than as a physical theory, it is found that the usual computational rules, starting with the determination of the partition function, are an immediate consequence of the maximum-entropy principle. In the resulting "subjective statistical mechanics," the usual rules are thus justified independently of any physical argument, and in particular independently of experimental verification; whether or not the results agree with experiment, they still represent the best estimates that could have been made on the basis of the information available. It is concluded that statistical mechanics need not be regarded as a physical theory dependent for its validity on the truth of additional assumptions not contained in the laws of mechanics (such as ergodicity, metric transitivity, equal a priori probabilities, etc.). Furthermore, it is possible to maintain a sharp distinction between its physical and statistical aspects. The former consists only of the correct enumeration of the states of a system and their properties; the latter is a straightforward example of statistical inference.
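The abstract's central claim, that the usual rules "starting with the determination of the partition function" follow from maximizing entropy subject to the given information, can be sketched numerically. As a minimal illustration (the state energies, function names, and bisection solver below are this sketch's own assumptions, not anything from the paper): for discrete states with energies E_i and a known mean energy, the maximum-entropy distribution takes the Gibbs form p_i = exp(-λE_i)/Z, where Z is the partition function and the multiplier λ is fixed by the mean-energy constraint.

```python
import math

def maxent_distribution(energies, mean_energy, lo=-50.0, hi=50.0):
    """Maximize entropy over states with given energies, subject to
    sum(p_i * E_i) = mean_energy. The maximizer has the Gibbs form
    p_i = exp(-lam * E_i) / Z; we find lam by bisection."""
    def avg(lam):
        # Mean energy under the Gibbs distribution with multiplier lam.
        z = sum(math.exp(-lam * e) for e in energies)  # partition function
        return sum(e * math.exp(-lam * e) for e in energies) / z
    # avg(lam) decreases monotonically in lam (from max(E) to min(E)),
    # so bisection on [lo, hi] converges to the constraint-satisfying lam.
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if avg(mid) > mean_energy:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    z = sum(math.exp(-lam * e) for e in energies)
    return [math.exp(-lam * e) / z for e in energies]

# With equally spaced energies and the mean at the midpoint, lam = 0:
# the least biased ("maximally noncommittal") distribution is uniform.
p = maxent_distribution([0.0, 1.0, 2.0], mean_energy=1.0)
```

Any additional expectation constraint would simply add another Lagrange multiplier to the exponent, which is how the standard ensembles arise in the paper's framework.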

Keywords

Statistical mechanics, Statistical inference, Principle of maximum entropy, Statistical theory, Entropy (arrow of time), Inference, Information theory, Mathematics, Statistical hypothesis testing, Physical law, Statistical physics, Ergodicity, Computer science, Statistics, Physics, Artificial intelligence, Quantum mechanics

Related Publications

Black Holes and Entropy

There are a number of similarities between black-hole physics and thermodynamics. Most striking is the similarity in the behaviors of black-hole area and of entropy: Both quanti...

1973 · Physical Review D · 6767 citations

Publication Info

Year: 1957
Type: Article
Volume: 106
Issue: 4
Pages: 620-630
Citations: 12513
Access: Closed


Citation Metrics

12513 citations (source: OpenAlex)

Cite This

E. T. Jaynes (1957). Information Theory and Statistical Mechanics. Physical Review, 106(4), 620-630. https://doi.org/10.1103/physrev.106.620

Identifiers

DOI
10.1103/physrev.106.620