International
Tables for
Crystallography
Volume F
Crystallography of biological macromolecules
Edited by M. G. Rossmann and E. Arnold

International Tables for Crystallography (2006). Vol. F. ch. 16.2, pp. 346-347

Section 16.2.2.3. Jaynes' maximum-entropy principle

G. Bricogne*

*Laboratory of Molecular Biology, Medical Research Council, Cambridge CB2 2QH, England
Correspondence e-mail: gb10@mrc-lmb.cam.ac.uk

16.2.2.3. Jaynes' maximum-entropy principle


From the fundamental theorems just stated, which may be recognized as Gibbs' argument in a different guise, Jaynes' own maximum-entropy argument proceeds with striking lucidity and constructive simplicity, along the following lines:

  • (1) experimental observation of, or `data acquisition' on, a given system enables us to progress from an initial state of uncertainty to a state of lesser uncertainty about that system;

  • (2) uncertainty reflects the existence of numerous possibilities of accounting for the available data, viewed as constraints, in terms of a physical model of the internal degrees of freedom of the system;

  • (3) new data, viewed as new constraints, reduce the range of these possibilities;

  • (4) conversely, any step in our treatment of the data that would further reduce that range of possibilities amounts to applying extra constraints (even if we do not know what they are) which are not warranted by the available data;

  • (5) hence Jaynes' rule: `The probability assignment over the range of possibilities [i.e. our picture of residual uncertainty] shall be the one with maximum entropy consistent with the available data, so as to remain maximally non-committal with respect to the missing data'.

The only requirement for this analysis to be applicable is that the `ranges of possibilities' to which it refers should be representable (or well approximated) by ensembles of abstract messages emanating from a random source. The entropy to be maximized is then the entropy per symbol of that source.
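Jaynes' rule can be made concrete with his well-known dice illustration, which is not part of this text but follows the same logic: suppose the only datum about a six-sided die is that its average throw is 4.5. The maximum-entropy assignment consistent with that constraint is the exponential (Gibbs) form p_i ∝ exp(−λ i), with λ fixed by the mean; the sketch below, with all numerical choices (the target mean, the bisection bracket) being illustrative assumptions, finds λ by bisection.

```python
# Illustrative sketch (not from the text): the maximum-entropy assignment for a
# die constrained only by its mean throw.  The maximizer over the probability
# simplex is the exponential family p_i proportional to exp(-lam * i); the
# multiplier lam is pinned down by the mean constraint, here via bisection.
import math

faces = [1, 2, 3, 4, 5, 6]
target_mean = 4.5          # assumed datum for illustration


def gibbs(lam):
    """Exponential-family assignment p_i proportional to exp(-lam * i)."""
    w = [math.exp(-lam * x) for x in faces]
    z = sum(w)
    return [wi / z for wi in w]


def mean(lam):
    """Expected throw under the Gibbs assignment with multiplier lam."""
    return sum(x * pi for x, pi in zip(faces, gibbs(lam)))


# mean(lam) decreases monotonically in lam, so bisection on an assumed
# bracket [-5, 5] (wide enough to contain the root) is safe.
lo, hi = -5.0, 5.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if mean(mid) > target_mean:
        lo = mid          # mean too high: increase lam
    else:
        hi = mid
lam = 0.5 * (lo + hi)

p = gibbs(lam)
entropy = -sum(pi * math.log(pi) for pi in p)
```

The resulting assignment is tilted toward the high faces just enough to meet the datum, and its entropy is strictly below log 6 (the unconstrained maximum): the new constraint has reduced, but not over-reduced, the range of possibilities.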

The final form of the maximum-entropy criterion is thus that q(s) should be chosen so as to maximize, under the constraints expressing the knowledge of newly acquired data, its entropy
\[
{\cal S}_{m}(q) = -\int_{V} q({\bf s})\,\log\!\left[{q({\bf s})}/{m({\bf s})}\right]\,{\rm d}\mu({\bf s})
\eqno(16.2.2.4)
\]
relative to the `prior prejudice' m(s), which maximizes H in the absence of these data.
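The two defining properties of this relative entropy can be checked on a discrete grid of `messages', where the integral of (16.2.2.4) becomes a sum: S_m(q) is never positive, and it vanishes exactly when q coincides with the prior prejudice m. The distributions below are arbitrary illustrative choices.

```python
# Discrete sketch of the relative entropy of equation (16.2.2.4):
#     S_m(q) = -sum_i q_i * log(q_i / m_i)
# It satisfies S_m(q) <= 0 with equality iff q = m, so maximizing it over the
# feasible set keeps q as close to the prior prejudice m as the data allow.
import math


def rel_entropy(q, m):
    """S_m(q) on a discrete grid; terms with q_i = 0 contribute nothing."""
    return -sum(qi * math.log(qi / mi) for qi, mi in zip(q, m) if qi > 0)


m = [0.25, 0.25, 0.25, 0.25]   # uniform prior prejudice (assumed)
q_prior = m[:]                  # no data acquired: q = m
q_data = [0.4, 0.3, 0.2, 0.1]   # data have committed us further (assumed)

s_prior = rel_entropy(q_prior, m)   # attains the maximum, zero
s_data = rel_entropy(q_data, m)     # strictly negative
```

Any departure from m therefore costs entropy, which is precisely why the criterion only departs from the prior when a data constraint forces it to.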
