International
Tables for
Crystallography
Volume F
Crystallography of biological macromolecules
Edited by M. G. Rossmann and E. Arnold

International Tables for Crystallography (2006). Vol. F, ch. 16.2, p. 346

Section 16.2.1. Introduction

G. Bricogne*

Laboratory of Molecular Biology, Medical Research Council, Cambridge CB2 2QH, England
Correspondence e-mail: gb10@mrc-lmb.cam.ac.uk

16.2.1. Introduction


The modern concept of entropy originated in the field of statistical thermodynamics, in connection with the study of large material systems in which the number of internal degrees of freedom is much greater than the number of externally controllable degrees of freedom. This concept played a central role in the process of building a quantitative picture of the multiplicity of microscopic states compatible with given macroscopic constraints, as a measure of how much remains unknown about the detailed fine structure of a system when only macroscopic quantities attached to that system are known. The collection of all such microscopic states was introduced by Gibbs under the name `ensemble', and he deduced his entire formalism for statistical mechanics from the single premise that the equilibrium picture of a material system under given macroscopic constraints is dominated by that configuration which can be realized with the greatest combinatorial multiplicity (i.e. which has maximum entropy) while obeying these constraints.
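Gibbs' premise can be made concrete with a small numerical sketch (not part of the original text; the function names and the toy 'energy levels' are illustrative assumptions). Among all distributions over a set of discrete states with a prescribed mean energy, the one of maximum entropy has the exponential (Boltzmann) form p_i ∝ exp(−λE_i); the sketch finds the multiplier λ by bisection and checks that the result beats an arbitrary competitor with the same mean:

```python
import math

def entropy(p):
    """Gibbs/Shannon entropy -sum p ln p (natural log); zero terms skipped."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def max_entropy_given_mean(energies, target_mean, lo=-50.0, hi=50.0):
    """Illustrative sketch: the maximum-entropy distribution with a fixed
    mean has the form p_i ~ exp(-lam * E_i); solve for lam by bisection."""
    def mean(lam):
        w = [math.exp(-lam * e) for e in energies]
        z = sum(w)  # the partition function
        return sum(wi * e for wi, e in zip(w, energies)) / z
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean(mid) > target_mean:
            lo = mid  # mean energy decreases as lam increases
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(-lam * e) for e in energies]
    z = sum(w)
    return [wi / z for wi in w]

# Four equally spaced levels with mean fixed at the centre: the
# maximum-entropy answer is the uniform distribution (lam = 0), and its
# entropy exceeds that of any other distribution with the same mean,
# e.g. one concentrated on the two extreme levels.
p = max_entropy_given_mean([0, 1, 2, 3], 1.5)
```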

The notion of an ensemble and the central role of entropy remained confined to statistical mechanics for some time, then were adopted in new fields in the late 1940s. Norbert Wiener studied Brownian motion, and subsequently time series of random events, by similar methods, considering in the latter an ensemble of messages, i.e. `a repertory of possible messages, and over that repertory a measure determining the probability of these messages' (Wiener, 1949[link]). At about the same time, Shannon created information theory and formulated his fundamental theorem relating the entropy of a source of random symbols to the capacity of the channel required to transmit the ensemble of messages generated by that source with an arbitrarily small error rate (Shannon & Weaver, 1949[link]). Finally, Jaynes (1957[link], 1968[link], 1983[link]) realized that the scope of the principle of maximum entropy could be extended far beyond the confines of statistical mechanics or communications engineering, and could provide the basis for a general theory (and philosophy) of statistical inference and `data processing'.
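Shannon's source entropy, H = −Σ p_i log₂ p_i bits per symbol, is easily computed from symbol frequencies; his noiseless-coding theorem states that no lossless code can use fewer bits per symbol on average, and his channel theorem that a channel of capacity at least H can carry the source with arbitrarily small error. A minimal sketch (not from the original; the function name is an assumption) estimating the empirical entropy of a message:

```python
import math
from collections import Counter

def source_entropy_bits(message):
    """Empirical Shannon entropy of a message, in bits per symbol,
    computed from the observed symbol frequencies."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A one-symbol source carries no information; a uniform source over k
# symbols attains the maximum of log2(k) bits per symbol.
```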

The relevance of Jaynes' ideas to probabilistic direct methods was investigated by the author (Bricogne, 1984[link]). It was shown that there is an intimate connection between the maximum-entropy method and an enhancement of the probabilistic techniques of conventional direct methods known as the `saddlepoint method', some aspects of which have already been dealt with in Section 1.3.4.5.2[link] in Chapter 1.3 of IT B (Bricogne, 2001[link]).

References

Bricogne, G. (1984). Maximum entropy and the foundations of direct methods. Acta Cryst. A40, 410–445.
Bricogne, G. (2001). Fourier transforms in crystallography: theory, algorithms and applications. In International tables for crystallography, Vol. B. Reciprocal space, edited by U. Shmueli, 2nd ed., ch. 1.3. Dordrecht: Kluwer Academic Publishers.
Jaynes, E. T. (1957). Information theory and statistical mechanics. Phys. Rev. 106, 620–630.
Jaynes, E. T. (1968). Prior probabilities. IEEE Trans. SSC, 4, 227–241.
Jaynes, E. T. (1983). Papers on probability, statistics and statistical physics. Dordrecht: Reidel.
Shannon, C. E. & Weaver, W. (1949). The mathematical theory of communication. Urbana: University of Illinois Press.
Wiener, N. (1949). Extrapolation, interpolation and smoothing of stationary time series. Cambridge, MA: MIT Press.

