Results for DC.creator="G." AND DC.creator="Bricogne" in Section 16.2.2 of Volume F
Jaynes' maximum-entropy formalism
International Tables for Crystallography (2012). Vol. F, Section 16.2.2.4, pp. 434-435 [ doi:10.1107/97809553602060000851 ]
Jaynes' maximum-entropy formalism 16.2.2.4. Jaynes' maximum-entropy formalism Jaynes (1957) solved the problem of explicitly determining such maximum-entropy distributions in the case of general linear constraints, using an analytical apparatus first exploited by Gibbs in statistical mechanics. The maximum-entropy distribution , under the prior prejudice m(s), satisfying the ...
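The snippet above describes Jaynes' result: under a linear constraint on the expectation of some quantity, the entropy-maximizing distribution relative to a prior m(s) is an exponential (Gibbs-type) reweighting of that prior, with a Lagrange multiplier fixed by the constraint. A minimal sketch of that construction, assuming a finite alphabet, a single constraint on the mean, and a bisection search for the multiplier (the function and variable names here are illustrative, not from the source):

```python
import math

def maxent_dist(m, f, F, lo=-50.0, hi=50.0, tol=1e-12):
    """Maximum-entropy distribution p_i proportional to m_i * exp(-lam * f_i),
    under prior prejudice m and the linear constraint sum_i p_i * f_i = F.
    The multiplier lam is found by bisection on the constraint residual."""
    def mean(lam):
        # Constrained expectation as a function of the multiplier
        w = [mi * math.exp(-lam * fi) for mi, fi in zip(m, f)]
        z = sum(w)  # partition function
        return sum(wi * fi for wi, fi in zip(w, f)) / z

    # mean(lam) decreases monotonically in lam, so bisect for mean(lam) = F
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean(mid) > F:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [mi * math.exp(-lam * fi) for mi, fi in zip(m, f)]
    z = sum(w)
    return [wi / z for wi in w]

# Classic illustration (Jaynes' "Brandeis dice"): a six-sided die with a
# uniform prior, constrained to have mean 4.5 instead of 3.5.
p = maxent_dist([1.0] * 6, [1, 2, 3, 4, 5, 6], 4.5)
```

The returned weights follow the exponential form p_i = m_i exp(-lam f_i)/Z, so the higher constrained mean shifts probability smoothly toward the larger faces rather than concentrating it arbitrarily.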
Jaynes' maximum-entropy principle
International Tables for Crystallography (2012). Vol. F, Section 16.2.2.3, p. 434 [ doi:10.1107/97809553602060000851 ]
Jaynes' maximum-entropy principle 16.2.2.3. Jaynes' maximum-entropy principle From the fundamental theorems just stated, which may be recognized as Gibbs' argument in a different guise, Jaynes' own maximum-entropy argument proceeds with striking lucidity and constructive simplicity, along the following lines: (1) experimental observation of, or `data acquisition' on, a ...
The meaning of entropy: Shannon's theorems
International Tables for Crystallography (2012). Vol. F, Section 16.2.2.2, pp. 433-434 [ doi:10.1107/97809553602060000851 ]
The meaning of entropy: Shannon's theorems 16.2.2.2. The meaning of entropy: Shannon's theorems Two important theorems [Shannon & Weaver (1949), Appendix 3] provide a more intuitive grasp of the meaning and importance of entropy: (1) H is approximately the logarithm of the reciprocal probability of a typical long message, divided ...
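The first Shannon theorem quoted above says that for a long typical message of length N, the probability of the message is roughly 2^(-NH), i.e. -(1/N) log2 P(message) ≈ H. A minimal numerical illustration of this, assuming an i.i.d. binary source (the function names and the particular distribution are illustrative, not from the source):

```python
import math
import random

def entropy(p):
    """Shannon entropy H in bits per symbol of a discrete distribution."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def bits_per_symbol(msg, p, alphabet):
    """-(1/N) * log2 P(msg) for an i.i.d. source with distribution p."""
    idx = {a: i for i, a in enumerate(alphabet)}
    logp = sum(math.log2(p[idx[s]]) for s in msg)
    return -logp / len(msg)

random.seed(0)
alphabet = "ab"
p = [0.9, 0.1]
msg = random.choices(alphabet, weights=p, k=100_000)

H = entropy(p)                          # about 0.469 bits/symbol
rate = bits_per_symbol(msg, p, alphabet)
# For a long typical message, rate is close to H, as the theorem states.
```

The agreement between `rate` and `H` tightens as the message length grows, which is exactly the "typical long message" qualifier in the theorem.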
Sources of random symbols and the notion of source entropy
International Tables for Crystallography (2012). Vol. F, Section 16.2.2.1, p. 433 [ doi:10.1107/97809553602060000851 ]
Sources of random symbols and the notion of source entropy 16.2.2.1. Sources of random symbols and the notion of source entropy Statistical communication theory uses as its basic modelling device a discrete source of random symbols, which at discrete times , randomly emits a `symbol' taken out of a finite alphabet . ...
The maximum-entropy principle in a general context
International Tables for Crystallography (2012). Vol. F, Section 16.2.2, pp. 433-435 [ doi:10.1107/97809553602060000851 ]
The maximum-entropy principle in a general context 16.2.2. The maximum-entropy principle in a general context 16.2.2.1. Sources of random symbols and the notion of source entropy Statistical communication theory uses as its basic modelling device a discrete source of random symbols, which at discrete times , randomly emits a `symbol ...