Information Processing Volume II: The Maximum Entropy Principle
by David J. Blower
• Publisher: Third Millennium Inferencing
• Year: 2013
• ISBN: 9781482359510 (Paperback)
• 581 pp
Description
How does an Information Processor assign legitimate numerical values to probabilities? One very powerful method to achieve this goal is through the Maximum Entropy Principle. Let a model insert information into a probability distribution by specifying constraint functions and their averages. Then, maximize the amount of missing information that remains after taking this step. The quantitative measure of the amount of missing information is Shannon's information entropy. Examples are given showing how the Maximum Entropy Principle assigns numerical values to the probabilities in coin tossing, dice rolling, statistical mechanics, and other inferential scenarios. The Maximum Entropy Principle also eliminates the mystery as to the origin of the mathematical expressions underlying all probability distributions. The MEP derivation for the Gaussian and generalized Cauchy distributions is shown in detail. The MEP is also related to Fisher information and the Kullback-Leibler measure of relative entropy. The initial examples shown are a prelude to a more in-depth discussion of Information Geometry.
Related Topics
Applied Mathematics
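The dice-rolling example mentioned in the description can be sketched numerically. In the classic version of the problem, the only information supplied is the average roll of a six-sided die; the Maximum Entropy Principle then assigns probabilities of exponential-family form p_k ∝ exp(λk), with the multiplier λ chosen so the constrained average is met. The bisection solver and the chosen mean of 4.5 below are assumptions of this sketch, not material from the book itself.

```python
# Maximum Entropy sketch for the dice problem: given only that the
# average roll of a six-sided die equals mu, maximize Shannon entropy
# subject to that constraint. The maximizer is p_k ∝ exp(lam * k);
# lam is found by bisection, since the mean is monotonic in lam.
import math

FACES = range(1, 7)

def maxent_probs(lam):
    """Probabilities p_k proportional to exp(lam * k) over the six faces."""
    weights = [math.exp(lam * k) for k in FACES]
    z = sum(weights)  # partition function (normalizing constant)
    return [w / z for w in weights]

def mean(probs):
    """Expected value of the roll under the given distribution."""
    return sum(k * p for k, p in zip(FACES, probs))

def solve_lambda(mu, lo=-10.0, hi=10.0, tol=1e-12):
    """Bisection for the multiplier: mean(lam) increases monotonically."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean(maxent_probs(mid)) < mu:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

lam = solve_lambda(4.5)   # constrain the average roll to 4.5 (hypothetical datum)
probs = maxent_probs(lam)
entropy = -sum(p * math.log(p) for p in probs)
print(probs, entropy)
```

Because the constrained mean (4.5) exceeds the uniform mean (3.5), the multiplier comes out positive and the probabilities increase with the face value; the resulting entropy is necessarily below log 6, the entropy of the unconstrained uniform assignment.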