A farewell to entropy : statistical thermodynamics based on information : S=logW /

Bibliographic Details
Author / Creator: Ben-Naim, Arieh, 1934-
Imprint: Hackensack, N.J. : World Scientific, c2008.
Description: xxv, 384 p. : ill. (some col.) ; 23 cm.
Language: English
Subject: Entropy.
Second law of thermodynamics.
Statistical thermodynamics.
Information theory in physics.
Format: Print Book
URL for this record: http://pi.lib.uchicago.edu/1001/cat/bib/7198590
Varying Form of Title: S=logW
ISBN: 9812707069
9789812707062
9812707077 (pbk.)
9789812707079 (pbk.)
Notes: Includes bibliographical references (p. 373-379) and index.
Table of Contents:
  • Preface
  • List of Abbreviations
  • 1. Introduction
  • 1.1. A Brief History of Temperature and Entropy
  • 1.2. The Association of Entropy with Disorder
  • 1.3. The Association of Entropy with Missing Information
  • 2. Elements of Probability Theory
  • 2.1. Introduction
  • 2.2. The Axiomatic Approach
  • 2.2.1. The sample space, denoted Ω
  • 2.2.2. The field of events, denoted F
  • 2.2.3. The probability function, denoted P
  • 2.3. The Classical Definition
  • 2.4. The Relative Frequency Definition
  • 2.5. Independent Events and Conditional Probability
  • 2.5.1. Conditional probability and subjective probability
  • 2.5.2. Conditional probability and cause and effect
  • 2.5.3. Conditional probability and probability of joint events
  • 2.6. Bayes' Theorem
  • 2.6.1. A challenging problem
  • 2.6.2. A more challenging problem: The three prisoners' problem
  • 2.7. Random Variables, Average, Variance and Correlation
  • 2.8. Some Specific Distributions
  • 2.8.1. The binomial distribution
  • 2.8.2. The normal distribution
  • 2.8.3. The Poisson distribution
  • 2.9. Generating Functions
  • 2.10. The Law of Large Numbers
  • 3. Elements of Information Theory
  • 3.1. A Qualitative Introduction to Information Theory
  • 3.2. Definition of Shannon's Information and Its Properties
  • 3.2.1. Properties of the function H for the simplest case of two outcomes
  • 3.2.2. Properties of H for the general case of n outcomes
  • 3.2.3. The consistency property of the missing information (MI)
  • 3.2.4. The case of an infinite number of outcomes
  • 3.3. The Various Interpretations of the Quantity H
  • 3.4. The Assignment of Probabilities by the Maximum Uncertainty Principle
  • 3.5. The Missing Information and the Average Number of Binary Questions Needed to Acquire It
  • 3.6. The False Positive Problem, Revisited
  • 3.7. The Urn Problem, Revisited
  • 4. Transition from the General MI to the Thermodynamic MI
  • 4.1. MI in Binding Systems: One Kind of Information
  • 4.1.1. One ligand on M sites
  • 4.1.2. Two different ligands on M sites
  • 4.1.3. Two identical ligands on M sites
  • 4.1.4. Generalization to N ligands on M sites
  • 4.2. Some Simple Processes in Binding Systems
  • 4.2.1. The analog of the expansion process
  • 4.2.2. A pure deassimilation process
  • 4.2.3. Mixing process in a binding system
  • 4.2.4. The dependence of MI on the characterization of the system
  • 4.3. MI in an Ideal Gas System: Two Kinds of Information. The Sackur-Tetrode Equation
  • 4.3.1. The locational MI
  • 4.3.2. The momentum MI
  • 4.3.3. Combining the locational and the momentum MI
  • 4.4. Comments
  • 5. The Structure of the Foundations of Statistical Thermodynamics
  • 5.1. The Isolated System; The Micro-Canonical Ensemble
  • 5.2. System in a Constant Temperature; The Canonical Ensemble
  • 5.3. The Classical Analog of the Canonical Partition Function
  • 5.4. The Re-interpretation of the Sackur-Tetrode Expression from Informational Considerations
  • 5.5. Identifying the Parameter β for an Ideal Gas
  • 5.6. Systems at Constant Temperature and Chemical Potential; The Grand Canonical Ensemble
  • 5.7. Systems at Constant Temperature and Pressure; The Isothermal Isobaric Ensemble
  • 5.8. The Mutual Information due to Intermolecular Interactions
  • 6. Some Simple Applications
  • 6.1. Expansion of an Ideal Gas
  • 6.2. Pure, Reversible Mixing; The First Illusion
  • 6.3. Pure Assimilation Process; The Second Illusion
  • 6.3.1. Fermi-Dirac (FD) statistics; Fermions
  • 6.3.2. Bose-Einstein (BE) statistics; Bosons
  • 6.3.3. Maxwell-Boltzmann (MB) statistics
  • 6.4. Irreversible Process of Mixing Coupled with Expansion
  • 6.5. Irreversible Process of Demixing Coupled with Expansion
  • 6.6. Reversible Assimilation Coupled with Expansion
  • 6.7. Reflections on the Processes of Mixing and Assimilation
  • 6.8. A Pure Spontaneous Deassimilation Process
  • 6.9. A Process Involving only Change in the Momentum Distribution
  • 6.10. A Process Involving Change in the Intermolecular Interaction Energy
  • 6.11. Some Baffling Experiments
  • 6.12. The Second Law of Thermodynamics
  • Appendices
  • A. Newton's binomial theorem and some useful identities involving binomial coefficients
  • B. The total number of states in the Fermi-Dirac and the Bose-Einstein statistics
  • C. Pair and triplet independence between events
  • D. Proof of the inequality |R(X, Y)| ≤ 1 for the correlation coefficient
  • E. The Stirling approximation
  • F. Proof of the form of the function H
  • G. The method of Lagrange undetermined multipliers
  • H. Some inequalities for concave functions
  • I. The MI for the continuous case
  • J. Identical and indistinguishable (ID) particles
  • K. The equivalence of the Boltzmann's and Jaynes' procedures to obtain the fundamental distribution of the canonical ensemble
  • L. An alternative derivation of the Sackur-Tetrode equation
  • M. Labeling and un-labeling of particles
  • N. Replacing a sum by its maximal term
  • O. The Gibbs paradox (GP)
  • P. The solution to the three prisoners' problem
  • References
  • Index