Figure Acknowledgements

The following figures are gratefully used with permission: 22.6, 22.8b, 28.9, 29.3, 29.4, 29.6a, 29.6b, 29.9, 29.10, 29.16 and 32.4 American Association for the Advancement of Science. 10.5, 11.14, 19.6, 22.12, 22.13, 25.10, 25.26, 26.5, 28.18a, 28.18b, 32.7 and 32.9 Academic Press. 9.19, 9.4, 1.7, 28.1, 30.1, 25.20, 26.14, 29.11, 30.5, 31.1, 31.17, 33.7, 33.10 and 33.12 American Chemical Society. 25.17 and 11.1 W.A. Benjamin. 30.6 American Institute of Physics. 14.8 Butterworth-Heinemann. 2.7a and 11.13 Cambridge University Press. 8.12, 18.8, 25.14, 26.7, 32.4 and 32.12 Oxford University Press, Inc. 11.16 Cold Spring Harbor Laboratory Press. 1.16 Columbia University Press. 4.1 and 31.3 Cornell University Press. 10.18 Chemical Rubber Company. 9.5 Deutsche Bunsen-Gesellschaft für Physikalische Chemie. 13.3, 10.7, 23.3, and 29.13 Dover Publications. 26.17a and 32.10 Elsevier. 18.9, 18.10, 25.28a, 26.15, 27.6 and 27.16 Springer-Verlag. 10.10 and 13.5 McGraw-Hill. 18.5, 22.8a, 27.11 and 27.12 MIT Press. 26.17b American Physical Society. 14.9 Portland Press. 10.6 and 25.9 Pearson Education, Inc. 29.18, 18.12 and 18.13 Princeton University Press. 25.28b and 29.8 The Royal Society of Chemistry. 28.2 and 28.13 University Science Books. 28.24 and 32.6 W.H. Freeman. 1.2, 16.4, 18.16, 19.5, 21.7, 23.6, 24.8, 25.2, 26.1, 26.3, 27.4, 27.18, 30.2, 31.2 and 31.8 John Wiley & Sons. 19.2 and 19.3 Reinhold Publishing Corp. 9.6 Munksgaard. 25.4, 26.9, 27.5 and 30.7 National Academy of Sciences, USA. 16.3 and 29.14 Plenum Press. 28.7 and 28.19a American Society for Biochemistry & Molecular Biology.

Table Acknowledgements

The following tables are gratefully used with permission: 18.2, 18.3 American Chemical Society. 30.2 and 30.3 Dover Publications. 26.1 Oxford University Press. 11.3, 13.1 and 16.1 Pearson Education. 8.2 Prentice-Hall. 29.1 The Royal Society of Chemistry. 9.1, 11.2, 11.4, 14.2, 18.1 and 25.6 John Wiley & Sons. 11.1 and 18.4 W.W. Norton & Company, Inc.
30.1 and 30.7 National Academy of Sciences, USA. 22.8 Sinauer.

© 2003 by Ken A. Dill, Sarina Bromberg, Dirk Stigter. All rights reserved. No part of this book covered by the copyright hereon may be reproduced or used in any format in any form or by any means—graphic, electronic, or mechanical, including photocopying, recording, taping, or information storage and retrieval systems—without permission of the publisher.

Library of Congress Cataloging-in-Publication Data

Dill, Ken A.
  Molecular driving forces: statistical thermodynamics in chemistry and biology / Ken A. Dill, Sarina Bromberg.
    p. cm.
  Includes bibliographical references and index.
  ISBN 0-8153-205
  1. Statistical thermodynamics. I. Bromberg, Sarina. II. Title.
  QC311.5 2002 540.7 dc21 2001053202

Published by Garland Science, a member of the Taylor & Francis Group
29 West 35th Street, New York, NY 10001-2299, USA
11 New Fetter Lane, London EC4P 4EE, UK

Printed in the United States of America

About the Authors

Ken A. Dill is Professor of Pharmaceutical Chemistry and Biophysics at the University of California, San Francisco. He received his undergraduate training at the Massachusetts Institute of Technology, his PhD from the University of California, San Diego, and did postdoctoral work at Stanford. A researcher in biopolymer statistical mechanics and protein folding, he has been the President of the Biophysical Society and received the Hans Neurath Award from the Protein Society in 1998.

Sarina Bromberg received her BFA at the Cooper Union for the Advancement of Science and Art, her PhD in molecular biophysics from Wesleyan University, and her postdoctoral training at the University of California, San Francisco. She writes, edits, and illustrates scientific textbooks.

Contents

Preface
Acknowledgements

1 Principles of Probability
  The Principles of Probability Are the Foundations of Entropy
  What Is Probability?
  Rules of Probability
  Correlated Events/Conditional Probabilities
  Combinatorics
  Distribution Functions
  Averages, Standard Deviations
  Summary
  Problems
  Suggested Reading

2 Extremum Principles Predict Equilibria
  What Are Extremum Principles?
  What Is a State of Equilibrium?
  Maximizing Multiplicity
  Simple Models
  Summary
  Problems
  Suggested Reading

3 Heat, Work & Energy
  Heat Flows to Maximize Entropy
  Conservation Laws
  Heat Was Thought to Be a Fluid
  Atoms and Molecules Have Energies
  Why Does Heat Flow?
  Summary
  Problems
  Suggested Reading

4 Math Tools: Series and Approximations
  Physical Modelling Involves Series Expansions
  Making Approximations Involves Truncating Series
  Gaussian Distribution/Random Walk
  Summary
  Problems
  Suggested Reading

5 Multivariate Calculus
  Functions of Multiple Variables
  Partial Derivatives
  Extrema of Multivariate Functions
  Integrating Multivariate Functions: the Chain Rule
  Rearranging Dependent and Independent Variables
  Summary
  Problems
  Suggested Reading

6 Entropy & the Boltzmann Distribution Law
  What Is Entropy?
  Flat Distributions if There Are No Constraints
  Exponential Distributions if There Are Constraints
  Principle of Fair Apportionment
  Philosophical Foundations
  Summary
  Problems
  Suggested Reading

7 Thermodynamic Driving Forces
  Thermodynamics Is Two Laws
  The Fundamental Thermodynamic Equations
  Defining the Thermodynamic Driving Forces
  Homogeneous Functions
  Thermal, Mechanical, and Chemical Equilibria
  Thermodynamic Logic
  The First Law Interrelates Heat, Work, and Energy
  Why Is There an Absolute Temperature Scale?
  Other Statements of the Second Law
  Summary
  Problems
  Suggested Reading

8 Free Energies
  Switching from Entropy to Free Energy
  Free Energy Defines Another Extremum Principle
  Using the Heat Capacity
  Using Thermodynamic Cycles
  Summary
  Problems
  Suggested Reading
9 Maxwell's Relations & Mixtures
  Predicting Unmeasurable Quantities
  Maxwell's Relations Interrelate Partial Derivatives
  Multicomponent Systems/Partial Molar Quantities
  Linkage Relations
  Summary
  Problems
  Suggested Reading

10 The Boltzmann Distribution Law
  Probability Distributions for Atoms and Molecules
  The Boltzmann Law Describes Equilibria
  What Does a Partition Function Tell You?
  Thermodynamic Properties from Partition Functions
  What Is an Ensemble?
  Summary
  Problems
  Suggested Reading

11 Statistical Mechanics of Simple Gases and Solids
  Macroscopic Properties from Atomic Structures
  Translational Motion
  Harmonic Oscillator Model
  Rigid Rotor Model
  Ideal Gas Properties
  The Equipartition Theorem
  Summary
  Problems
  Suggested Reading

12 Temperature, Heat Capacity
  A Microscopic Perspective
  A Graphical Procedure, from S(U)
  What Drives Heat Exchange?
  The Heat Capacity Reflects Energy Fluctuations
  Summary
  Problems
  Suggested Reading

13 Chemical Equilibria
  Chemical Equilibria from Atomic Structures
  Le Chatelier's Principle
  Temperature Dependence of Equilibrium
  Summary
  Problems
  Suggested Reading

14 Equilibria Between Liquids, Solids, and Gases
  Phase Equilibria
  The Clapeyron Equation
  How Do Refrigerators and Heat Pumps Work?
  Surface Tension
  Summary
  Problems
  Suggested Reading

15 Solutions and Mixtures
  A Lattice Model Describes Mixtures
  Interfacial Tension
  What Have We Left Out?
  Summary
  Problems
  Suggested Reading

16 Solvation and Transfers of Molecules Between Phases
  The Chemical Potential
  Solvation
  Activity and Activity Coefficient
  Boiling Point Elevation
  Freezing Point Depression
  Osmotic Pressure
  Solutes Can Transfer and Partition
  Dimerization in Solution
  Summary
  Problems

17 Vector Calculus
  Vectors Describe Forces and Flows
  Vectors Add and Subtract by Components
  The Dot Product
  Scalar and Vector Fields
  The Flux of a Vector Field
  Gauss's Theorem
  Summary
  Problems

18 Physical Kinetics
  Forces Drive Molecules to Flow
  Linear Laws Relate Forces to Flows
  The Diffusion Equation
  Sources and Sinks: Examples from Population Biology
  Additional Forces
  The Einstein-Smoluchowski Equation
  Brownian Ratchets
  The Fluctuation-Dissipation Theorem
  Onsager Reciprocal Relations Describe Coupled Flows
  Summary
  Problems
  Suggested Reading

19 Chemical Kinetics & Transition States
  Rates Are Proportional to Concentrations
  At Equilibrium, Rates Obey Detailed Balance
  Mass Action Laws Describe Mechanisms
  Reaction Rates Depend on Temperature
  Activated Processes and Transition State Theory
  Catalysts Speed Up Chemical Reactions
  The Bronsted Law
  Funnel Landscapes and Diffusional Processes
  Summary
  Problems
  Suggested Reading

20 Coulomb's Law
  Charges and Coulomb's Law
  Charge Interactions Are Long-Ranged
  Charge Interactions Are Weaker in Media: Dielectric Constants
  Electrostatic Forces Add Like Vectors
  What Is an Electrostatic Field?
  Electric Fields Have Fluxes
  Summary
  Problems
  Suggested Reading

21 The Electrostatic Potential
  Electrostatic Potentials and Electrostatic Fields
  Dipoles Are Separated Charges
  The Poisson Equation
  The Method of Image Charges
  Summary
  Problems
  Suggested Reading

22 Electrochemical Equilibria
  Electrochemical Potentials in Ionic Solutions
  The Nernst Equation
  Voltage-Gated Ion Channels
  Acid-Base Equilibria Are Shifted by Electrostatic Fields
  Electrostatic Gradients Cause Ion Flows
  Creating a Charge Distribution Costs Free Energy
  Summary
  Problems
  Suggested Reading

23 Salt Ions Shield Charged Objects
  Salts Dissociate and Shield Other Charges
  Strong and Weak Electrolytes
  Summary
  Problems
  Suggested Reading

24 Intermolecular Interactions
  Short-ranged Repulsions and Long-ranged Attractions
  Short-ranged Attractions Are Electrostatic
  The van der Waals Gas Model
  The Lattice Model Contact Energy
  Summary
  Problems
  Suggested Reading

25 Phase Transitions
  Two States Can Be Stable at the Same Time
  Liquids or Solids Mix at High Temperatures
  Phase Separations Are Driven to Lower the Free Energy
  The Spinodal Curve
  The Critical Point
  The Principles of Boiling
  Boiling a Liquid Mixture Involves Two Transitions
  Summary
  Problems
  Suggested Reading

26 Cooperativity
  Abrupt Transitions Occur in Many Different Systems
  Transitions and Critical Points Are Universal
  The Landau Model
  The Ising Model Describes Magnetization
  The Kinetics of Phase Transitions and Nucleation
  Summary
  Problems

27 Adsorption, Binding & Catalysis
  Binding and Adsorption Processes Are Saturable
  The Langmuir Model
  Binding and Saturation in Solution
  The Principle of Adsorption Chromatography
  The Michaelis-Menten Model
  Sabatier's Principle for Stabilizing Transition States
  Summary
  Problems
  Suggested Reading

28 Multi-site & Cooperative Ligand Binding
  Binding Polynomials
  The Two-site Model of Binding Cooperativity
  Binding Intermediate States
  Constructing Binding Polynomials from Rules of Probability
  Oxygen Binding to Hemoglobin
  Inhibitors
  The Model of McGhee and von Hippel
  Rates Can Often Be Treated by Using Binding Polynomials
  The Grand Canonical Ensemble
  Problems
  Suggested Reading

29 Water
  Water Is an Unusual Liquid
  Water Has Hydrogen-Bonded Structure
  Pure Water Has Anomalous Properties
  Summary
  Problems
  Suggested Reading

30 Water as a Solvent
  Oil and Water Don't Mix: The Hydrophobic Effect
  The Signature of Hydrophobicity: Its Temperature Dependence
  Water Is Structured Near Cavities and Planar Surfaces
  Alcohols Constrict the Volumes of Aqueous Mixtures
  Ions Can Make or Break Water Structure
  Ion Pairing Preferences
  Summary
  Problems
  Suggested Reading

31 Polymer Solutions
  Polymers Are Governed by Statistics
  Polymers Have Distributions of Conformations
  Polymer Solutions Differ from Small-Molecule Solutions
  The Flory-Huggins Model
  Nonideal Colligative Properties
  The Phase Behavior of Polymers
  Dilution Entropy Drives Solute Partitioning into Polymers
  The Flory Theorem
  Summary
  Problems

32 Polymer Elasticity & Collapse
  Polymeric Materials Are Elastic
  Random-flight Chains Are Gaussian
  Polymer Elasticity Follows Hooke's Law
  Elasticity of Rubbery Materials
  Polymer Collapse and Expansion
  Summary
  Problems
  Suggested Reading

33 Polymers Resist Confinement & Deformation
  Excluded Volume
  Chain Conformations Are Perturbed Near Surfaces
  Polymer Conformations by a Diffusion Equation Method
  Polymers Tend to Avoid Confined Spaces
  The Rouse-Zimm Model of Polymer Dynamics
  The Reptation Model
  Summary
  Problems
  Suggested Reading

Appendix A  Table of Constants
Appendix B  Table of Units
Appendix C  Useful Taylor Series Expansions
Appendix D  Useful Integrals
Appendix E  Multiples of Units, Their Names, and Symbols

Index

Preface

What forces drive atoms and molecules to bind, to adsorb, to dissolve, to permeate membranes, to undergo chemical reactions, and to undergo conformational changes? This is a textbook on statistical thermodynamics. It describes the forces that govern molecular behavior. Statistical thermodynamics uses physical models, mathematical approximations, and empirical laws that are rooted in the language of entropy, distribution function, energy, heat capacity, free energy, and partition function, to predict the behaviors of molecules in physical, chemical, and biological systems.

This text is intended for graduate students and advanced undergraduates in physical chemistry, biochemistry, bioengineering, polymer and materials science, pharmaceutical chemistry, chemical engineering, and environmental science.

We had three goals in mind as we wrote this book. First, we tried to make extensive connections with experiments and familiar contexts, to show the practical importance of this subject. We have included many applications in biology and polymer science, in addition to applications in more traditional areas of chemistry and physics. Second, we tried to make this book accessible to students with a variety of backgrounds. So, for example, we have included material on probabilities, approximations, partial derivatives, vector calculus, and on the historical basis of thermodynamics. Third, we strove to find a vantage point from which the concepts are revealed in their simplest and most comprehensible forms.
For this reason, we follow the axiomatic approach to thermodynamics developed by H.B. Callen, rather than the more traditional inductive approach; and the Maximum Entropy approach of Jaynes, Skilling, and Livesay, in preference to the Gibbs ensemble method. We have drawn from many excellent texts, particularly those by Callen, Hill, Atkins, Chandler, Kubo, Kittel and Kroemer, Carrington, Adkins, Weiss, Doi, Flory, and Berry, Rice and Ross.

Our focus here is on molecular driving forces, which overlaps with—but is not identical to—the subject of thermodynamics. While the power of thermodynamics is its generality, the power of statistical thermodynamics is the insight it gives into microscopic interactions through the enterprise of model making. A central theme of this book is that making models, even very simple ones, is a route to insight and to understanding how molecules work. A good theory, no matter how complex its mathematics, is usually rooted in some very simple physical idea.

Models are mental toys to guide our thinking. The most important ingredients in a good model are predictive power and insight into the causes of the predicted behavior. The more rigorous a model, the less room for ambiguity. But models don't need to be complicated to be useful. Many of the key insights in statistical mechanics have come from simplifications that may seem unrealistic at first glance: particles represented as perfect spheres with atomic detail left out, neglecting the presence of other particles, using crystal-like lattices of particles in liquids and polymers, modelling polymer chains as random flights, etc. To borrow a quote, statistical thermodynamics has a history of what might be called the unreasonable effectiveness of unrealistic simplifications. Perhaps the classic example is the two-dimensional Ising model of magnets as two types of arrows, up spins or down spins, on square lattices.
Lars Onsager’ famous solution to this highly simplified model was a major contribution to the modern revolution in our understanding of phase transitions and critical phenomena We begin with entropy, Chapter 1 gives the underpinnings in terms of probabilities and combinatorics. Simple models are used in chapters 2 and 3 to show how entropy is a driving lorce, This motivates more detailed treat ments throughout the text illustrating the Second Law of thermodynamics and the concept of equilibrium. Chapters 1, 4, and 5 lay out the mathemat ical foundations— probability, approximations, multivariate caleulus—that arc needed for the following chapters ‘These threads culminate in chapter 6, which defines the entropy and gives the Boltzmann distribution lav, the Iynch-pin of statistical thermodynamics. ‘The key expressions, § = kIni and § = —kS piinp,, are often regarded in physical chemistry Lets as given, but here we provide optional material in which we derive these expressions from a principle of fair apportionment based on treatments by Jaynes, Skilling, Livesay; and others, ‘The principles of thermodynamics are described in chapters 7-9. The sta- listical mechanics of simple systems follows in chapters 10 and 1. While Iomperature aad heat capacity are often regarded as needing no explanation (perhaps because they are so readily measured), our chapter 12 uses simple models to shed! light on the physical basis of those properties. Chapter 13 applies the principles of statistical thermodynamics to chemical equilibria. Chapters 14—16 develop simple models of liquids and solutions. We use lattice models here, rather than ideal solution theories, because such mod- ls give more microscopic insight into real molecules and into the solvation processes that are central to computational chemistry, biology, and materials science, For example, theories of mixtures often begin from the premise that Raoult’s and Henry's kes are experimental facts. 
Our approach, instead, is to show why molecules are driven to obey these laws. An equally important reason for introducing lattice models here is as background. Lattice models are standard tools for treating complex systems: phase transitions and critical phenomena in Chapters 25 and 26, and polymer conformations in Chapters 31-33.

We explore the dynamic processes of diffusion, transport, and physical and chemical kinetics in Chapters 18 and 19 through the random-flight model, the Langevin model, Onsager relations, time correlation functions, and transition state theory.

We treat electrostatics in Chapters 20-23. Our treatment is more extensive than in other physical chemistry texts because of the importance, in our view, of electrostatics in understanding the structures of proteins, nucleic acids, micelles, and membranes; for predicting protein- and nucleic acid-ligand interactions and the behaviors of ion channels; as well as for the classical areas of electrochemistry and colloid science. We develop the Nernst and Poisson-Boltzmann equations and the Born model, modern workhorses of quantitative biology. Chapter 24 describes intermolecular forces.

We describe simple models of complex systems, including polymers, colloids, surfaces, and catalysts. Chapters 25 and 26 focus on cooperativity: phase equilibria, solubilities, critical phenomena, and conformational transitions, described through mean-field theories, the Ising model, the helix-coil model, and Landau theory. Chapters 27 and 28 describe binding polynomials, essential to modern pharmaceutical science. Chapters 29 and 30 describe water, the hydrophobic effect, and ion solvation.
And Chapters 31-33 focus on the conformations of polymers and biomolecules that give rise to the elasticity of rubber, the viscoelasticities of solutions, the immiscibilities of polymers, reptational motion, and the folding of proteins and RNA molecules.

Acknowledgements

We owe a great debt to Brad Anderson, Victor Bloomfield, Robert Cantor, Hue Sun Chan, John Chodera, Margaret Daugherty, John Van Drie, Roland Dunbrack, Burak Erman, Tony Haymet, Peter Kollman, I.D. Kuntz, Michael Laskowski, Alenka Luzar, Andy McCammon, Chris Miller, Terry Oas and Eric Toone and their students, Rob Phillips, Miranda Robertson, Kim Sharp, Krista Shipley, Noel Southall, Vojko Vlachy, Peter von Hippel, Hong Qian, Ben Widom, and Bruno Zimm for very helpful comments on this text. We owe special thanks to Jan W.H. Schreurs, Eugene Stanley, and John Schellman for careful reading and detailed criticism of large parts of various drafts. We are grateful to Richard Shafer and Ron Siegel, who in co-teaching this course have contributed considerable improvements, and to the UCSF graduate students of this course over the past several years, who have helped form this material and correct mistakes. We thank Claudia Johnson, who created the original course notes manuscript with the good spirits and patience of a saint. We are deeply grateful to the talented people who worked so hard on book production: Danny Heap and Patricia Monohon, who keyboarded, coded, extended, and converted the manuscript into this book format; Jolanda Schreurs, who worked tirelessly on the graphics; and Emma Hunt, Matthew Day, and Denise Schanck, who patiently and professionally saw us through to bound books.
The preparation of this text was partially funded by a grant from the Polymer Education Committee of the Divisions of Polymer Chemistry and Polymeric Materials of the American Chemical Society.

1  Principles of Probability

The Principles of Probability Are the Foundations of Entropy

Fluids flow, boil, freeze, and evaporate. Solids melt and deform. Oil and water don't mix. Metals and semiconductors conduct electricity. Crystals grow. Chemicals react and rearrange, take up heat and give it off. Rubber stretches and retracts. Proteins catalyze biological reactions. What forces drive these processes? This question is addressed by statistical thermodynamics, a set of tools for modeling molecular forces and behavior, and a language for interpreting experiments.

The challenge in understanding these behaviors is that the properties that can be measured and controlled, such as density, temperature, pressure, heat capacity, molecular radius, or equilibrium constants, do not predict the tendencies and equilibria of systems in a simple and direct way. To predict equilibria, we must step into a different world, where we use the language of energy, entropy, enthalpy, and free energy. Measuring the density of liquid water just below its boiling temperature does not hint at the surprise that just a few degrees higher, above the boiling temperature, the density suddenly drops more than a thousandfold. To predict density changes and other measurable properties, you need to know about the driving forces, the entropies and energies. We begin with entropy.

Entropy is one of the most fundamental concepts in statistical thermodynamics. It describes the tendency of matter toward disorder. The concepts that we introduce in this chapter, probability, multiplicity, combinatorics, averages, and distribution functions, provide a foundation for describing entropy.

What Is Probability?

Here are two statements of probability.
In 1990, the probability that a person in the United States was a scientist or an engineer was 1/250. That is, there were about a million scientists and engineers out of a total of about 250 million people. In 1992, the probability that a child under 13 years old in the United States ate a fast-food hamburger on any given day was 1/30 [1].

Let's generalize. Suppose that the possible outcomes or events fall into categories A, B, or C. 'Event' and 'outcome' are generic terms. An event might be the flipping of a coin, resulting in heads or tails. Alternatively it might be one of the possible conformations of a molecule. Suppose that outcome A occurs 20% of the time, B 50% of the time, and C 30% of the time. Then the probability of A is 0.20, the probability of B is 0.50, and the probability of C is 0.30.

The definition of probability is: If N is the total number of possible outcomes, and n_A of the outcomes fall into category A, then p_A, the probability of outcome A, is

    p_A = n_A / N.    (1.1)

Probabilities are quantities in the range from zero to one. If only one outcome is possible, the process is deterministic: that outcome has a probability of one.

Probabilities can be computed for different combinations of events. Consider one roll of a six-sided die, for example (die, unfortunately, is the singular of dice). The probability that a 4 appears face up is 1/6 because there are N = 6 possible outcomes and only n_4 = 1 of them is a 4. Now suppose you roll the die three times. You might ask for the probability that you observe the sequence of two 3's followed by one 4. Or you might ask for the probability of rolling two 3's and one 4 in any order. The rules of probability and combinatorics provide the machinery for calculating such probabilities. Here we define the relationships among events that those rules require.

Definitions: Relationships Among Events

MUTUALLY EXCLUSIVE. Outcomes A_1, A_2, ..., A_t are mutually exclusive if the occurrence of each one of them precludes the occurrence of all the others. If A and B are mutually exclusive, then if A occurs, B does not. If B occurs, A does not. For example, on a single die roll, 1 and 3 are mutually exclusive because only one number can appear face up each time the die is rolled.
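The definition p_A = n_A / N is just counting: enumerate the N possible outcomes and count how many fall in category A. A minimal sketch in Python (the `probability` helper and the event tests are illustrative, not from the text):

```python
from fractions import Fraction

def probability(outcomes, event):
    """p_A = n_A / N: the fraction of the N possible outcomes satisfying `event`."""
    outcomes = list(outcomes)
    n_A = sum(1 for o in outcomes if event(o))
    return Fraction(n_A, len(outcomes))

die = range(1, 7)                               # N = 6 equally likely outcomes
p_four = probability(die, lambda o: o == 4)     # n_4 = 1, so p = 1/6
print(p_four)                                   # 1/6
```

Using exact `Fraction` arithmetic keeps results like 1/6 and 1/3 exact rather than rounded floats.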
COLLECTIVELY EXHAUSTIVE. Outcomes A_1, A_2, ..., A_t are collectively exhaustive if they constitute the entire set of possibilities, and no other outcomes are possible. For example, [heads, tails] is a collectively exhaustive set of outcomes for a coin toss, provided that you don't count the occasions when the coin lands on its edge.

Figure 1.1  (3 colors × 2 models = 6 combinations) If there are three car colors for each of two car models, there are six different combinations of color and model, so the multiplicity is six.

INDEPENDENT. Events A_1, A_2, ..., A_t are independent if the outcome of each one is unrelated to (or not correlated with) the outcome of any other. The score on one die roll is independent of the score on the next, unless there is trickery.

MULTIPLICITY. The multiplicity of events is the total number of ways in which different outcomes can possibly occur. If the number of outcomes of type A is n_A, the number of outcomes of type B is n_B, and the number of outcomes of type C is n_C, the total number of possible combinations of outcomes is the multiplicity W:

    W = n_A n_B n_C.    (1.2)

Figure 1.1 shows an example of multiplicity.

The Rules of Probability Are Recipes for Drawing Consistent Inferences

The addition and multiplication rules permit you to calculate the probabilities of certain combinations of events.

ADDITION RULE. If outcomes A, B, ..., E are mutually exclusive, and occur with probabilities p_A = n_A/N, p_B = n_B/N, ..., p_E = n_E/N, then the probability of observing either A OR B OR ... OR E (the union of outcomes, expressed as A ∪ B ∪ ... ∪ E) is the sum of the probabilities:

    p(A OR B OR ... OR E) = (n_A + n_B + ... + n_E) / N = p_A + p_B + ... + p_E.    (1.3)

The addition rule holds only if two criteria are met: the outcomes are mutually exclusive, and we seek the probability of one outcome OR another outcome. When they are not divided by N, the broader term for the quantities n_i (i = A, B, ..., E) is statistical weights.
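Both the multiplicity W = n_A n_B n_C and the addition rule can be checked by direct enumeration. A sketch of the car example of Figure 1.1 (the specific color and model names are invented for illustration):

```python
from itertools import product

colors = ["red", "green", "blue"]    # n_A = 3 outcomes of one type
models = ["sedan", "coupe"]          # n_B = 2 outcomes of another type

# Multiplicity: W is the product of the outcome counts.
combinations = list(product(colors, models))
W = len(combinations)
print(W)                             # 6 = 3 * 2

# Addition rule: for mutually exclusive outcomes, probabilities add.
# Over the six equally likely (color, model) pairs:
p_red = sum(1 for c, m in combinations if c == "red") / W
p_green = sum(1 for c, m in combinations if c == "green") / W
p_red_or_green = sum(1 for c, m in combinations if c in ("red", "green")) / W
assert abs(p_red_or_green - (p_red + p_green)) < 1e-12   # 1/3 + 1/3 = 2/3
```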
If outcomes A, B, ..., E are both collectively exhaustive and mutually exclusive, then

    n_A + n_B + ... + n_E = N,    (1.4)

and dividing both sides of Equation (1.4) by N, the total number of trials, gives

    p_A + p_B + ... + p_E = 1.    (1.5)

MULTIPLICATION RULE. If outcomes A, B, ..., E are independent, then the probability of observing A AND B AND ... AND E (the intersection of outcomes, expressed as A ∩ B ∩ ... ∩ E) is the product of the probabilities:

    p(A AND B AND ... AND E) = (n_A/N)(n_B/N)...(n_E/N) = p_A p_B ... p_E.    (1.6)

The multiplication rule applies when the outcomes are independent and we seek the probability of one outcome AND another outcome AND possibly other outcomes. A more general multiplication rule, described on page 7, applies even when outcomes are not independent.

Here are a few examples using the addition and multiplication rules.

EXAMPLE 1.1 Rolling a die. What is the probability that either a 1 or a 4 appears on a single roll of a die? The probability of a 1 is 1/6. The probability of a 4 is also 1/6. The probability of either a 1 OR a 4 is 1/6 + 1/6 = 1/3, because the outcomes are mutually exclusive (1 and 4 can't occur on the same roll) and the question is of the OR type.

EXAMPLE 1.2 Rolling twice. What is the probability of a 1 on the first roll of a die AND a 4 on the second? It is (1/6)(1/6) = 1/36, because this is an AND question, and the two events are independent. This probability can also be computed in terms of the multiplicity. There are six possible outcomes on each of the two rolls of the die, giving a product of W = 36 possible combinations, one of which is 1 on the first roll and 4 on the second.

EXAMPLE 1.3 A sequence of coin flips. What is the probability of getting five heads on five successive flips of an unbiased coin? It is (1/2)^5 = 1/32, because the coin flips are independent of each other, this is an AND question, and the probability of heads on each flip is 1/2.
In terms of the multiplicity of outcomes, there are two possible outcomes on each flip, giving a product of W = 32 total outcomes, and only one of them is five successive heads.

EXAMPLE 1.4 Another sequence of coin flips. What is the probability of two heads, then one tail, then two more heads on five successive coin flips? It is p_H p_H p_T p_H p_H = (1/2)^5 = 1/32. You get the same result as in Example 1.3 because p_H, the probability of heads, and p_T, the probability of tails, are both 1/2. There are a total of W = 32 possible outcomes and only one is the given sequence. The probability p(n_H, N) of observing one particular sequence of N coin flips having exactly n_H heads is

    p(n_H, N) = p_H^(n_H) p_T^(N - n_H).    (1.7)

If p_H = p_T = 1/2, then p(n_H, N) = (1/2)^N.

EXAMPLE 1.5 Combining events—both, either/or, or neither. If independent events A and B have probabilities p_A and p_B, the probability that both events happen is p_A p_B. What is the probability that A happens AND B does not? The probability that B does not happen is (1 - p_B). If A and B are independent events, then the probability that A happens and B does not is p_A(1 - p_B) = p_A - p_A p_B. What is the probability that neither event happens? It is

    p(not A AND not B) = (1 - p_A)(1 - p_B),    (1.8)

where p(not A AND not B) is the probability that A does not happen AND B does not happen.

EXAMPLE 1.6 Combining events—something happens. What is the probability that something happens, that is, A OR B OR both happen? This is an OR question, but the events are independent and not mutually exclusive, so you cannot use either the addition or multiplication rules directly. You can use a simple trick instead. The trick is to consider the probabilities that events do not happen, rather than that events do happen. The probability that something happens is 1 - p(nothing happens):

    p(A OR B OR both) = 1 - p(not A AND not B) = 1 - (1 - p_A)(1 - p_B) = p_A + p_B - p_A p_B.    (1.9)
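Examples 1.3 through 1.6 can all be verified numerically: enumerate every sequence of flips, or plug numbers into the not-A-AND-not-B trick. A sketch (the values p_A = 0.2 and p_B = 0.7 are arbitrary illustrative choices, not from the text):

```python
from itertools import product

# Example 1.3: one particular 5-flip sequence among W = 2^5 equally likely ones.
sequences = list(product("HT", repeat=5))
assert len(sequences) == 32                      # multiplicity W = 2^5
p_five_heads = sum(1 for s in sequences if s == ("H",) * 5) / len(sequences)
print(p_five_heads)                              # 0.03125 = 1/32

# Example 1.6: p(A OR B OR both) = 1 - (1 - pA)(1 - pB) for independent events.
pA, pB = 0.2, 0.7                                # arbitrary independent events
p_something = 1 - (1 - pA) * (1 - pB)
# Agrees with the expanded form pA + pB - pA*pB:
assert abs(p_something - (pA + pB - pA * pB)) < 1e-12
```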
Multiple events can occur as ordered sequences in time, such as die rolls, or as ordered sequences in space, such as the strings of characters in words. Sometimes it is more useful to focus on collections of events rather than the individual events themselves.

Elementary and Composite Events

Some problems in probability cannot be solved directly by applying the addition or multiplication rules. Such questions can usually be reformulated in terms of composite events to which the rules of probability can be applied. Example 1.7 shows how to do this. Then on page 13 we'll use reformulation to construct probability distribution functions.

EXAMPLE 1.7 Elementary and composite events. What is the probability of a 1 on the first roll of a die OR a 4 on the second roll? If this were an AND question, the probability would be (1/6)(1/6) = 1/36, since the two rolls are independent, but the question is of the OR type, so it cannot be answered by direct application of either the addition or multiplication rules. But by redefining the problem in terms of composite events, you can use those rules. An individual coin toss, a single die roll, etc. could be called an elementary event. A composite event is just some set of elementary events, collected together in a convenient way. In this example it's convenient to define each composite event to be a pair of first and second rolls of the die. The advantage is that the complete list of composite events is mutually exclusive. That allows us to frame the problem in terms of an OR question and use the multiplication and addition rules. The composite events are:
    [1,1]*  [1,2]*  [1,3]*  [1,4]*  [1,5]*  [1,6]*
    [2,1]   [2,2]   [2,3]   [2,4]*  [2,5]   [2,6]
    [3,1]   [3,2]   [3,3]   [3,4]*  [3,5]   [3,6]
    [4,1]   [4,2]   [4,3]   [4,4]*  [4,5]   [4,6]
    [5,1]   [5,2]   [5,3]   [5,4]*  [5,5]   [5,6]
    [6,1]   [6,2]   [6,3]   [6,4]*  [6,5]   [6,6]

The first and second numbers in the brackets indicate the outcome of the first and second rolls respectively, and * indicates a composite event that satisfies the criterion for 'success' (1 on the first roll OR 4 on the second roll). There are 36 composite events, of which 11 are successful, so the probability we seek is 11/36.

Since many of the problems of interest in statistical thermodynamics involve huge systems (~10^23), we need a more systematic way to compute composite probabilities than enumerating them all.

To compute this probability systematically, collect the composite events into three mutually exclusive classes, A, B, and C, about which you can ask an OR question. Class A includes all composite events with a 1 on the first roll AND anything but a 4 on the second. Class B includes all events with anything but a 1 on the first roll AND a 4 on the second. Class C includes the one event in which we get a 1 on the first roll AND a 4 on the second. A, B, and C are mutually exclusive categories. This is an OR question, so add p_A, p_B, and p_C to find the answer:

    p(1 first OR 4 second) = p_A(1 first AND anything but 4 second)
                           + p_B(anything but 1 first AND 4 second)
                           + p_C(1 first AND 4 second).    (1.10)

The same probability rules that apply to elementary events also apply to composite events. Moreover, p_A, p_B, and p_C are each products of elementary event probabilities because the two rolls of the die are independent:

    p_A = (1/6)(5/6),    p_B = (5/6)(1/6),    p_C = (1/6)(1/6).

Add p_A, p_B, and p_C: p(1 first OR 4 second) = 5/36 + 5/36 + 1/36 = 11/36. This example shows how elementary events can be grouped together into composite events so as to take advantage of the addition and multiplication rules. Reformulation is powerful because virtually any question can be framed in terms of combinations of AND and OR operations. With these two rules of probability, you can draw inferences about a wide range of probabilistic events.

Two events can have a more complex relationship than we have considered so far. They are not restricted to being either independent or mutually exclusive. More broadly, events can be correlated.
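The 36 composite events of Example 1.7, and their decomposition into the mutually exclusive classes A, B, and C, can be enumerated directly; a sketch:

```python
from itertools import product
from fractions import Fraction

rolls = list(product(range(1, 7), repeat=2))    # 36 composite events (first, second)
N = len(rolls)

# Direct count of successes: 1 on the first roll OR 4 on the second.
successes = [(a, b) for a, b in rolls if a == 1 or b == 4]
print(Fraction(len(successes), N))              # 11/36

# The three mutually exclusive classes of the text:
pA = Fraction(sum(1 for a, b in rolls if a == 1 and b != 4), N)   # 5/36
pB = Fraction(sum(1 for a, b in rolls if a != 1 and b == 4), N)   # 5/36
pC = Fraction(sum(1 for a, b in rolls if a == 1 and b == 4), N)   # 1/36
assert pA + pB + pC == Fraction(11, 36)
```

The direct count and the class decomposition agree, as the addition rule requires, because A, B, and C partition the set of successful composite events.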
This ‘example shows how elementary events can be grouped together into composite events so as to take advantage of the addition and multiplication rules, Refor- mulation is powerful because virtually any question can be framed in terms of combinations of AND and OR operations. With these two rules of probability, you can draw inferences about a wide range of probabilistic events. Two events can have a more complex relationship than we have considered so far. They are not restricted to being either independent or mutually exelu- sive. More broadly, events can be correlated. (OF PROBABILITY
