Overview of Methodology: As per the project instructions, I created a series of spreadsheets in Microsoft Excel, using the formulas provided to generate probability sets detailing the possible multiplicity states and the percentage spread of those states for each set of coins. The numbers were graphed and exported to this document for further analysis. The data was then examined to answer the following questions:

1. Are the low multiplicity macrostates prohibited?
No; as a matter of physics and logic, any given state of multiplicity (a certain tally of heads on the tossed coins) is possible, but each state is more or less likely than the others due to the number of possible combinations which would generate that specific outcome.

2. Are the high multiplicity macrostates certain?
No; as stated above, any particular state (high entropy or low) is possible on any particular throw. That having been said, multiplicity states which can be achieved in more ways are intrinsically more likely to occur.

3. Are low entropy states prohibited?
No; as stated above, they are merely unlikely, NOT prohibited.

4. Are high entropy states certain?
No; as stated above, they are merely more likely to occur; any given entropy state is possible in any particular instance of a random system.

5. If you started out with the coins in a lower entropy state (say with 2 heads out of 10 coins), is it possible that you might, if you tossed the coins again, arrive at a lower entropy state (say with 0 or 1 heads)?
Yes; any particular state (high entropy or low) is possible on any particular throw.

6. Is this likely to happen?
No. "Likely," by definition, is a matter of probability. The fewer possible means by which a particular result can be achieved, the less likely that result is to occur.

7. Which of the following would you agree with? A system can never go from higher entropy to lower entropy. A system will always go in the direction of higher entropy. A system will usually go from a lower entropy state to a higher entropy state.
The last statement: a system will usually go from a lower entropy state to a higher entropy state.

8.
Would you agree so far with this statement of the Second Law of Thermodynamics: "In a closed system (in this case one where we don't reach in and flip coins over after they come to rest following a toss) entropy must always increase"? Explain. If you disagree, what would you say instead?
I would suggest that in a closed system (one free of outside interference), entropy is more likely to increase over time, as high entropy results are more probable than low entropy results. Low entropy results remain possible, however unlikely, which disqualifies the generalization made above.

9. What happens to the probability of the lowest entropy states as the number of coins increases?
As the number of coins increases, the lowest entropy results are realized by a smaller and smaller fraction of the possible outcomes. The end result is that the lowest entropy results become more and more unlikely.
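The answer to question 9 can be checked directly: for N fair coins, the lowest entropy macrostate (0 heads) is realized by exactly one of the 2^N equally likely microstates, so its probability is 1/2^N. A minimal sketch in Python (the coin counts here are illustrative, not taken from the spreadsheets):

```python
from math import comb

def p_lowest_entropy(n_coins):
    """Probability of the lowest entropy macrostate (0 heads):
    1 microstate out of the 2**n_coins equally likely ones."""
    return comb(n_coins, 0) / 2 ** n_coins

# The probability falls off exponentially as coins are added.
for n in (4, 10, 20, 50):
    print(f"{n:2d} coins: P(0 heads) = {p_lowest_entropy(n):.3e}")
```

With 10 coins the all-tails throw already has probability 1/1024; with 50 coins it is below one in a quadrillion, which is why the state is never prohibited yet effectively never seen.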
10. With a closed system with a lot of coins (or atoms or molecules), would you agree with stating the Second Law of Thermodynamics this way: "Entropy in such a system is not certain to increase with time but very, very likely to do so"?
Yes.

11. Suppose you saw a picture of a set of 10 coins with only 1 of the coins heads up and also a picture of the same set of 10 coins with 4 of the coins heads up; which one would you say occurred first?
As a matter of pure probability, the first picture is likely to have occurred first. Given that we have only the two samples, however, no definitive conclusion is possible.

12. Is it fair to say that entropy shows an arrow of time?
Yes; but only with the proviso that the arrow in question is not infallible.

13. Does the arrow become more or less distinct with an increase in the number of coins (or atoms or molecules)?
The arrow becomes more distinct with an increasing number of coins, as an ever smaller fraction of outcomes corresponds to a low entropy state. Conversely, the arrow becomes less distinct as time passes. A closed system will proceed from a highly ordered state to a state of maximum disorder. Early samples of the system will show a (comparatively) clear progression from order toward disorder. Later samples (as the system nears maximum entropy) will be more and more difficult to distinguish from other recent states.

14. Is there a connection between probability, entropy, the Second Law of Thermodynamics, and time?
The Second Law of Thermodynamics states that entropy in an isolated (closed) system can never decrease. The premise is that, given time, any system will naturally decay to a state of thermodynamic equilibrium (or maximum entropy). A gas emptied into a cylinder will expand until it reaches the limits of the container. A chemical poured into the ocean will intermix with the sea water until it has reached the maximum dissolution (dispersal) possible.
The only requirement is time, and as time passes, low entropy results become vanishingly unlikely.
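The multiplicity and percentage spread tabulated in the spreadsheets follow from the binomial coefficient: a macrostate with k heads out of N coins has multiplicity C(N, k), and with 2^N equally likely microstates its probability is C(N, k)/2^N. A minimal sketch of the same calculation (in Python rather than Excel; the 10-coin case is illustrative):

```python
from math import comb

def multiplicity_table(n_coins):
    """(heads, multiplicity, probability) for each macrostate of n_coins fair coins."""
    total = 2 ** n_coins  # number of equally likely microstates
    return [(k, comb(n_coins, k), comb(n_coins, k) / total)
            for k in range(n_coins + 1)]

# Middle macrostates dominate; the extremes (0 or 10 heads) are rarest.
for heads, omega, prob in multiplicity_table(10):
    print(f"{heads:2d} heads: multiplicity {omega:4d}, probability {prob:7.2%}")
```

For 10 coins the 5-heads macrostate has multiplicity 252 out of 1024 microstates (about 24.6%), while 0 heads has multiplicity 1, which is the asymmetry behind every answer above.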