
Posted on Dec 12, 2011 in Physical Chemistry

Entropy (Part 5): Randomness by rolling Avogadro’s dice

After rolling 2, 3, 4, and 10 dice, as described in the entries below, it becomes clear that the most random state (the one with the greatest number of ways of rolling a given total) always dominates, while states with fewer arrangements occur less often:

Entropy (Part 1): Randomness by rolling two dice

Entropy (Part 2): Randomness by rolling three dice

Entropy (Part 3): Randomness by rolling four dice

Entropy (Part 4): Randomness by rolling ten dice

What about Avogadro's number of dice, say 10²³? The trend is now clear. There are so many random states that they completely swamp all others. We find a single spike:

Figure 1: Entropy


In other words, with this many dice you can roll them as much as you want, and any outcome other than the one at the position of the spike is so unlikely that you can safely ignore it. For ease of writing, consider ten-sided dice. For 10²³ such dice, the chance that they all come up 1 is (1/10) raised to the power 10²³, that is, 10^(−10²³). This number is so tiny that it is all but zero. We can ignore such states.
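To make the sharpening spike concrete, here is a minimal sketch (not from the original post; it assumes NumPy is available) that computes the exact distribution of the sum of N fair ten-sided dice by repeated convolution and reports how narrow the peak is relative to its position:

```python
import numpy as np

def sum_distribution(n_dice, sides=10):
    """Exact probability distribution of the sum of n_dice fair dice."""
    single = np.ones(sides) / sides        # one die: faces 1..sides, equally likely
    dist = single.copy()
    for _ in range(n_dice - 1):
        dist = np.convolve(dist, single)   # add one more die to the sum
    return dist                            # index 0 corresponds to sum = n_dice

for n in (2, 4, 10, 100, 1000):
    dist = sum_distribution(n)
    sums = np.arange(n, n * 10 + 1)
    mean = np.sum(sums * dist)
    std = np.sqrt(np.sum((sums - mean) ** 2 * dist))
    print(f"{n:5d} dice: mean = {mean:8.1f}, relative width = {std / mean:.5f}")
```

The relative width of the peak falls off as 1/√N, so while it is easily visible for 2 or 10 dice, for 10²³ dice it is of order 10⁻¹²: the single spike of Figure 1.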

A blow-up of the base of the figure shows that a few non-random states are still possible; these are fluctuations.

The jump to entropy as a measure of randomness is now easy, and the famous expression on Boltzmann's headstone, S = k log W, can be understood: W is, again, the total number of accessible random states, and k is Boltzmann's constant. I like to make an analogy between Planck's constant, which sets the smallest quantum of energy, and Boltzmann's constant, which sets the scale of the smallest change in entropy.
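As a quick numeric illustration (a sketch, not part of the original post): since W for 10²³ ten-sided dice is 10 raised to the power 10²³, far too large to ever store as a number, we work with ln W directly in Boltzmann's formula.

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K (exact in the 2019 SI)

def boltzmann_entropy(ln_W):
    """Entropy S = k ln W, computed from the natural log of the state count."""
    return k_B * ln_W

# 1e23 ten-sided dice: W = 10**(1e23), so ln W = 1e23 * ln(10)
ln_W = 1e23 * math.log(10)
print(f"S = {boltzmann_entropy(ln_W):.2f} J/K")  # about 3.18 J/K
```

The astronomically large W collapses to an entropy of everyday size precisely because k is so small, which is the point of the analogy with Planck's constant above.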

Note that as soon as states couple (add more dice), the number of random states increases, and the system moves to a new most probable outcome. The same happens with entropy. Open the stopcock and the evacuated bulb fills as the particles move in to occupy the newly accessible states. W increases enormously, and so does the entropy, according to Boltzmann's equation.
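A short sketch of the stopcock example, assuming one mole of ideal gas expanding into an evacuated bulb of equal volume: each particle's accessible volume doubles, so W grows by a factor of 2^N, and Boltzmann's equation gives ΔS = N k ln 2 = n R ln 2.

```python
import math

k_B = 1.380649e-23   # Boltzmann's constant, J/K
N_A = 6.02214076e23  # Avogadro's number, 1/mol

n_moles = 1.0
# Free expansion into double the volume: W -> W * 2**N, so
# delta_S = k * ln(2**N) = N * k * ln 2 = n * R * ln 2
delta_S = n_moles * N_A * k_B * math.log(2)
print(f"dS = {delta_S:.2f} J/K")  # about 5.76 J/K per mole
```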

Including the non-random states makes no difference (unless we are in a situation where fluctuations are large, as at phase transitions), but that is not the point here.

Collections of all those random states are called ensembles and they are discussed next.


The interactive software used here is part of the General Chemistry Tutorial and General Physics Tutorial, from MCH Multimedia. These cover most of the topics found in AP (Advanced Placement) high school courses and in college-level Chemistry and Physics courses.
