Posted on Nov 19, 2011 in Entropy | 3 comments

# Entropy (Part 1): Randomness by rolling two dice

Students have trouble with the concept of entropy.  We tell them that entropy is a quantitative measure of the randomness of a system, but what does that really mean, and how do we explain it clearly?  We can discover entropy as a state function, as Clausius did building on Carnot's cycle, and then study it by measuring different systems.  Of course we find that entropy always increases for spontaneous processes, leading to the second law of thermodynamics.

To me, entropy is a substance as tangible as energy.  In equilibrium statistical mechanics one can either minimize the energy or maximize the entropy and arrive at the same conclusions. Entropy is the essence of the second law, so it is essential that the concept of randomness be clear.

There are many ways we try to explain the notion of the most random state.  A common example is two bulbs connected by a stopcock, one evacuated and the other full of gas: open the stopcock, talk about the number of accessible states increasing, and the system moves to occupy them, the random state. Then we explain that there is a small but finite probability that the gas stays on one side even with the stopcock open, but the chance is so small it never happens. I think such approaches tend to confuse students. They need to visualize randomness.

So what I do is roll dice. I start with two, then move to three, four, ten, and then Avogadro's constant of dice, and roll them randomly.  In every case, once the number of rolls becomes statistical, the distribution is Gaussian (Normal). As the number of dice increases, the random states start to dominate until the chance of the less random, ordered states becomes negligible.

I think this is easier to visualize.  So in this post I will roll 2 dice, and in the next four I will roll 3, 4, 10, and Avogadro's constant of dice.  Finally, I will say a few words about ensembles.

Consider a die as a particle with 6 states, any one of which comes up randomly. One die therefore has 6 equally probable outcomes.  Two dice, however, have 36 possible arrangements but only 11 outcomes (2 through 12), so there are more ways to roll some outcomes than others.  Whereas only one arrangement (snake eyes) gives a 2, six arrangements give a 7: (6,1), (1,6), (5,2), (2,5), (4,3), (3,4). We will denote the number of arrangements by W, as Boltzmann did. For two dice the total is W = 36, while W = 6 for the outcome 7.
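The counting above is small enough to check by brute force. Here is a short Python sketch (my own illustration, not part of the original post) that enumerates all 36 arrangements of two dice and tallies W for each outcome:

```python
from collections import Counter

# Enumerate every arrangement (ordered pair of faces) of two dice.
arrangements = [(a, b) for a in range(1, 7) for b in range(1, 7)]

# W[s] = number of arrangements whose faces sum to the outcome s.
W = Counter(a + b for a, b in arrangements)

print(len(arrangements))   # 36 arrangements in total
print(len(W))              # 11 outcomes (2 through 12)
print(W[7], W[2], W[12])   # 6 arrangements for a 7, 1 each for a 2 and a 12
```

The run confirms the text: a 7 has six times as many arrangements as snake eyes.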

The concept of arrangements being consistent with an outcome also becomes clearer. There are six for a roll of 7, five for a roll of 6 or 8, four for a 5 or 9, and so on. The analogy to an ideal gas is then made: asking how many arrangements of the molecules, out of the total number possible, are consistent with a temperature of, say, 300 K is analogous to asking how many arrangements give a roll of 7.

So when the 2 dice are rolled, the most random state, 7, comes up 6 times more often than a 2 or a 12. The more random a state is, the more probable it becomes, because more arrangements are consistent with it than with any other roll.

The result is that as the number of dice increases, at some point the probability of the random states starts to dominate all others, and finally one can replace all the states by only the random states.  That is, in the limit of a large number of dice, W is essentially equal to Wrandom.  But this is not true for two dice with only 36 arrangements. To get to that point a lot of dice (or a lot of molecules) are needed.
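This sharpening of the distribution can be seen without any gambling, by exact counting. The sketch below (again my own illustration, with a window of 10% around the mean chosen arbitrarily) computes, for increasing numbers of dice, what fraction of all arrangements give a sum close to the most random value:

```python
from collections import Counter

def sum_distribution(n_dice):
    """Exact arrangement counts for the sum of n_dice six-sided dice,
    built by repeated convolution with one die."""
    dist = Counter({0: 1})
    for _ in range(n_dice):
        new = Counter()
        for total, ways in dist.items():
            for face in range(1, 7):
                new[total + face] += ways
        dist = new
    return dist

# Fraction of all 6**n arrangements whose sum lies within 10% of the mean 3.5*n.
for n in (2, 4, 10, 50):
    dist = sum_distribution(n)
    mean = 3.5 * n
    near = sum(w for s, w in dist.items() if abs(s - mean) <= 0.1 * mean)
    print(n, near / 6**n)
```

For two dice the "near the mean" fraction is only 6/36, but it grows steadily with the number of dice, which is exactly the sense in which W becomes essentially equal to Wrandom for a large number of dice.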

Here are the results:

Rolling two dice 2000 times (Figure 1; click the graph to enlarge):  This is not quite statistical, as you can see from the bars that do not fall exactly on the horizontal lines (this is luck in gambling), but it makes the point.  Notice that the number of hits is greatest for 7 (chance of 6/36) and smallest for 2 or 12 (chance of 1/36).

In Figure 2 you can see the frequency table: the total number of rolls is recorded and the number of hits for each outcome is noted. The probability is simply the ratio of the two.
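Readers without the interactive software can reproduce the experiment in a few lines of Python. This sketch (my own; the seed is arbitrary and only makes the run repeatable) rolls two dice 2000 times and prints a frequency table like Figure 2, with the estimated probability hits/rolls next to the exact probability W/36:

```python
import random
from collections import Counter

random.seed(1)  # arbitrary seed, just to make the run repeatable
ROLLS = 2000
hits = Counter(random.randint(1, 6) + random.randint(1, 6) for _ in range(ROLLS))

# Exact arrangement counts W for each outcome of two dice.
W = Counter(a + b for a in range(1, 7) for b in range(1, 7))

print("outcome  hits  hits/rolls  W/36")
for outcome in range(2, 13):
    print(f"{outcome:7d}  {hits[outcome]:4d}  {hits[outcome]/ROLLS:10.3f}  {W[outcome]/36:.3f}")
```

As in Figure 1, the bars do not land exactly on the theoretical values, but 7 clearly dominates and 2 and 12 are rarest.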

In the next entry I will roll three dice.

In the meantime, check out this song on thermodynamics by Flanders and Swann.

The interactive software used in this video is part of the General Chemistry Tutorial and General Physics Tutorial from MCH Multimedia. These cover most of the topics found in AP (Advanced Placement) and college-level chemistry and physics courses.

# 3 Comments

1. Is statistical entropy *really* the same as chemical entropy? I don’t understand why disorder or randomness would be connected to the energy of a system.

2. I am not sure I understand your point. Bonds involve energy, and this leads to more or fewer particles. These become ordered or disordered, and entropy is a quantitative measure of disorder (or randomness). I am just trying to show with my posts here that the most probable state is the most random and we can ignore all the others.

Perhaps you can be more specific?

Thanks

3. Hi Eric
Thank you for your comments which are indeed very interesting in light of my recent work. I will certainly look at your website as soon as I can and get back to you with any comments.