
Posted on Dec 5, 2011 in Physical Chemistry | 2 comments

Entropy (Part 4): Randomness by rolling ten dice

To illustrate the concept of randomness as it pertains to entropy, this series of entries rolls progressively more dice:

Entropy 1: Randomness by rolling two dice

Entropy 2: Randomness by rolling three dice

Entropy 3: Randomness by rolling four dice

A die with six random states is used to illustrate a particle, so as the number of dice increases, so does the number of states. For n dice there are 6^n different ways they can be rolled. The total that comes up most frequently is the one with the greatest number of arrangements. As the number of dice increases, the most random states become more and more likely, as seen in the above entries for 2, 3 and 4 dice. Now we jump to 10.
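The pattern is easy to check numerically. Here is a minimal simulation sketch (my own illustration, not the MCH software; the helper name roll_sums is hypothetical) that rolls n dice many times and tallies the totals; the peak of the tally sits at the total with the greatest number of arrangements.

```python
import random
from collections import Counter

def roll_sums(n_dice, n_rolls, seed=0):
    """Roll n_dice fair six-sided dice n_rolls times and tally each total."""
    rng = random.Random(seed)
    return Counter(
        sum(rng.randint(1, 6) for _ in range(n_dice))
        for _ in range(n_rolls)
    )

# With four dice the middle totals (13-15) come up far more often than
# the extremes (4 or 24), because they have many more arrangements.
counts = roll_sums(n_dice=4, n_rolls=30000)
for total in sorted(counts):
    print(total, counts[total])
```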

For 10 dice there are over 60 million arrangements (6^10 = 60,466,176), and Figure 4 shows the outcomes for 30,000 rolls. This can be compared to Figures 1 to 3.

For ten dice, the chance of a total lower than 20 or greater than 50 is negligible. The chance of rolling ten ones is one in 6^10, about one in 60 million. The most random states dominate. And this is only for 10 dice; the next case will be Avogadro’s dice, with 6^(10^23) states, which is vastly more than 60 million.
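For ten dice the distribution can still be computed exactly rather than simulated. The sketch below (again my own illustration; arrangement_counts is a hypothetical helper) convolves the single-die distribution with itself ten times and confirms both claims above.

```python
def arrangement_counts(n_dice, faces=6):
    """Exact number of ways each total can be rolled with n_dice fair dice,
    built by repeatedly convolving in the distribution of one die."""
    counts = {0: 1}
    for _ in range(n_dice):
        nxt = {}
        for total, ways in counts.items():
            for face in range(1, faces + 1):
                nxt[total + face] = nxt.get(total + face, 0) + ways
        counts = nxt
    return counts

c = arrangement_counts(10)
total_states = 6 ** 10          # 60,466,176 arrangements in all
tails = sum(ways for t, ways in c.items() if t < 20 or t > 50)
print("P(ten ones) = 1/%d" % total_states)               # about 1.7e-8
print("P(total < 20 or > 50) =", tails / total_states)   # well under 1%
```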

[Figure 1: distribution of totals for two dice]

[Figure 2: distribution of totals for three dice]

[Figure 3: distribution of totals for four dice]

[Figure 4: distribution of totals for 30,000 rolls of ten dice]


The interactive software used in this post is part of the General Chemistry Tutorial and General Physics Tutorial from MCH Multimedia. These cover most of the topics found in AP (Advanced Placement) and college-level Chemistry and Physics courses.


2 Comments

  1. I’m not so sure that the classical example of probabilistic randomness shown above, the distribution of material objects that have the same energy in any configuration (unless the dice are marked), can still be considered the best one for entropy.
    In entropy changes, a redistribution of energy among interacting entities is involved, whereas there is no interaction (or exchange of communication) between two dice; otherwise, it would be possible to win simply by betting on “overdue” numbers or combinations. Molecules in two bulbs connected by a stopcock, by contrast, are mutually exchanging energy, and in this way entropy increases.
    I think that, to be coherent with the physical definition of entropy, a more comprehensible approach is the one suggested by Prof. Frank A. Lambert, who discards classical examples of shuffled cards or messy teenagers’ rooms in favour of the accessibility of a wider number of energetic states. It also gives me a better understanding of Boltzmann’s equation, with the enormous value that W reaches just a few kelvin above absolute zero. What is your opinion about this?

  2. Of course statistical entropy is treated as ensembles of particles that are constrained by energy and other parameters. The dice are constrained by the number rolled and the number of faces. Let us suppose that if the temperature increases, the number of dice faces increases, and vice versa. Within these constraints the number of accessible states changes, and the random states dominate if there are enough of them. Constraints are not the point of these entries. The point is that the number of accessible states is dominated by the most random states, and for this reason entropy is a quantitative measure of randomness. I looked at some of Frank A. Lambert’s postings and questions and answers. I do not think they differ much from my ideas at equilibrium. It is just that I only want to show what happens as the number of non-interacting states increases (by non-interacting I mean that rolls of (1,6), (6,1), (2,5), (5,2), (3,4), (4,3) are independent and degenerate, all giving the outcome 7). In the two-bulb experiment, the gases will redistribute amongst the newly accessible states even if the particles do not interact.
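    To make the degeneracy concrete, here is a quick enumeration (my own sketch, not taken from the tutorial software) of the two-dice arrangements behind the outcome 7:

```python
from itertools import product

# All ordered rolls of two fair dice whose faces sum to 7: six distinct,
# equally likely arrangements that are degenerate in the outcome.
sevens = [(a, b) for a, b in product(range(1, 7), repeat=2) if a + b == 7]
print(sevens)       # [(1, 6), (2, 5), (3, 4), (4, 3), (5, 2), (6, 1)]
print(len(sevens), "of 36 arrangements")  # 6/36 = 1/6, the most likely total
```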

    I discuss entropy in more detail in our textbook, Laidler, Meiser and Me, Physical Chemistry: http://www.mchmultimedia.com/store/Statistical-Mechanics.html

    Does this clarify my motivation? I would like to know whether it is consistent with your views.
