Posted on Nov 19, 2011 in Entropy | 3 comments

## Entropy (Part 1): Randomness by rolling two dice

To understand entropy, I roll dice. I start with two, then move to three, four, ten, and finally an Avogadro's number of dice, rolling them at random.
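As a quick sketch of the two-dice case (my own illustration, not code from the post), one can enumerate all 36 equally likely arrangements and count how many produce each total. A total of 7 has the most arrangements, which is why it is the most probable roll:

```python
from itertools import product
from collections import Counter

# All 36 equally likely arrangements (ordered pairs) of two six-sided dice
counts = Counter(sum(roll) for roll in product(range(1, 7), repeat=2))

# One line per possible total, with its number of arrangements
for total in sorted(counts):
    print(total, counts[total])
```

The count peaks at 7 with 6 arrangements out of 36, while 2 and 12 each have only 1.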

Posted on Nov 18, 2011 in Entropy | 0 comments

## Entropy (Part 2): Randomness by rolling three dice

It has been suggested that the difficulty students have in understanding entropy as a measure of randomness can be approached by rolling dice. In the first entry two dice were rolled, but that case has only 36 arrangements and 11 outcomes (totals from 2 to 12). This does not show that the most random state dominates (i.e. the one with the most arrangements, a total of 7). To show that, more dice need to be rolled. In this entry three dice are shown to have more randomness in the outcomes (totals from 3 to 18).
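Extending the same enumeration to three dice (again my own sketch, not the post's code) gives 216 arrangements and 16 possible totals. The peak is now shared by 10 and 11, each realized in 27 ways:

```python
from itertools import product
from collections import Counter

# All 216 equally likely arrangements (ordered triples) of three dice
counts3 = Counter(sum(roll) for roll in product(range(1, 7), repeat=3))

for total in sorted(counts3):
    print(total, counts3[total])
```

The extreme totals 3 and 18 still have a single arrangement each, so the middle totals are already far more dominant than with two dice.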

Posted on Nov 17, 2011 in Entropy | 0 comments

## Entropy (part 3): Randomness by rolling four dice

The basic idea is that a physical system has many different arrangements (states) of its particles that are consistent with some macroscopic quantity, such as the temperature. Boltzmann found that, out of all the ways the particles could be arranged, only those consistent with the actual temperature need be considered: the chance of any other arrangement is negligible in comparison. Rolling dice illustrates this nicely.