Entropy (Part 1): Randomness by rolling two dice
To understand entropy, I roll dice. I start with two, then move to three, four, ten, and finally Avogadro's number of dice, and roll them randomly.
It is suggested that the difficulty students have in understanding entropy as a measure of randomness can be approached by rolling dice. In the first entry two dice were rolled, but in that case there are only 36 arrangements and 11 outcomes (rolls from 2 to 12). This does not show that the most random state dominates (i.e. the one with the greatest number of arrangements, which for two dice is a roll of 7). To show that, more dice need to be rolled. In this entry three dice are shown to give more randomness in the outcomes (3 to 18).
The basic idea is that a physical system has many different arrangements (states) of particles that are consistent with some macroscopic quantity, such as the temperature. Boltzmann found that, out of all the possible ways those particles can be arranged, only those consistent with the actual temperature need be considered; the chance of any other arrangement is negligible in comparison. Rolling dice illustrates this nicely.
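To make the counting concrete, here is a minimal sketch (plain Python with only the standard library; none of this code appears in the original posts) that enumerates every arrangement and tallies how many produce each outcome:

```python
from itertools import product
from collections import Counter

def arrangement_counts(n_dice):
    """Map each possible sum to the number of arrangements of
    n_dice six-sided dice that produce it."""
    return Counter(sum(faces) for faces in product(range(1, 7), repeat=n_dice))

# Two dice: 36 arrangements, 11 outcomes; 7 peaks with 6 arrangements.
two = arrangement_counts(2)
for outcome in sorted(two):
    print(outcome, two[outcome])

# Three dice: 216 arrangements spread over 16 outcomes (3 to 18);
# sums 10 and 11 tie for the most arrangements, 27 each.
three = arrangement_counts(3)
print(len(three), "outcomes,", sum(three.values()), "arrangements")
```

The peak grows relative to the extremes as dice are added, which is exactly the effect these entries build toward.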
For 10 dice there are over 60 million arrangements (6¹⁰ = 60,466,176), and Figure 1 shows the outcomes for 30,000 rolls.
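The original simulation code is not shown, so as an illustration here is a short NumPy sketch that reproduces the stated setup (10 dice, 30,000 rolls); the seed and variable names are my own:

```python
import numpy as np

rng = np.random.default_rng(seed=1)             # arbitrary seed, for repeatability
rolls = rng.integers(1, 7, size=(30_000, 10))   # 30,000 rolls of 10 six-sided dice
sums = rolls.sum(axis=1)                        # possible outcomes run from 10 to 60

outcomes, counts = np.unique(sums, return_counts=True)
for outcome, count in zip(outcomes, counts):
    print(outcome, count)
```

The counts cluster sharply around 35, the outcome with the most arrangements, while extremes like 10 or 60 (one arrangement each, a probability of roughly 1 in 60 million per roll) effectively never appear in 30,000 rolls.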