Think of a probability distribution (the curves on the following graph). Now let's think about the concept of variance.

For dice, I imagine the variance of the players' averages is very small (i.e. a very tight curve, tighter than the pink one), such that nearly all players would average 3.5 or 3.51, right at the mode/median. (In the graph the center is 0, but pretend it's 3.5.) Of course, there's some chance a player would average 3.6, and by symmetry that probability is the same as for a player averaging 3.4. But how many players would fit this description? That depends on the variance.
With high variance, you'd have a higher chance of seeing individual averages far from the overall average; the probability distribution gets fatter (e.g. the blue curve). With low variance, there's only a very small chance of a player's average deviating far from the overall average; the probability distribution gets thinner (e.g. the pink curve).
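To make "fatter" vs. "thinner" concrete, here's a minimal Python sketch. The two sigma values are just placeholders I picked (not read off the graph); it estimates how often a draw lands more than 0.1 away from the center under a tight curve versus a fat one:

```python
import numpy as np

rng = np.random.default_rng(0)
n_draws = 1_000_000
center = 3.5  # the graph centers at 0, but pretend it's the dice average, 3.5

# Two hypothetical spreads: a tight "pink" curve and a fat "blue" one.
for label, sigma in [("pink (low variance)", 0.02), ("blue (high variance)", 0.2)]:
    draws = rng.normal(center, sigma, n_draws)
    far = np.mean(np.abs(draws - center) > 0.1)  # fraction landing beyond +/- 0.1
    print(f"{label}: P(|avg - 3.5| > 0.1) ~ {far:.4f}")
```

With the tight curve essentially nobody lands outside 3.4–3.6; with the fat one a big chunk of players does.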
I'm assuming the dice produce a distribution of player averages with very low variance, but I can't explain why. Someone more knowledgeable about statistics (and who gives a shit to spend the time explaining) will have to step up.
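As a rough empirical check (my own sketch, not a proof), you can simulate a bunch of players each averaging n fair-die rolls and watch the spread of their averages shrink as n grows. A single die has standard deviation sqrt(35/12), about 1.71, and the spread of the averages comes out close to 1.71 / sqrt(n):

```python
import numpy as np

rng = np.random.default_rng(1)
n_players = 100_000

# Each "player" averages n fair-die rolls; the spread of those averages
# shrinks roughly like sigma / sqrt(n), with sigma = sqrt(35/12) for one die.
for n_rolls in [10, 100, 1000]:
    rolls = rng.integers(1, 7, size=(n_players, n_rolls))  # values 1..6
    player_avgs = rolls.mean(axis=1)
    print(f"n = {n_rolls:5d}: mean ~ {player_avgs.mean():.3f}, "
          f"std of player averages ~ {player_avgs.std():.4f} "
          f"(theory ~ {np.sqrt(35/12) / np.sqrt(n_rolls):.4f})")
```

So the more rolls each player's average is based on, the tighter (more "pink") the curve of player averages gets.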