I think you are wrong: at a casino, the "normal" dice *are* the precision dice. Non-precision dice are not allowed at casinos, or at some ASL tournaments, because of their less exacting production process.

ASL is not a commercial operation the way running a casino is. If you want to calculate your profits beforehand, or large sums of money are involved, precision dice might help.

And luckily, there is also nothing otherwise important at stake when playing ASL. It is a game. If people take the game so seriously that the *difference* between precision dice and normal dice really becomes important, then something is wrong.

So let me pose the question to you:

Have you ever run a test to establish how the bell curves differ when comparing a (low production process) set of dice as found in any of MMP's boxed core modules against a set of your precision dice, based on a sample of 1000 DRs? Or has anyone else?

I do not want to tease you or anyone else. I am honestly interested in establishing what extent of difference we are talking about. If we have some data available, then our arguments either way may gain or lose some weight.

To put facts behind my words, I did the work. I am not into math, so I leave it to the number-crunchers to figure out the results that can be pulled from the data:

Sample of 1000 DRs of BV3 dice rolled the way I usually do (no dice tower, dice cup on desk):

| DR | Number of times, my dice | Number of times expected with "perfect" dice |
|---:|---:|---:|
| 2 | 24 | 27.78 |
| 3 | 46 | 55.56 |
| 4 | 79 | 83.33 |
| 5 | 107 | 111.11 |
| 6 | 138 | 138.89 |
| 7 | 159 | 166.67 |
| 8 | 141 | 138.89 |
| 9 | 125 | 111.11 |
| 10 | 80 | 83.33 |
| 11 | 72 | 55.56 |
| 12 | 23 | 27.78 |

My DR total in 1000 rolls is 7099 pips (an average of 7.099 per DR); the "perfect" expected total would be 7000 pips (an average of 7.000 per DR).
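The "expected" column and the expected total can be reproduced in a few lines. A minimal sketch (the variable names are my own, not from any ASL tool):

```python
from fractions import Fraction

# Count the 36 equally likely ordered pairs of two fair d6 per DR (sum).
ways = {s: 0 for s in range(2, 13)}
for a in range(1, 7):
    for b in range(1, 7):
        ways[a + b] += 1

N = 1000  # sample size
expected = {s: Fraction(ways[s], 36) * N for s in ways}

for s in range(2, 13):
    print(s, float(expected[s]))  # 27.78, 55.56, ... matching the table

# Expected total pips over 1000 DRs: exactly 7000.
expected_total = sum(s * expected[s] for s in expected)
print(float(expected_total))  # 7000.0
```

Using exact fractions avoids the strings of repeating decimals; the rounded values in the table above are these numbers to two places.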

I believe that up to now, nobody considers whining about the dice, or is convinced that the injustice of it all would warrant a **mandatory requirement** of precision dice.

From here on, please, math gurus, correct me if I have screwed up somewhere. As mentioned above, I am not into math.

**Empirical variance** for my dice would be:

24 × (2−7.099)² + 46 × (3−7.099)² + 79 × (4−7.099)² + 107 × (5−7.099)² + 138 × (6−7.099)² + 159 × (7−7.099)² + 141 × (8−7.099)² + 125 × (9−7.099)² + 80 × (10−7.099)² + 72 × (11−7.099)² + 23 × (12−7.099)² = 5682.82

5682.82 / 1000 (i.e. the sample size) = 5.68282 = empirical variance for my dice at a 1000 DR sample.

Now the variance for "perfect" dice:

27.78 × (2−7)² + 55.56 × (3−7)² + 83.33 × (4−7)² + 111.11 × (5−7)² + 138.89 × (6−7)² + 166.67 × (7−7)² + 138.89 × (8−7)² + 111.11 × (9−7)² + 83.33 × (10−7)² + 55.56 × (11−7)² + 27.78 × (12−7)² = 5833.33

5833.33 / 1000 = 5.83333 = variance for "perfect" dice at a 1000 sample (this is the textbook value 35/6 for the sum of two fair dice).

**Standard deviation** would be the square root of the **empirical variance**:

√5.68282 = 2.3839 is the standard deviation per DR for my dice.

√5.83333 = 2.41523 is the standard deviation per DR for "perfect" dice.

If I understand it correctly, the standard deviation is measured in pips per DR, not in percent, and for the *total* of 1000 independent DRs it scales with √1000. "Perfect" dice at a 1000 sample could expect to roll a total of

7000 +/− √(1000 × 5.83333) ≈ 7000 +/− 76.4, or in other words between roughly 6923.6 and 7076.4 about two times out of three.

My dice, around my observed total, give

7099 +/− √(1000 × 5.68282) ≈ 7099 +/− 75.4, or in other words between roughly 7023.6 and 7174.4.

Well, so with my low production process dice my total of 7099 exceeds the one-standard-deviation band of "perfect" dice by 22.6 pips in 1000 DRs (factual 7099 as opposed to 7076.4, the upper edge of the band), which is only about 1.3 standard deviations above the expected 7000. What a scandal. I am devastated.

Now granted, the **worst that could conceivably be expected to happen to me** within one standard deviation, pitting "perfect" dice against my low production process BV3 ones, could be a **DR average of 6.924 against 7.174 in 1000 DRs to my disfavor**.

I am crushed. Oh, the injustice of it. It must have turned the game against me. Seriously? I better not even contemplate that **at the same time, within the same expectations, it could have optimally turned out** to be a **DR average of 7.076 against 7.024 in 1000 DRs to my favor.**
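For anyone who wants to verify the spread of a 1000 DR total from first principles, a minimal sketch (nothing ASL-specific is assumed, only two fair d6):

```python
import math

# Exact distribution of one DR (sum of two fair d6): 6 - |s - 7| ways out of 36.
probs = {s: (6 - abs(s - 7)) / 36 for s in range(2, 13)}

mean = sum(s * p for s, p in probs.items())              # 7.0
var = sum((s - mean) ** 2 * p for s, p in probs.items())  # 35/6 ≈ 5.8333

# The total of n independent DRs has standard deviation sqrt(n * var).
n = 1000
sd_total = math.sqrt(n * var)          # ≈ 76.4 pips
z = (7099 - n * mean) / sd_total       # how unusual my 7099 total is
print(f"sd_total={sd_total:.1f} z={z:.2f}")
```

A z-score around 1.3 is the kind of deviation plain chance produces in roughly one sample out of ten on the high side alone, i.e. nothing remarkable.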

At this point, I find it very hard to understand people insisting that precision dice make a *significant* difference.

Have I just been lucky with the set of low production process dice of BV3?

Not impossible, but rather unlikely, I would think. Everybody, please feel invited to provide your own 1000 DR sample to broaden the database. And note that I have pitted my dice against "perfect" dice, not against precision dice, which will themselves fall somewhat short of "perfect" dice.
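To make contributing a sample easier, here is a minimal sketch for tallying and checking 1000 DRs. The simulated rolls are only a stand-in so the snippet runs on its own; substitute the list with your real recorded DRs:

```python
import random
from collections import Counter

random.seed(1)  # reproducible demo; replace `rolls` with your 1000 recorded DRs
rolls = [random.randint(1, 6) + random.randint(1, 6) for _ in range(1000)]

tally = Counter(rolls)
n = len(rolls)
mean = sum(rolls) / n
variance = sum((r - mean) ** 2 for r in rolls) / n

# Chi-square statistic against the exact 2d6 distribution:
# the expected count for DR s is n * (6 - |s - 7|) / 36.
chi2 = sum(
    (tally.get(s, 0) - n * (6 - abs(s - 7)) / 36) ** 2
    / (n * (6 - abs(s - 7)) / 36)
    for s in range(2, 13)
)
print(f"mean={mean:.3f} variance={variance:.3f} chi2={chi2:.2f}")
```

With 10 degrees of freedom, a chi-square value far above 18 or so would be the first hint that a set of dice actually misbehaves; I leave the formal test to the number-crunchers.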

So as a bottom line based on the above, I conclude that precision dice do not matter for the purposes of ASL. What they might do is soothe superstition, offer a promise of more "security", and so on. They obviously calm the psyche of some, and I am perfectly fine with people who see the need to use precision dice for whatever reason.

But I cannot follow people who try to tell me that using precision dice or not has *in itself* any significant impact on the game, and that their use should therefore be made mandatory for tournaments etc. Even less can I accept people who allege dishonest motives to those who do not use them or decline their use.

**DR average of 6.924 against 7.174 in 1000 DRs to my disfavor.**

**DR average of 7.076 against 7.024 in 1000 DRs to my favor.**

I am convinced that the above differences are *not* what usually decides tournaments. It is the *differences in the skill levels of the players* that decide them, and those differences vary to a far greater extent than the above DR averages, leaving no room for a *significant* impact of the latter.

And it's *when* you roll crap or superb that matters.

Cheers,

von Marwitz