We roll a six-sided die ten times. What is the probability that the total of all ten rolls is divisible by 6?


I think this question is really hard. I tried a direct approach, calculating the probability of each combination whose sum is divisible by six, but that would take forever. Does anyone have any ideas?



There are 7 answers below.

On BEST ANSWER

Hint.

Roll $9$ times and let $x$ be the total.

For exactly one number $n\in\{1,2,3,4,5,6\}$ we will have $6 \mid (x+n)$ (i.e. $x+n$ is divisible by $6$).
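The hint can be checked mechanically: whatever total $x$ the first nine rolls produce, the six candidate totals $x+1,\dots,x+6$ are consecutive integers, so exactly one of them is divisible by $6$. A small sketch (the class name is illustrative, not from the original answer):

```java
public class ConsecutiveCheck {
    public static void main(String[] args) {
        // The nine-roll total x can be anything from 9 to 54; for each
        // possible x, count how many of x+1..x+6 are divisible by 6.
        for (int x = 9; x <= 54; x++) {
            int hits = 0;
            for (int n = 1; n <= 6; n++) {
                if ((x + n) % 6 == 0) hits++;
            }
            if (hits != 1) throw new AssertionError("x=" + x + " has " + hits + " hits");
        }
        System.out.println("exactly one of x+1..x+6 is divisible by 6, for every x");
    }
}
```

Since the tenth roll picks each of $n = 1,\dots,6$ with probability $\frac16$, the unique winning $n$ is hit with probability $\frac16$.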

On

Couldn't you also think of it this way: the maximum possible total for rolling the die 10 times is 60. How many numbers between 1 and 60 are divisible by 6? Ten (6·1, 6·2, 6·3, and so on up to 6·10). So 10 out of 60 possible values gives... 1/6. I love math.

On

There are 3 variables in this case:

  • the number of sides of the die: s (e.g. 6)
  • the number of throws: t (e.g. 10)
  • the requested multiple: x (e.g. 6)

In this case, the conditions are simple:

  • s >= x
  • x > 0
  • t > 0

And the answer is simple too: throwing a sum that is a multiple of 6 has probability 1/6.

$P(s,t,x) = 1/x$

For situations where s < x this is not entirely correct, though it approaches the same result for a large number of throws. Example: if you throw a 6-sided die 30 times, the chance that the sum is a multiple of 20 will be about 5%. Proving this is a bit of a challenge.

$\lim \limits_{t \to \infty} P(s,t,x) = 1/x$.
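Brute-force enumeration of $6^{30}$ combinations is infeasible, but the exact probability can be computed by tracking only the running sum modulo x, convolving the residue distribution once per throw. A sketch (the class and method names are illustrative, not from the original answer):

```java
public class ModularDP {
    // Exact probability that the sum of t throws of a fair s-sided die
    // is a multiple of x, via dynamic programming over residues mod x.
    static double probability(int s, int t, int x) {
        double[] dist = new double[x];
        dist[0] = 1.0; // the sum of zero throws is 0
        for (int throwNo = 0; throwNo < t; throwNo++) {
            double[] next = new double[x];
            for (int r = 0; r < x; r++) {
                if (dist[r] == 0) continue;
                for (int face = 1; face <= s; face++) {
                    next[(r + face) % x] += dist[r] / s;
                }
            }
            dist = next;
        }
        return dist[0];
    }

    public static void main(String[] args) {
        // s = 6, t = 10, x = 6: exactly 1/6
        System.out.println(probability(6, 10, 6));
        // s = 6, t = 30, x = 20: close to, but not exactly, 1/20 = 5%
        System.out.println(probability(6, 30, 20));
    }
}
```

This confirms both claims above: for s >= x the probability is exactly 1/x, and for s < x it is only approximately 1/x, with the deviation shrinking as t grows.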

Nevertheless, if programming is an acceptable proof:

public static void main(String[] args) {
    int t_throws = 10;   // t: number of throws
    int s_sides = 6;     // s: number of sides of the die
    int x_multiple = 6;  // x: the requested multiple

    // current combination of dice values, starting at all ones
    int[] diceCurrentValues = new int[t_throws];
    java.util.Arrays.fill(diceCurrentValues, 1);

    long combinations = 0;
    long matches = 0;
    boolean done = false;
    while (!done) {
        // calculate the sum of the current combination
        int sum = 0;
        for (int diceValue : diceCurrentValues) sum += diceValue;

        combinations++;
        if (sum % x_multiple == 0) matches++;

        // create the next dice combination (odometer-style increment)
        int dicePointer = 0;
        boolean incremented = false;
        while (!incremented) {
            if (dicePointer == diceCurrentValues.length) {
                done = true; // all s^t combinations enumerated
                break;
            }
            if (diceCurrentValues[dicePointer] == s_sides) {
                diceCurrentValues[dicePointer] = 1; // carry into the next die
                dicePointer++;
            } else {
                diceCurrentValues[dicePointer]++;
                incremented = true;
            }
        }
    }
    // print only the final tally; printing on every iteration would emit
    // ~60 million lines and slow the run down enormously
    System.out.println("status: " + matches + "/" + combinations
            + "=" + (matches * 100 / (double) combinations) + "%");
}

EDIT:

Here's another example. If you throw a 6-sided die 10 times, there is a 1/4 probability that the sum is a multiple of 4. Run the program above with the following parameters:

    int t_throws = 10;
    int s_sides = 6;
    int x_multiple = 4;

The program will show the final output: status: 15116544/60466176=25.0%. That means that there are 60466176 combinations (i.e. 6^10), of which 15116544 have a sum that is a multiple of 4. So that's 25% (= 1/4).

This just follows the formula mentioned above (i.e. P(s,t,x) = 1/x); x is 4 in this case.

On

After rolling the die once, each result modulo 6 is equally likely. Adding any independent integer-valued random variable preserves this equidistribution. So you can even roll a 20-sided die afterwards and add its outcome: the total sum will still be divisible by 6 with probability 1/6.
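The d6 + d20 claim can be verified exhaustively: over all 6 × 20 = 120 equally likely outcome pairs, each residue of the total modulo 6 occurs exactly 20 times. A sketch (the class name is illustrative):

```java
public class MixedDice {
    public static void main(String[] args) {
        int[] residueCounts = new int[6];
        // enumerate all 120 equally likely (d6, d20) outcome pairs
        for (int a = 1; a <= 6; a++)
            for (int b = 1; b <= 20; b++)
                residueCounts[(a + b) % 6]++;
        // each residue class appears 120 / 6 = 20 times
        for (int r = 0; r < 6; r++)
            System.out.println("residue " + r + ": " + residueCounts[r] + "/120");
    }
}
```

The reason is visible in the loop: for each fixed d20 value b, the six d6 values a hit each residue class exactly once.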

On

If you want something a little more formal and solid than drhab's clever and brilliant answer:

Let $P(k,n)$ be the probability of rolling a total with remainder $k$ when divided by $6$ $(k = 0,\dots,5)$ with $n$ dice.

$P(k, 1)$ = probability of rolling a $k$ if $k \ne 0$, or a $6$ if $k = 0$; in either case $P(k, 1) = \frac 1 6$.

For $n > 1$: $ P(k,n) = \sum_{j= 0}^5 P(j, n-1)\cdot \Pr(\text{rolling a value} \equiv k - j \pmod 6) = \sum_{j= 0}^5 P(j, n-1)\cdot\frac 1 6= \frac 1 6\sum_{j= 0}^5 P(j, n-1)= \frac 1 6 \cdot 1 = \frac 1 6$

This is drhab's answer, but in formal terms, without appeals to common sense.
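The recursion can also be run directly: start from the base case $P(k,1) = \frac16$ and convolve with a fair die at each step; every residue stays at $\frac16$. A sketch (the class name is illustrative):

```java
public class RemainderRecursion {
    public static void main(String[] args) {
        int n = 10;
        double[] p = new double[6];
        // base case P(k, 1): one die, each remainder mod 6 has probability 1/6
        java.util.Arrays.fill(p, 1.0 / 6);
        for (int roll = 2; roll <= n; roll++) {
            double[] next = new double[6];
            // P(k, roll) = sum_j P(j, roll-1) * (1/6), since a fair die
            // reaches any required residue k - j (mod 6) with probability 1/6
            for (int k = 0; k < 6; k++)
                for (int j = 0; j < 6; j++)
                    next[k] += p[j] * (1.0 / 6);
            p = next;
        }
        System.out.println(java.util.Arrays.toString(p)); // every entry is 1/6
    }
}
```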

On

In spite of all the great answers given here, why not give another proof, from another point of view. We have 10 random variables $X_i$, $i=1,\dots,10$, taking values in $[6]=\{1,\dots,6\}$, and we are interested in the distribution of $Z$ defined as $$ Z=X_1\oplus X_2\oplus \dots \oplus X_{10} $$ where $\oplus$ is addition modulo $6$. We can proceed by two different, yet similar, proofs.


First proof: If $X_1$ and $X_2$ are two random variables over $[6]$, and $X_1$ is uniformly distributed, a direct calculation shows that $X_1\oplus X_2$ is also uniformly distributed. Applying the same argument repeatedly yields that $Z$ is uniformly distributed over $[6]$.

Remark: This proves a more general statement. Even if only one of the dice is fair, i.e. each side appears with probability $\frac 16$, the distribution of $Z$ will be uniform and hence $\mathbb P(Z=0)=\frac 16$.
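The remark is easy to check numerically: pair one fair die with an arbitrarily biased one and the sum modulo 6 is still uniform. A sketch, with a made-up bias (the class name and the bias values are illustrative):

```java
public class BiasedPlusFair {
    public static void main(String[] args) {
        // an arbitrary, heavily biased distribution for the second die (faces 1..6)
        double[] biased = {0.5, 0.2, 0.1, 0.1, 0.05, 0.05};
        double[] sumDist = new double[6];
        for (int a = 1; a <= 6; a++)        // fair die: each face has probability 1/6
            for (int b = 1; b <= 6; b++)    // biased die: face b has probability biased[b-1]
                sumDist[(a + b) % 6] += (1.0 / 6) * biased[b - 1];
        System.out.println(java.util.Arrays.toString(sumDist)); // every entry is 1/6
    }
}
```

For each fixed biased outcome b, the six fair-die values spread its probability mass evenly over all residues, so the bias cancels out.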


Second proof: This proof draws on (simple) information-theoretic tools and assumes the corresponding background. The random variable $Z$ is the output of an additive noisy channel, and it is known that the worst case is uniformly distributed noise. In other words, if $X_i$ is uniform for even one $i$, then $Z$ will be uniform. To see this, suppose that $X_1$ is uniformly distributed. Consider the mutual information $I(X_2,X_3,\dots,X_{10};Z)$, which can be written as $H(Z)-H(Z|X_2,\dots,X_{10})$. But we have: $$ H(Z|X_2,\dots,X_{10})=H(X_1|X_2,\dots,X_{10})=H(X_1) $$
where the first equality follows from the fact that, knowing $X_2,\dots,X_{10}$, the only uncertainty in $Z$ is due to $X_1$. The second equality holds because $X_1$ is independent of the others. Now note that:

  • Mutual information is nonnegative, so $H(Z)\geq H(X_1)$.
  • The entropy of $Z$ is at most the entropy of a uniformly distributed random variable over $[6]$, which equals $H(X_1)$: thus $H(Z)\leq H(X_1)$.
  • Combining the last two, $H(Z)=H(X_1)$, so $Z$ is uniformly distributed and the proof is complete.

Here too, a single fair die is enough. Moreover, the same proof works for an arbitrary set $[n]$: as long as one of the $X_i$'s is uniform, their finite sum modulo $n$ will be uniformly distributed.

On

Roll the die 9 times and add up the dots; call the total $x$. Roll the die one more time and add the number thrown to $x$, giving exactly one of the following totals: $x+1$, $x+2$, $x+3$, $x+4$, $x+5$, or $x+6$. Since these are six consecutive numbers, exactly one of them is divisible by six. Therefore the probability that the sum of ten rolls of a die is divisible by six is exactly $1/6$.