I have often heard of a very counterintuitive problem in statistics:
Apparently, the average number of rolls (not the probability) of a fair die needed before seeing a 4 immediately followed by a 6 is different from the average number of rolls needed before seeing a 6 immediately followed by another 6.
This is despite the fact that every face of the die is equally likely on any single roll.
Apparently, this question can be answered by setting up a Markov chain, and it can be shown that the average number of rolls needed for a 4 followed by a 6 is 36, compared to 42 for a 6 followed by a 6.
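For what it's worth, here is a quick Monte Carlo sketch in Python that seems to back this up (the reading that "a 4 and 6" means a 4 immediately followed by a 6 is an assumption on my part, since that is the interpretation under which 36 and 42 come out):

```python
import random

# Monte Carlo check: average number of rolls of a fair die until the face
# `first` is immediately followed by the face `second`.
def rolls_until(first, second, rng):
    count = 0
    prev = None
    while True:
        roll = rng.randint(1, 6)
        count += 1
        if prev == first and roll == second:
            return count
        prev = roll

rng = random.Random(0)
trials = 200_000
for first, second in [(4, 6), (6, 6)]:
    avg = sum(rolls_until(first, second, rng) for _ in range(trials)) / trials
    print(f"{first} then {second}: average over {trials:,} trials = {avg:.2f}")
# The two averages should come out close to 36 and 42 respectively.
```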
Has anyone heard of this before? Is it true? Is there any additional intuition for why it holds, other than "the math just supports this conclusion"?
Thanks!
Yes, it is true, and the intuition is about how many ways you can keep your progress. While waiting for 4-then-6, once you have rolled a 4 there are two rolls that don't set you back: a 6 finishes the pattern, and another 4 keeps you "armed" for the next roll. While waiting for 6-then-6, once you have rolled a 6 there is only one such possibility: another 6 finishes the pattern, and anything else sends you all the way back to the start. That extra way of retaining progress is why the 4-6 pattern shows up sooner on average (36 rolls versus 42).
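To make this concrete, here is the first-step analysis behind those numbers (a minimal sketch: $a$ is the expected number of rolls starting from scratch, $b$ the expected number after just rolling the first needed face). For 6-then-6:

$$a = 1 + \tfrac{1}{6}b + \tfrac{5}{6}a, \qquad b = 1 + \tfrac{1}{6}\cdot 0 + \tfrac{5}{6}a,$$

which solves to $b = 36$ and $a = 42$. For 4-then-6:

$$a = 1 + \tfrac{1}{6}b + \tfrac{5}{6}a, \qquad b = 1 + \tfrac{1}{6}\cdot 0 + \tfrac{1}{6}b + \tfrac{4}{6}a,$$

which solves to $b = 30$ and $a = 36$. The extra $\tfrac{1}{6}b$ term in the second system is exactly the "another 4 keeps you armed" effect described above.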