I have been looking through the material the lecturer gave me on the subject of Stochastic Processes. While working through some of the problems, I stumbled upon a question (well, actually two subquestions) whose difference I don't see; however, I think the difference is certainly there, and I am just missing it.
The task says:
Suppose we have the following infinite sequence of choices (something like an inverse of a shell game), where at each step you have to choose $1$ of $3$ shells, under which there may be either a red or a black pearl. There is exactly one red pearl among the three shells. If at some trial you withdraw the red pearl, the game stops.
(Now, to the part I don't really get.)
- What is the probability that you will never withdraw the red pearl, if you choose a shell randomly at every step?
I may be mistaken, but here we just have to compute the probability of withdrawing a black pearl, which is $\frac{2}{3}$ at each trial, and hence, once we try this infinitely many times, the probability of never withdrawing the red pearl should be $$\lim\limits_{n \to \infty}\left(\frac{2}{3}\right)^n = 0.$$ (I may be mistaken, but anyway.)
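As a quick sanity check of that limit (purely illustrative, not part of the original problem), one can both evaluate $(2/3)^n$ for growing $n$ and run a small Monte Carlo simulation of a truncated game. The helper name `never_red_in` below is made up for the sketch:

```python
import random

# The chance of drawing a black pearl on one trial is 2/3, so the
# chance of n black draws in a row is (2/3)^n, which vanishes fast.
for n in (10, 100, 1000):
    print(n, (2 / 3) ** n)

# Monte Carlo sketch: estimate the probability that the first
# `steps` draws are all black (a truncation of the infinite game).
# `never_red_in` is an illustrative helper, not from the question.
def never_red_in(steps, rng):
    return all(rng.random() < 2 / 3 for _ in range(steps))

rng = random.Random(0)  # fixed seed so the run is reproducible
trials = 100_000
estimate = sum(never_red_in(10, rng) for _ in range(trials)) / trials
print(estimate, (2 / 3) ** 10)  # estimate should be near (2/3)^10 ≈ 0.0173
```

The simulation can only ever check a finite truncation, of course; the point of the question is precisely what happens in the limit.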
- What is the probability that there is at least one eternal game in which you never withdraw the red pearl?
To me this question seems pretty much the same as the first one, but I don't think the author would have separated the two questions for no reason.
Can anyone please explain to me what the difference between them is, and if they are different, in what way, and how I should proceed?
Thanks a lot in advance!
UPD: I forgot to mention that you have no way of knowing under which shell the red pearl lies, and each game's pearl arrangement is independent of the previous one.
"What is the probability that there is at least one such eternal game, so that you do not withdraw the red pearl at all?"
This sentence doesn't quite make sense as stated. We don't talk about the probability of mathematical statements; we talk about the probabilities of events in the sample space.
The claim that "there exists at least one eternal game in which you never withdraw the red pearl" is true: it is witnessed by the outcome (Black, Black, Black, Black, ...). However, the probability measure of that event is zero.
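To make the "measure zero" part precise (a standard continuity-of-measure step, spelled out here for completeness): let $A_n$ be the event that the first $n$ draws are all black. The events $A_n$ are decreasing, so

$$P(\text{all draws black}) = P\left(\bigcap_{n=1}^{\infty} A_n\right) = \lim_{n\to\infty} P(A_n) = \lim_{n\to\infty}\left(\frac{2}{3}\right)^n = 0.$$

So the all-black outcome exists as a point of the sample space, yet the event consisting of it has probability zero.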
So, if the author was asking whether the claim is true or not, you could say it is "100% true", so to speak.