A common illustration of the nature of infinity is that, given an infinite amount of time, a monkey on a typewriter will, with probability $1$, produce the complete works of Shakespeare.
Consider now a (very clever) monkey that is able to choose, with uniform probability, real numbers from some interval, say $(0,1)$. Given an infinite amount of time, is it correct to state that the monkey will eventually choose any given number (e.g. $\sqrt{2}/2$) with probability $1$?
Does the fact that the cardinality of $\mathbb{R}$ is larger than that of $\mathbb{N}$ come into the equation?
I have tried to understand it as follows. Consider the probability that the monkey chooses a number in $(\sqrt{2}/2-\epsilon/2,\, \sqrt{2}/2+\epsilon/2)$; this interval has length $\epsilon$, so the probability is $\epsilon$. The probability that in $N$ independent trials the monkey has not chosen a number in this interval is $(1-\epsilon)^N$. Does the question therefore boil down to evaluating:
$$\lim_{\substack{N\rightarrow\infty\\ \epsilon\rightarrow 0}}(1-\epsilon)^N$$
Does this limit exist? Is it $0$?
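As a quick numerical sanity check of this setup (a Python sketch; the seed, the value of $\epsilon$, and the trial count are just illustrative choices):

```python
import random

random.seed(0)               # fixed seed for reproducibility
target = 2 ** 0.5 / 2        # the number we "aim" at, sqrt(2)/2
eps = 0.01                   # width of the interval around the target
N = 100_000                  # number of uniform draws from (0, 1)

# Count how many draws land in (target - eps/2, target + eps/2).
hits = sum(1 for _ in range(N) if abs(random.random() - target) < eps / 2)

# The empirical hit frequency should be close to the interval length eps,
# and the chance of *no* hit in n trials is (1 - eps)**n.
print(hits / N)          # roughly 0.01
print((1 - eps) ** 500)  # roughly exp(-5) ≈ 0.0067
```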
Follow up question: what if the monkey is now free to choose from an unbounded interval, like $\mathbb{R}$?
Presumably the monkey chooses only countably many numbers. You can adapt the proof that the measure of $\Bbb Q$ is zero to show that the chance that any given real is ever chosen is still $0$. Choosing from an unbounded interval doesn't change anything; to see that, take your favorite bijection between $(0,1)$ and $\Bbb R$.
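One way to see it directly (a sketch via countable subadditivity, the same tool used in the measure-zero proof for $\Bbb Q$): if $X_1, X_2, \dots$ are the monkey's draws and $y$ is any fixed real, then

$$\Pr\big(\exists\, n : X_n = y\big) \;\le\; \sum_{n=1}^{\infty} \Pr(X_n = y) \;=\; \sum_{n=1}^{\infty} 0 \;=\; 0.$$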
Specifically for your limit: you need to specify how the two limits are taken. If you let $N \to \infty$ first, the limit is $0$. If you let $\epsilon \to 0$ first, the limit is $1$. If you set $\epsilon=\frac 1N$ and let them go together, you get $\frac 1e$, since $(1-\frac 1N)^N \to e^{-1}$. More generally, $\epsilon = \frac cN$ gives $e^{-c}$, so you can get anything on $[0,1]$ you like.
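The path-dependence is easy to check numerically (a Python sketch; the particular values of $N$, $\epsilon$, and $c$ are arbitrary illustrations):

```python
import math

# N -> infinity first, eps held fixed: (1 - eps)^N -> 0.
print((1 - 1e-6) ** 10**8)       # astronomically small, about exp(-100)

# eps -> 0 first, N held fixed: (1 - eps)^N -> 1.
print((1 - 1e-12) ** 1000)       # essentially 1

# Coupled path eps = 1/N: (1 - 1/N)^N -> 1/e.
N = 10**6
print((1 - 1 / N) ** N, 1 / math.e)

# More generally eps = c/N gives e^(-c), so any value in (0, 1] is attainable.
c = 2.0
print((1 - c / N) ** N, math.exp(-c))
```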