Are there any practical implications of Infinite Monkey Theorem?


I've recently come across the Infinite Monkey Theorem, which loosely states that if you gave a monkey a typewriter and an infinite amount of time, it would almost surely type out any given finite text (say, a Shakespearean sonnet).

I understand also that the "monkey" is a placeholder for some notion of an abstract machine that continually produces random strings of symbols (say, letters and numbers).

What are the implications of this theorem? I find the theorem fascinating, but am really not sure how (if at all) it has contributed to other branches of mathematics, real world applications, or academia in general. Any insight would be appreciated!

3 Answers

Answer 1 (0 votes):

Suppose that the monkeys' typewriters were limited to the 26 letters of the English alphabet plus the space bar, for a total of 27 symbols, and for simplicity assume the distribution is uniform: all 27 keys have the same probability of being struck, namely $1/27$. Then the number of sequences of length $k$ that could be formed by typing at random is $27^k$.

If we set $k:=80$, which would be reasonable for a single line of monkey prose, then there are $27^{80}$ possible lines, all of them equally likely under our uniformity assumption. There are roughly $10^{97}$ particles in the universe, and $27^{80} > 10^{97}$, so there are more 80-character monkey lines than there are particles in the universe. Most of them won't even parse as English, let alone be Pulitzer-prize-winning prose. So, in that specific instance at least, not very practical.
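The counting claim is easy to verify with exact integer arithmetic; a quick Python check (not part of the original argument):

```python
# Number of possible 80-character lines over a 27-symbol alphabet,
# computed exactly with Python's arbitrary-precision integers.
lines = 27 ** 80
print(len(str(lines)))    # 115 decimal digits, i.e. lines is on the order of 10^114
print(lines > 10 ** 97)   # True: more lines than the ~1e97 particle estimate
```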

Nevertheless, my specific example is just that: an example. It doesn't settle the question of the monkeys' usefulness in general, so I'm curious to hear what others think.

Answer 2 (0 votes):

The "Infinite Monkey Theorem" is a thought experiment illustrating the Borel 0-1 law, i.e. the Borel-Cantelli lemma. This lemma says that for a sequence of events $(A_n)_n$ living in a single probability space, we have

i) If $\sum_{n} \mathbb{P}(A_n)< \infty$, then $\mathbb{P}(\limsup_n A_n)=0$

ii) If $\sum_{n} \mathbb{P}(A_n)= \infty$, and the events $A_n$ are independent, then $\mathbb{P}(\limsup_n A_n)=1$.

Here $\limsup_n A_n$ is just the event that infinitely many of the $A_n$ occur. Notice that the "infinite monkey" corresponds to events with $\mathbb{P}(A_n)=p>0$, and we assume independence as well, so it is a rather simple case of the dichotomy presented above. I don't think it is fair to ask whether such an example is itself useful.
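The monkey case of part ii) can be made concrete: each successive block of keystrokes matches a fixed target with the same probability $p>0$, independently, so the target appears after finitely many keystrokes almost surely. A minimal simulation sketch (toy alphabet and target, not from the original answer):

```python
import random

def keystrokes_until(target, alphabet="abc", seed=0):
    """Press uniformly random keys until `target` appears; return the keystroke count.

    Each disjoint block of len(target) keys matches the target with the same
    probability p > 0, independently of the other blocks, so by the second
    Borel-Cantelli lemma this loop terminates almost surely.
    """
    rng = random.Random(seed)  # seeded for reproducibility
    typed = []
    while True:
        typed.append(rng.choice(alphabet))
        if "".join(typed[-len(target):]) == target:
            return len(typed)

print(keystrokes_until("ab"))  # finite, and small in practice for short targets
```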

However, the Borel 0-1 law is itself a rather important result in probability theory. A professor of mine told me "this is one of the only 2 or 3 things you need to understand probability". In fact, there are many applications where we want to show that a property holds almost surely by translating it into a sequence of simpler events; we then examine $\sum_{n} \mathbb{P}(A_n)$ to conclude what we need. True, it is more common to use statement i) than ii) (often such simpler events are not independent), but I think it still falls under the same big picture.
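The quantity the lemma asks us to examine is just the sum $\sum_n \mathbb{P}(A_n)$; a small numerical sketch (example probabilities $p_n = 1/n^2$ and $p_n = 1/n$ are mine, not from the answer) shows the two regimes:

```python
import math

# Partial sums of the event probabilities, the quantity Borel-Cantelli inspects.
N = 10 ** 6
s_summable  = sum(1.0 / n ** 2 for n in range(1, N + 1))  # converges to pi^2/6
s_divergent = sum(1.0 / n for n in range(1, N + 1))       # grows like log N

# p_n = 1/n^2: finite sum, so only finitely many A_n occur a.s. (part i).
# p_n = 1/n with independence: divergent sum, so infinitely many occur a.s. (part ii).
print(s_summable, s_divergent)
```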

As for an example, I suggest the proof of the Kolmogorov continuity theorem, which is used to prove that Brownian motion has continuous sample paths almost surely. Brownian motion is one of the most important objects in probability theory and has a wide range of applications, from financial mathematics to physics.

Answer 3 (1 vote):

Given an alphabet of size $n$ and a text of $k$ characters, the probability of typing exactly this text in $k$ random keystrokes is:

$$\frac{1}{n^k}$$

If we allow $t$ independent attempts to write the text, the probability of succeeding at least once becomes

$$\frac{1}{n^k}\cdot\left(1+\left(1-\frac{1}{n^k}\right)+\left(1-\frac{1}{n^k}\right)^2+\dots+\left(1-\frac{1}{n^k}\right)^{t-1}\right)$$

and as $t\to\infty$, the probability tends to:

$$\frac{1}{n^k}\cdot\frac{1}{1-(1-\frac{1}{n^k})}$$

which equals $1$.
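The partial geometric series above telescopes to $1-(1-1/n^k)^t$, which is perhaps the more familiar form. Exact rational arithmetic confirms the two agree (a sketch with toy values $n=3$, $k=2$ of my choosing):

```python
from fractions import Fraction

def prob_success(n, k, t):
    # Closed form: 1 - (1 - 1/n^k)^t, the chance of at least one success in t attempts.
    p = Fraction(1, n ** k)
    return 1 - (1 - p) ** t

def geometric_form(n, k, t):
    # The partial geometric series from the answer, summed term by term.
    p = Fraction(1, n ** k)
    return p * sum((1 - p) ** i for i in range(t))

# The two expressions are identical, and the probability climbs toward 1 as t grows.
print(prob_success(3, 2, 50) == geometric_form(3, 2, 50))
print(float(prob_success(3, 2, 200)))
```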

This is sometimes offered as a tongue-in-cheek proof of Murphy's Law:

"Anything that can go wrong will go wrong."