Simple statement in the elementary proof of the Johnson-Lindenstrauss lemma (random projections)


In the simple proof of the Johnson–Lindenstrauss lemma by Sanjoy Dasgupta and Anupam Gupta that can be found here, they state the following (p. $62$):

Repeating this projection $O(n)$ times can boost the success probability to the desired constant, giving us the claimed randomized polynomial time algorithm.

My idea is this: the success probability of a single trial is $1/n$, so the probability that at least one of $n$ trials succeeds is $1 - (1 - 1/n)^n$. If we then require this to be larger than $0.95$, we end up with $0.05 > (1 - 1/n)^n$, i.e. $\log(0.05) > n\log(1 - 1/n)$, but I think this way of proving it is wrong.

I would really like to understand why the statement is true.

Could someone help me bring some clarity into this?
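To see numerically where this line of attack stalls, here is a small sketch of my own (assuming, as above, a per-trial success probability of $1/n$): with only $n$ trials, the failure probability $(1 - 1/n)^n$ tends to $1/e$, so the success probability never climbs past roughly $0.632$, and in particular never reaches $0.95$.

```python
import math

# With n trials at success probability 1/n each, the chance that all
# fail is (1 - 1/n)^n, which converges to 1/e as n grows. So the
# success probability 1 - (1 - 1/n)^n stays near 1 - 1/e ≈ 0.632
# and cannot be pushed toward 1 by increasing n alone.
for n in [10, 100, 10000]:
    success = 1 - (1 - 1 / n) ** n
    print(f"n={n}: success probability ≈ {success:.4f}")

print(f"limit: 1 - 1/e ≈ {1 - 1 / math.e:.4f}")
```

This is why the number of repetitions has to grow faster than a single batch of $n$ trials would suggest, which the answer's $cn$ repetitions accomplish.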


Recall that $$1+x \leq e^x$$ for all real $x$.

If we run $cn$ independent trials, where $c$ is some constant, the probability of at least one success is

$$1-(1-P(\text{success}))^{cn}\geq 1-\left(1-\frac{1}{n}\right)^{cn}\geq1-e^{-c},$$

where the last inequality follows from the stated bound with $x=-\frac{1}{n}$: it gives $1-\frac{1}{n}\leq e^{-1/n}$, and raising both sides to the power $cn$ yields $\left(1-\frac{1}{n}\right)^{cn}\leq e^{-c}$.

For $c$ large enough, the probability of at least one success can be made as close to $1$ as we like, and this requires only $cn = O(n)$ trials.
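A quick numerical check of this bound (my own sketch, again assuming a per-trial success probability of $1/n$): for several values of $n$ and $c$, the success probability after $cn$ trials, $1-(1-1/n)^{cn}$, is indeed at least $1-e^{-c}$, and growing $c$ drives it toward $1$.

```python
import math

# Verify 1 - (1 - 1/n)^(c*n) >= 1 - e^(-c) for a range of n and c.
# This holds because 1 - 1/n <= e^(-1/n), hence (1 - 1/n)^(c*n) <= e^(-c).
for n in [10, 100, 1000]:
    for c in [1, 2, 5]:
        p_success = 1 - (1 - 1 / n) ** (c * n)
        lower_bound = 1 - math.exp(-c)
        assert p_success >= lower_bound
        print(f"n={n:4d}, c={c}: P(at least one success) >= {p_success:.4f}"
              f" (bound {lower_bound:.4f})")
```

For example, $c=5$ already guarantees a success probability of at least $1-e^{-5}\approx 0.9933$, comfortably above $0.95$, using $5n = O(n)$ trials.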