Johnson-Lindenstrauss lemma for a matrix with positive entries


I read about the Johnson-Lindenstrauss lemma, which (roughly) states that if $x \in \mathbb{R}^n$ and $A \in \mathbb{R}^{k \times n}$ is a suitable random matrix with $k \ll n$, then with high probability $\|Ax\|_2 \approx \|x\|_2$, up to a fixed normalization.

The proof goes by assuming that the entries $A_{i,j}$ are i.i.d. random variables with mean $0$ and variance $1$. Writing $A_i$ for the $i$-th row of $A$, $$\mathbb{E}[\|Ax\|_2^2] = \sum_{i=1}^k \mathbb{E}[\langle A_i, x\rangle^2] = k\,\mathbb{E}\Big[\sum_{j=1}^n A_{1,j}^2 x_j^2\Big] + k\,\mathbb{E}\Big[\sum_{j \neq l} A_{1,j} x_j A_{1,l} x_l\Big],$$ and since the entries are independent with mean $0$, the second term vanishes and the expectation equals $k\|x\|_2^2$, i.e. the norm of $x$ up to normalization. Then we can use standard concentration inequalities to obtain the result (and get precise bounds on $k$).
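As a quick Monte Carlo sanity check of the identity $\mathbb{E}[\|Ax\|_2^2] = k\|x\|_2^2$ (a sketch with made-up sizes $n$, $k$ and a Gaussian choice of entries, not part of the lemma's proof):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, trials = 50, 20, 10000  # illustrative dimensions, not from the lemma

# A fixed test vector x and its squared norm.
x = rng.standard_normal(n)
norm_sq = np.dot(x, x)

# Average ||Ax||^2 over many i.i.d. standard Gaussian k x n matrices A
# (mean-0, variance-1 entries, as in the proof sketch above).
est = np.mean([np.linalg.norm(rng.standard_normal((k, n)) @ x) ** 2
               for _ in range(trials)])

# The ratio est / norm_sq should concentrate around k.
print(est / norm_sq)
```

Dividing $A$ by $\sqrt{k}$ turns this into an unbiased estimator of $\|x\|_2^2$ itself, which is the normalization usually used in statements of the lemma.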

I was wondering whether there is any way to make the same thing work with a matrix whose entries are all positive, so that the expectation of $A_{i,j}$ is not $0$. For example, take a $\{0,1\}$-valued matrix with $\mathbb{E}[A_{i,j}] = \mu$ for some small $\mu$. Is there any chance to make it work? The computation I did gives the following ($A_1$ is the first row of $A$; since the rows are i.i.d. it doesn't matter which row we take):

$$\mathbb{E}[\langle A_1, x\rangle^2] = \sum_{i=1}^n \mathbb{E}[A_{1,i}^2]x_i^2 + \sum_{i \neq j}\mathbb{E}[A_{1,i}A_{1,j}]x_ix_j = \mu\|x\|_2^2 + \mu^2\sum_{i\neq j}x_ix_j.$$
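This computation can also be checked numerically (a sketch with hypothetical parameters $n$, $\mu$; for $\{0,1\}$ entries, $\mathbb{E}[A_{1,i}^2] = \mathbb{E}[A_{1,i}] = \mu$):

```python
import numpy as np

rng = np.random.default_rng(1)
n, mu, trials = 30, 0.1, 200000  # illustrative parameters

x = rng.random(n)  # a fixed nonnegative test vector

# Many i.i.d. copies of the row A_1 with Bernoulli(mu) entries.
rows = (rng.random((trials, n)) < mu).astype(float)
empirical = np.mean((rows @ x) ** 2)

# Predicted value: mu * ||x||_2^2 + mu^2 * sum_{i != j} x_i x_j,
# using sum_{i != j} x_i x_j = (sum_i x_i)^2 - sum_i x_i^2.
s1, s2 = x.sum(), np.dot(x, x)
predicted = mu * s2 + mu**2 * (s1**2 - s2)

print(empirical, predicted)
```

The two printed values agree up to Monte Carlo error, confirming that the cross term $\mu^2\sum_{i\neq j}x_ix_j$ does not vanish.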

So the second term doesn't cancel out, and in general I think it can even be large (larger than $\|x\|_2^2$), so the expectation is "wrong". Is there any way to fix this and obtain a similar result with a zero-one matrix?
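To see that the cross term really can dominate (an illustrative example with hypothetical values, not from the post): for $x = (1,\dots,1)$ we have $\|x\|_2^2 = n$ while $\mu^2\sum_{i\neq j}x_ix_j = \mu^2 n(n-1)$, which is larger whenever $\mu^2(n-1) > 1$:

```python
import numpy as np

n, mu = 1000, 0.1      # hypothetical values with mu^2 * (n - 1) > 1
x = np.ones(n)

norm_sq = np.dot(x, x)                    # ||x||_2^2 = n
cross = mu**2 * (x.sum()**2 - norm_sq)    # mu^2 * sum_{i != j} x_i x_j

# The ratio is mu^2 * (n - 1), so the cross term dwarfs ||x||_2^2 here.
print(cross / norm_sq)
```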