Suppose $M_n$ is an $n\times n$ matrix, with independent entries $$M_n(i,j) $$ being $0,1$ Bernoulli random variables, each with $$\mathbb{P}(M_n(i,j)=1)=\dfrac{\ln n}{n} $$
How do we show that as $n\to \infty$, the probability that at least one column is all zero is bounded away from $0$ (i.e., this event occurs with positive probability)?
We can use the inclusion-exclusion principle to get the series $$\lim_{n\to \infty}\sum_{i=1}^{n}(-1)^{i-1}\binom{n}{i}\left(1-\dfrac{\ln n}{n}\right)^{ni},$$ and, using Mathematica, the first term in the series satisfies $$n\left(1-\dfrac{\ln n}{n}\right)^n\to 1.$$ We can probably use the alternating series test to reach the conclusion. But I have a hard time computing the above limit (the limit of the first term), and the limits of the other terms in the series, by hand.
Any help will be appreciated.
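As a quick numerical sanity check (not a proof), one can evaluate the first inclusion-exclusion term directly; the sketch below, in Python, just plugs in increasing values of $n$:

```python
import math

def first_term(n):
    """First inclusion-exclusion term: n * (1 - ln(n)/n)^n."""
    # (1 - ln(n)/n)^n is the probability that one fixed column is all zero
    p_zero_column = (1 - math.log(n) / n) ** n
    return n * p_zero_column

# The values creep up toward 1 as n grows (convergence is slow,
# since the relative error behaves like exp(-ln^2(n) / (2n))).
for n in (10**3, 10**4, 10**5, 10**6):
    print(n, first_term(n))
```

The slow convergence visible here matches the $e^{o(1)}$ correction factor derived in the answer below.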
Fix any column. The probability that it is all zero is exactly $$ \left(1-\frac{\ln n}{n}\right)^n $$ (each of its $n$ entries is independently $0$ with probability $1-\frac{\ln n}{n}$), and so the probability that it is not all zero is exactly $$ p \stackrel{\rm def}{=} 1- \left(1-\frac{\ln n}{n}\right)^n. $$ Since the $n$ columns are independent, the probability that no column is all zero is exactly $$ p^n = \left(1- \left(1-\frac{\ln n}{n}\right)^n\right)^n. $$ You want to bound that away from $1$.

This looks unwieldy, but you can do a Taylor expansion to compute the limit. As $n\to\infty$, $$ \left(1-\frac{\ln n}{n}\right)^n = e^{n \ln\left(1-\frac{\ln n}{n}\right)} = e^{n\left(-\frac{\ln n}{n}+O\left(\frac{\ln^2 n}{n^2}\right)\right)} = e^{-\ln n+o(1)} = \frac{e^{o(1)}}{n} = \frac{1+o(1)}{n}, $$ and so $$ p^n = \left(1- \frac{1+o(1)}{n}\right)^n = e^{n\ln\left(1- \frac{1+o(1)}{n}\right)} = e^{n\left(-\frac{1+o(1)}{n}+O\left(\frac{1}{n^2}\right)\right)} = e^{-1+o(1)} \xrightarrow[n\to\infty]{} e^{-1} < 1. $$

Therefore, the probability that at least one column is all zero, $1-p^n$, converges to $1-e^{-1}>0$, and in particular is bounded away from zero as $n\to\infty$.
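To see the limit $e^{-1}$ emerge concretely, here is a short numerical sketch (an illustration, not part of the argument) that evaluates $p^n$ for a few values of $n$ and compares it with $e^{-1}$:

```python
import math

def prob_no_zero_column(n):
    """Exact probability that no column of M_n is all zero."""
    # q = P(a fixed column is all zero) = (1 - ln(n)/n)^n
    q = (1 - math.log(n) / n) ** n
    # Columns are independent, so (1 - q)^n = P(no all-zero column)
    return (1 - q) ** n

for n in (10**3, 10**4, 10**5, 10**6):
    print(n, prob_no_zero_column(n), "limit:", math.exp(-1))
```

The printed values approach $e^{-1}\approx 0.3679$, so the probability of at least one all-zero column, $1-p^n$, approaches $1-e^{-1}\approx 0.6321$, consistent with the asymptotic computation above.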