A matrix is monochromatic if all of its entries are $0$ or all of its entries are $1$.
Prove that for every $n,k$ such that $n\geq k>2\log(n)+1$ (logarithms are base $2$), there exists an $n\times{n}$ matrix of $0$'s and $1$'s that does not contain a monochromatic $k\times{k}$ sub-matrix.
Fill an $n\times{n}$ matrix with $0$'s and $1$'s, choosing each entry independently and uniformly at random.
For each choice $m$ of $k$ rows and $k$ columns, let $A_m$ denote the event that the corresponding $k\times{k}$ sub-matrix is monochromatic.
And let $A=\cup A_m$ (the event that some monochromatic $k\times{k}$ sub-matrix exists).
We need to prove $P(A)<1$. Then we can conclude $P\big( \overline A \big)>0$, i.e. a matrix with no monochromatic $k\times{k}$ sub-matrix exists.
$P(A)=P(\cup A_m)\leq \sum P(A_m)$ by the union bound.
For $A_m$ to occur, all $k^2$ entries of the sub-matrix must be equal. Each entry is $0$ or $1$ with probability $\frac{1}{2}$, independently, and there are two possible common values. Hence:
$P(A_m)=2\times\left(\frac{1}{2}\right)^{k^2}=2^{1-k^2}$
Now let's count how many such sub-matrices exist. To pick a $k\times{k}$ sub-matrix of an $n\times{n}$ matrix, we choose $k$ rows and $k$ columns, so there are ${n\choose{k}}{n\choose {k}}$ such sub-matrices. So:
$$\sum P(A_m)=2\times(\frac{1}{2})^{k^2}{n\choose{k}}{n\choose {k}}$$ Now let's simplify that:
${n\choose{k}} = {\frac{n\times{(n-1)} \times \cdots \times{(n-k+1)}}{k!}} \leq n^k = 2^{k\log(n)}$
$\sum P(A_m)\leq 2 \times 2^{-k^2} \times 2^{2k\log(n)} = 2^{1-k^{2}+2k\log(n)}$
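As a sanity check with concrete values (my own choice, not part of the original question): take $n=1024$ and $k=22$, so $2\log(n)+1=21<k$. The exponent evaluates to $$1-k^{2}+2k\log(n)=1-484+2\cdot 22\cdot 10=-43,$$ giving $\sum P(A_m)\leq 2^{-43}<1$.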
Since the bound is a power of $2$, to prove $P(A)<1$ it suffices to show the exponent is negative, i.e. $1-k^{2}+2k\log(n)<0$: $$1-k^{2}+2k\log(n)<0$$ $$1+2k\log(n)<k^{2}$$ $$\frac{1}{k} + 2\log(n) < k$$ Since $\frac{1}{k}\leq 1$, this holds whenever $$1 + 2\log(n) < k,$$ which is exactly the hypothesis $k>2\log(n)+1$. Hence $P(A)<1$, so $P\big( \overline A \big)>0$ and the desired matrix exists.
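The union bound can also be checked numerically. Here is a minimal sketch (the helper name `union_bound` and the sample pairs $(n,k)$ are my own choices for illustration, not from the question):

```python
from math import comb, log2

def union_bound(n: int, k: int) -> float:
    # Exact union bound from the proof: 2 * (1/2)^(k^2) * C(n,k)^2
    return 2 * comb(n, k) ** 2 / 2 ** (k * k)

# Sample pairs with n >= k > 2*log2(n) + 1, as in the statement.
for n, k in [(64, 14), (1024, 22), (10**6, 41)]:
    assert n >= k > 2 * log2(n) + 1
    # The bound is below 1, so P(A) < 1 and a good matrix exists.
    assert union_bound(n, k) < 1
```

Note that the bound is uninformative when $k$ is too small: for example `union_bound(8, 2)` exceeds $1$, which is why the hypothesis $k>2\log(n)+1$ is needed.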