Fix $k>0$ and let $X, Y$ be two vertex sets of size $n$, a positive integer (we are interested in the limit $n\to \infty$). Define a random bipartite graph on $X \sqcup Y$ in an Erdős–Rényi fashion by putting an edge between each pair $(x, y)$ with $x\in X$, $y\in Y$ independently with probability $\frac{k}{n}$. Define the discrepancy of the resulting bipartite graph as the maximum of $$\left|\frac{E(A,B)}{kn}-\frac{|A||B|}{n^2}\right|$$ over all subsets $A \subset X$, $B\subset Y$, where $E(A, B)$ is the number of edges between vertices in $A$ and vertices in $B$. (Note that $kn$ is the expected total number of edges.)
Is it true that, for every $\epsilon>0$, the probability that the discrepancy is greater than $\epsilon$ goes to zero as $n \to \infty$?
In other words, do all pairs of subsets have roughly the expected number of edges between them with high probability?
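For small $n$, the claim can be checked empirically by brute force over all $2^n \times 2^n$ pairs $(A, B)$. A minimal sketch (the function name `bipartite_discrepancy` and the parameter choices are mine, for illustration only):

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

def bipartite_discrepancy(n, k):
    """Sample the random bipartite graph with edge probability k/n and
    compute its discrepancy by brute force over all 2^n * 2^n pairs
    (A, B). Only feasible for small n."""
    adj = rng.random((n, n)) < k / n          # adj[x, y] = True iff x ~ y
    # All 2^n boolean masks, i.e. all subsets of an n-element vertex set.
    masks = [np.array(m) for m in itertools.product([False, True], repeat=n)]
    best = 0.0
    for A in masks:
        col_sums = adj[A].sum(axis=0)         # edges from A to each y in Y
        a = A.sum()
        for B in masks:
            e = col_sums[B].sum()             # E(A, B)
            best = max(best, abs(e / (k * n) - a * B.sum() / n**2))
    return best

# The discrepancy should tend to shrink as n grows, for fixed k:
for n in (4, 8):
    print(n, bipartite_discrepancy(n, 2))
```

Of course this only probes tiny $n$ and says nothing about the limit; it just makes the definition concrete.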
In case the statement is true, I'm also interested in the natural generalization to $r$-partite $r$-uniform hypergraphs. That is, fix vertex sets $X_1, \ldots, X_r$, each of size $n$, and put an edge on $\{x_1, \ldots, x_r\}$ for each $x_1 \in X_1, \ldots, x_r\in X_r$ independently with probability $\frac{k}{n^{r-1}}$. Define the discrepancy as the maximum of $$\left|\frac{E(A_1,\ldots, A_r)}{kn}-\frac{|A_1|\cdots|A_r|}{n^r}\right|$$ over all $r$-tuples of subsets $(A_1, \ldots, A_r)$ with $A_i \subset X_i$, where $E(A_1, \ldots, A_r)$ is the number of edges with one vertex in each $A_i$. Is it true that, for every $\epsilon>0$, the probability that the discrepancy is greater than $\epsilon$ goes to zero as $n \to \infty$ in the hypergraph case?
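In the hypergraph case, exhaustive enumeration over $2^{rn}$ tuples is hopeless even for modest $n$, but testing random subset tuples gives a cheap Monte Carlo *lower bound* on the discrepancy. A sketch under my own naming and parameter choices:

```python
import numpy as np

rng = np.random.default_rng(1)

def sampled_hypergraph_discrepancy(n, k, r=3, trials=2000):
    """Sample the random r-partite r-uniform hypergraph with edge
    probability k / n^(r-1), then bound its discrepancy from below by
    testing `trials` uniformly random subset tuples (A_1, ..., A_r)."""
    # tensor[x1, ..., xr] = True iff (x1, ..., xr) is an edge
    tensor = rng.random((n,) * r) < k / n ** (r - 1)
    best = 0.0
    for _ in range(trials):
        masks = rng.random((r, n)) < 0.5      # one random subset A_i per part
        sub = tensor
        for axis in range(r):                 # restrict axis i to A_i
            sub = np.compress(masks[axis], sub, axis=axis)
        e = sub.sum()                         # E(A_1, ..., A_r)
        sizes = masks.sum(axis=1).prod()      # |A_1| * ... * |A_r|
        best = max(best, abs(e / (k * n) - sizes / n ** r))
    return best

print(sampled_hypergraph_discrepancy(30, 3))
```

Since only random tuples are tried, this never certifies that the discrepancy is small; it can only exhibit that it is at least some value.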
Edit: I'm pretty sure that any approach that simply tries to bound $\mathbb{P}\left(\left|\frac{E(A,B)}{kn}-\frac{|A||B|}{n^2}\right|>\epsilon\right)$ independently for each pair $A,B$ and then uses the union bound over all possible pairs $A, B$ cannot work. Indeed, since there are $2^n \cdot 2^n = 2^{2n}$ such pairs, for this method to work we would need $\mathbb{P}\left(\left|\frac{E(A,B)}{kn}-\frac{|A||B|}{n^2}\right|>\epsilon\right)$ to decrease at least as fast as $2^{-2n}$. On the other hand, using the second-to-last inequality here for the lower tail of the binomial distribution, we find that $$\mathbb{P}\left(\left|\frac{E(A,B)}{kn}-\frac{|A||B|}{n^2}\right|>\epsilon\right)\ge \frac{1}{\sqrt{2N}}\exp\left(-ND\left(p-\frac{pn^2\epsilon}{N}\parallel p\right)\right)$$ where $p=\frac{k}{n}$, $N=|A||B|$ and $D(\cdot\parallel\cdot)$ is the Kullback–Leibler divergence. We can restrict to the pairs satisfying $|A|,|B|\ge \frac{n}{2}$ (these make up roughly one fourth of all pairs $A, B$), in which case $\frac{n^2}{4}\le N \le n^2$ and we get a lower bound of
$$\mathbb{P}\left(\left|\frac{E(A,B)}{kn}-\frac{|A||B|}{n^2}\right|>\epsilon\right)\ge\frac{1}{\sqrt{2n^2}}\exp(-n^2D(p-4p\epsilon\parallel p))$$
Now, using that $\frac{d^2}{dx^2}D(x\parallel p)=\frac{1}{x(1-x)}$ and that $D(\cdot\parallel p)$ and its first derivative vanish at $p$, a Taylor expansion easily gives the rough estimate $D(p-x \parallel p)\le \frac{1}{2}\frac{1}{\frac{p}{2}(1-\frac{p}{2})}x^2\le \frac{2x^2}{p}$ for $x \in [0,\frac{p}{2}]$. So, assuming $\epsilon \le \frac{1}{8}$ (which ensures $4p\epsilon \le \frac{p}{2}$), $$D(p-4p\epsilon\parallel p)\le 32p\epsilon^2,$$ and therefore $$\mathbb{P}\left(\left|\frac{E(A,B)}{kn}-\frac{|A||B|}{n^2}\right|>\epsilon\right)\ge \frac{1}{\sqrt{2n^2}}\exp(-32p\epsilon^2n^2)=\frac{1}{\sqrt{2n^2}}\exp(-32k\epsilon^2n).$$ For $\epsilon$ small enough (namely $32k\epsilon^2 < 2\ln 2$), this decreases slower than $2^{-2n}$, so the naive approach cannot work. This means that, if the statement is true, it requires a more clever idea than just the union bound over all pairs $A, B$.
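The two numerical ingredients of this argument, the quadratic bound $D(p-x\parallel p)\le \frac{2x^2}{p}$ on $[0,\frac{p}{2}]$ and the comparison of $\exp(-32k\epsilon^2 n)$ against $2^{-2n}$, are easy to sanity-check; the specific values of $p$, $k$, $\epsilon$, $n$ below are arbitrary choices of mine:

```python
import math

def kl(x, p):
    """Kullback-Leibler divergence D(x || p) of Bernoulli(x) from Bernoulli(p)."""
    return x * math.log(x / p) + (1 - x) * math.log((1 - x) / (1 - p))

# Check the quadratic bound D(p - x || p) <= 2 x^2 / p on a grid of x in (0, p/2]:
p = 0.05
for i in range(1, 101):
    x = (p / 2) * i / 100
    assert kl(p - x, p) <= 2 * x**2 / p

# For small epsilon, exp(-32 k eps^2 n) decays slower than 2^{-2n}
# (here 32 * k * eps^2 = 0.24 < 2 ln 2 ≈ 1.386):
k, eps, n = 3, 0.05, 200
print(math.exp(-32 * k * eps**2 * n) > 4.0 ** (-n))       # → True
```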