A negative correlation property in a random matrix


I am trying to prove the following negative correlation property, where neither the FKG nor the BK inequality applies. Any input or idea is much appreciated:

Suppose each row of an $n\times n$ matrix is filled with a permutation of $\{1,\dots,n\}$ drawn independently and uniformly at random. Fix integers $d,k$ with $1\leq d < k\leq n$.

We say that event $E_i$ occurs if all of the first $d$ appearances of the number $i$ in the matrix that occur in columns larger than $d$ also occur in rows larger than $k$ (where by "first" we mean smallest column index). The goal is to prove the following negative correlation property for any $1\leq m\leq n$: $$\Pr\{E_1\wedge\ldots\wedge E_m\}\leq \prod_{i=1}^m\Pr\{E_i\}$$

(There is a special case in the definition of $E_i$ that can be handled in two ways. If number $i$ has fewer than $d$ appearances after column $d$, then the condition need only hold for those appearances. Alternatively, one can exclude such matrices from $E_i$.)
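For what it's worth, the inequality can be checked exactly for tiny parameters by enumerating all $(n!)^n$ matrices. The Python sketch below does this; it uses the first variant of the special case (the condition is required only on the appearances that exist), and it breaks ties between equal column indices by row index, which is an assumption since the statement above leaves that case unspecified. The function names are mine, not from any reference.

```python
from itertools import permutations, product
from fractions import Fraction
from math import prod

def event_holds(M, i, d, k):
    """E_i: among the appearances of i in columns > d (1-indexed),
    the d with smallest column index all lie in rows > k.
    Ties in column index are broken by row index (an assumption).
    If fewer than d such appearances exist, the condition is
    required only on those (the first variant of the edge case)."""
    pos = sorted((c, r) for r, row in enumerate(M, 1)
                        for c, x in enumerate(row, 1)
                        if x == i and c > d)
    return all(r > k for c, r in pos[:d])

def exact_probabilities(n, d, k, m):
    """Enumerate all (n!)^n matrices whose rows are permutations of
    1..n; return Pr[E_1 & ... & E_m] and the list of marginals
    Pr[E_i] as exact fractions. Feasible only for very small n."""
    total = joint = 0
    marg = [0] * m
    for M in product(permutations(range(1, n + 1)), repeat=n):
        total += 1
        holds = [event_holds(M, i, d, k) for i in range(1, m + 1)]
        joint += all(holds)
        for j, h in enumerate(holds):
            marg[j] += h
    return Fraction(joint, total), [Fraction(c, total) for c in marg]

# Smallest admissible parameters with d < k <= n:
p_joint, margs = exact_probabilities(n=3, d=1, k=2, m=2)
print(p_joint, prod(margs), p_joint <= prod(margs))
```

By symmetry under relabelling the values $1,\dots,n$, the marginals $\Pr\{E_i\}$ all coincide, which the enumeration confirms; of course this check only probes small cases and proves nothing in general.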

Thanks a lot!