Multiplication of two random matrices over a finite field

Consider a matrix $\mathrm{X}$ sampled uniformly at random from the set of all rank-$r$ matrices in $\mathbb{F}_q^{m \times n}$, and a matrix $\mathrm{Y}$ sampled uniformly at random from the set of all full-rank (i.e., invertible) matrices in $\mathbb{F}_q^{n \times n}$. Let $\mathrm{Z} = \mathrm{X}\mathrm{Y}$.

I am trying to prove that $\mathrm{Z}$ is uniformly distributed over the set of all rank-$r$ matrices in $\mathbb{F}_q^{m \times n}$.

Assume $m > n$ and $r \leq n$.


This old Stack Exchange answer says that this is indeed the case, but it does not explain how the result is reached. Is this a standard result in random matrix theory, and if so, can someone provide a reference or a proof?
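For what it's worth, the claim can be checked by brute force in a small case. The sketch below (my own helper names, not from any reference) enumerates all rank-$1$ matrices $X \in \mathbb{F}_2^{3 \times 2}$ and all invertible $Y \in \mathbb{F}_2^{2 \times 2}$, and counts how often each product $Z = XY$ occurs:

```python
from itertools import product
from collections import Counter

def rank_f2(M):
    """Rank of a matrix over F_2, computed by Gaussian elimination."""
    rows, rank = [list(r) for r in M], 0
    for col in range(len(rows[0])):
        piv = next((i for i in range(rank, len(rows)) if rows[i][col]), None)
        if piv is None:
            continue
        rows[rank], rows[piv] = rows[piv], rows[rank]
        for i in range(len(rows)):
            if i != rank and rows[i][col]:
                rows[i] = [a ^ b for a, b in zip(rows[i], rows[rank])]
        rank += 1
    return rank

def mul_f2(A, B):
    """Matrix product over F_2."""
    return tuple(tuple(sum(a * b for a, b in zip(row, col)) % 2
                       for col in zip(*B)) for row in A)

def all_mats(rows, cols):
    """All rows x cols matrices over F_2, as tuples of row tuples."""
    return (tuple(map(tuple, M))
            for M in product(product((0, 1), repeat=cols), repeat=rows))

m, n, r = 3, 2, 1
Xs = [M for M in all_mats(m, n) if rank_f2(M) == r]   # all rank-r X
Ys = [M for M in all_mats(n, n) if rank_f2(M) == n]   # all invertible Y

counts = Counter(mul_f2(X, Y) for X in Xs for Y in Ys)
# if Z is uniform, every rank-r matrix must occur equally often
print(len(Xs), len(Ys), set(counts.values()))
```

Running this, every rank-$1$ product occurs the same number of times, consistent with uniformity of $\mathrm{Z}$.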


BEST ANSWER

The claim is relatively well known, and it holds in the more general case where $P(Y \text{ is invertible}) = 1$, i.e., $Y$ does not have to be uniform. We assume that $X$ and $Y$ are independent and that $P(X=x)=p$ for every matrix $x$ of rank $r$, where $p$ is the reciprocal of the number of such matrices.

A proof goes as follows. Let $z \in \mathbb{F}_q^{m\times n}$ be a matrix of rank $r$. Marginalizing over $Y$ gives $$ P(Z=z)=\sum_{y} P(Y=y,Z=z) = \sum_{y} P(Z=z \mid Y=y)\,P(Y=y), $$ where the sum ranges over all invertible matrices $y$. Since $y$ is invertible, $z=xy$ if and only if $x=zy^{-1}$, and hence $P(Z=z \mid Y=y) = P(X=zy^{-1} \mid Y=y)$. By the independence of $X$ and $Y$ we get $P(X=zy^{-1} \mid Y=y)=P(X=zy^{-1})=p$, because right multiplication by the invertible matrix $y^{-1}$ preserves rank, so $zy^{-1}$ is again a matrix of rank $r$. Consequently, $$ P(Z=z) = \sum_{y} p\,P(Y=y) =p\sum_{y} P(Y=y) = p, $$ which proves the claim.
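The key step above is that, for fixed invertible $y$, the map $x \mapsto xy$ is a rank-preserving bijection. This can be verified exhaustively in a small case; the sketch below (my own helper names, over $\mathbb{F}_2$ with $m=3$, $n=2$, $r=1$) checks that right multiplication by every invertible $y$ permutes the set of rank-$1$ matrices:

```python
from itertools import product

def rank_f2(M):
    """Rank of a matrix over F_2, computed by Gaussian elimination."""
    rows, rank = [list(r) for r in M], 0
    for col in range(len(rows[0])):
        piv = next((i for i in range(rank, len(rows)) if rows[i][col]), None)
        if piv is None:
            continue
        rows[rank], rows[piv] = rows[piv], rows[rank]
        for i in range(len(rows)):
            if i != rank and rows[i][col]:
                rows[i] = [a ^ b for a, b in zip(rows[i], rows[rank])]
        rank += 1
    return rank

def mul_f2(A, B):
    """Matrix product over F_2."""
    return tuple(tuple(sum(a * b for a, b in zip(row, col)) % 2
                       for col in zip(*B)) for row in A)

def all_mats(rows, cols):
    """All rows x cols matrices over F_2, as tuples of row tuples."""
    return (tuple(map(tuple, M))
            for M in product(product((0, 1), repeat=cols), repeat=rows))

rank1 = {M for M in all_mats(3, 2) if rank_f2(M) == 1}  # all rank-1 3x2 matrices
gl2 = [M for M in all_mats(2, 2) if rank_f2(M) == 2]    # GL_2(F_2)

# x -> xy maps the rank-1 set bijectively onto itself for every invertible y
is_permutation = all({mul_f2(x, y) for x in rank1} == rank1 for y in gl2)
print(is_permutation)
```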

ANSWER

We can generalize this to random variables and other group actions too:

Proposition. If $G$ is a finite group, $\Omega$ a finite set, $G\curvearrowright\Omega$ a group action, $g$ a $G$-valued random variable, and $\alpha$ a uniform $\Omega$-valued random variable independent of $g$, then $g\alpha$ is also a uniform $\Omega$-valued RV.

Proof. Write $\rho_G$ and $\rho_\Omega$ for the mass functions of $g$ and $\alpha$, so $\rho_\Omega(\beta)=\frac{1}{|\Omega|}$ for every $\beta\in\Omega$. Fix $\omega\in\Omega$. By independence, the probability that $g\alpha$ equals $\omega$ is the sum of all $\rho_G(h)\rho_\Omega(\beta)$ over pairs $(h,\beta)$ with $h\beta=\omega$. How many times does each $\rho_G(h)$ appear in this sum? For each group element $h\in G$ there is exactly one $\beta$ for which $h\beta=\omega$, namely $\beta=h^{-1}\omega$, so we sum $\rho_G(h)\frac{1}{|\Omega|}$ once for each $h\in G$. But $\sum_{h\in G}\rho_G(h)=1$, so this winds up simply being $1/|\Omega|$.
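As a sanity check of the proposition (a small sketch of my own, not part of the proof): take $G=S_3$ acting on $\Omega=\{0,1,2\}$, give $g$ a deliberately non-uniform law, keep $\alpha$ uniform and independent, and compute the law of $g\alpha$ exactly:

```python
from itertools import permutations
from fractions import Fraction
from collections import defaultdict

G = list(permutations(range(3)))     # S_3, acting on Omega = {0, 1, 2}
Omega = range(3)

# deliberately non-uniform mass function on G: weights 1..6 over 21
pG = {h: Fraction(i + 1, 21) for i, h in enumerate(G)}
pOmega = Fraction(1, 3)              # alpha is uniform on Omega

law = defaultdict(Fraction)
for h in G:
    for b in Omega:
        # independence: joint mass is the product of the marginals
        law[h[b]] += pG[h] * pOmega

print(dict(law))
# -> {0: Fraction(1, 3), 1: Fraction(1, 3), 2: Fraction(1, 3)}
```

Each point of $\Omega$ receives probability exactly $1/3$, as the proposition predicts, even though $g$ is far from uniform.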

The question's setting has $G=\mathrm{GL}_n(\mathbb{F}_q)$ and $\Omega$ the set of rank-$r$ $\,m\times n$ matrices over $\mathbb{F}_q$, although it uses a right group action instead of a left group action (the logic is the same).

For fun, you can also try a more long-winded proof using the orbit-stabilizer theorem, the fact that conjugate elements have conjugate stabilizers, the fact that "transporters" are cosets of stabilizers, etc.