Is the pdf of a matrix normal distribution equal to the pdf of its vectorization?


On Wikipedia, it says that a random matrix $\mathbf{X}$ follows a matrix normal distribution,

$$ \mathbf{X} \sim \mathcal{MN}_{n\times p}(\mathbf{M}, \mathbf{U}, \mathbf{V}), $$ if and only if $$ \mathrm{vec}(\mathbf{X}) \sim \mathcal{N}_{np}(\mathrm{vec}(\mathbf{M}), \mathbf{V} \otimes \mathbf{U}). $$

My question is: does $p(\mathrm{vec}(\mathbf{X})) = p(\mathbf{X})$ always hold? Intuitively I want to say yes, but I can't find a way to prove it mathematically.
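As a sanity check (not a proof), the two densities can be compared numerically with SciPy, which implements both distributions. Note that `vec` here is the column-stacking vectorization, i.e. `flatten(order="F")` in NumPy, which is what makes the covariance $\mathbf{V} \otimes \mathbf{U}$ (and not $\mathbf{U} \otimes \mathbf{V}$) correct:

```python
import numpy as np
from scipy.stats import matrix_normal, multivariate_normal

rng = np.random.default_rng(0)
n, p = 3, 2
M = rng.standard_normal((n, p))                 # mean matrix
A = rng.standard_normal((n, n))
U = A @ A.T + n * np.eye(n)                     # SPD row covariance (n x n)
B = rng.standard_normal((p, p))
V = B @ B.T + p * np.eye(p)                     # SPD column covariance (p x p)
X = rng.standard_normal((n, p))                 # an arbitrary evaluation point

def vec(Z):
    return Z.flatten(order="F")                 # column-stacking vectorization

pdf_matrix = matrix_normal(mean=M, rowcov=U, colcov=V).pdf(X)
pdf_vector = multivariate_normal(mean=vec(M), cov=np.kron(V, U)).pdf(vec(X))
print(np.isclose(pdf_matrix, pdf_vector))       # True: the densities coincide
```

The agreement follows from the identities $|\mathbf{V} \otimes \mathbf{U}| = |\mathbf{V}|^n |\mathbf{U}|^p$ and $\mathrm{vec}(\mathbf{X}-\mathbf{M})^\top (\mathbf{V} \otimes \mathbf{U})^{-1} \mathrm{vec}(\mathbf{X}-\mathbf{M}) = \mathrm{tr}\!\left[\mathbf{V}^{-1}(\mathbf{X}-\mathbf{M})^\top \mathbf{U}^{-1}(\mathbf{X}-\mathbf{M})\right]$, which turn one density into the other term by term.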

Thanks

Context (I've switched $\mathbf{X}$ to $\mathbf{W}$): In a machine learning task, I want to compute a posterior from a prior (a matrix normal distribution) and a likelihood (a product of many multivariate normal densities). I suspect that computing $p(\mathrm{vec}(\mathbf{W}))\,p(\mathbf{T}\mid\mathbf{W})$ will be easier than computing $p(\mathbf{W})\,p(\mathbf{T}\mid\mathbf{W})$ when obtaining $p(\mathbf{W}\mid\mathbf{T})$, the distribution over the model parameters conditioned on the data.
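To illustrate why the vectorized form is convenient here, below is a sketch of the conjugate update, under assumptions the post does not specify: a linear-Gaussian likelihood $\mathbf{t}_i = \mathbf{W}^\top \mathbf{x}_i + \boldsymbol\varepsilon_i$ with known noise variance $\sigma^2$, and an identity-covariance matrix-normal prior. Under column stacking, $\mathrm{vec}(\mathbf{T}) = (\mathbf{I}_p \otimes \mathbf{X})\,\mathrm{vec}(\mathbf{W}) + \text{noise}$, so the posterior over $\mathrm{vec}(\mathbf{W})$ is an ordinary Gaussian update:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, N = 4, 2, 50          # input dim, output dim, number of data points
sigma2 = 0.1                # assumed known observation noise variance

# Hypothetical data: t_i = W_true^T x_i + noise, stacked as rows of T
W_true = rng.standard_normal((n, p))
X = rng.standard_normal((N, n))
T = X @ W_true + np.sqrt(sigma2) * rng.standard_normal((N, p))

# Matrix-normal prior MN(M0, U0, V0) on W, used via vec(W) ~ N(vec(M0), V0 kron U0)
M0 = np.zeros((n, p))
U0, V0 = np.eye(n), np.eye(p)
m0 = M0.flatten(order="F")                  # column-stacking vec
S0 = np.kron(V0, U0)                        # prior covariance of vec(W)

# vec(T) = (I_p kron X) vec(W) + noise under column stacking
Phi = np.kron(np.eye(p), X)
S0_inv = np.linalg.inv(S0)
SN = np.linalg.inv(S0_inv + Phi.T @ Phi / sigma2)        # posterior covariance
mN = SN @ (S0_inv @ m0 + Phi.T @ T.flatten(order="F") / sigma2)

W_post_mean = mN.reshape((n, p), order="F") # approximately recovers W_true
```

This is essentially ridge regression applied column-wise; the point is only that once the prior is expressed over $\mathrm{vec}(\mathbf{W})$, standard multivariate-Gaussian conjugacy results apply directly.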