Let $\mathbf{M} = [m_{ij}]$ be an $m\times m$ real symmetric matrix. Let $\mathbf{A} = [a_{ij}^R + ia_{ij}^I]$ be a random Hermitian matrix whose entries have variance $\sigma^2$ and mean $m_{ij}$, i.e. the mean matrix of $\mathbf{A}$ is $\mathbf{M}$. Next, let $\mathbf{B}= [b_{ij}]$ be a random real symmetric matrix (symmetric so that its eigenvalues are real) whose entries have mean $m_{ij}$ and variance $\frac{\sigma^2}{2}$. Using Monte Carlo simulation, I found that the eigenvalue distributions of $\mathbf{A}$ and $\mathbf{B}$ appear to differ only by a multiplicative constant. That is, if $p_{\Lambda}(\mathbf{X})$ and $p_{\mu}(\mathbf{X})$ are the eigenvalue density functions of $\mathbf{A}$ and $\mathbf{B}$, then
$p_{\Lambda}(\mathbf{X}) = K\, p_{\mu}(\mathbf{X})$, where $K$ is a constant and $\mathbf{X}$ denotes a set of eigenvalues $x_1,x_2,\cdots,x_m$. How can this be proved analytically?
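For concreteness, here is a minimal sketch of the Monte Carlo experiment described above, assuming Gaussian entries and a zero mean matrix $\mathbf{M}$; the matrix size, $\sigma$, and trial count are arbitrary illustration choices, not part of the original question:

```python
import numpy as np

def hermitian_sample(M, sigma, rng):
    # Hermitian perturbation of M: each off-diagonal entry is a complex
    # Gaussian with total variance sigma^2 (variance sigma^2/2 in each of
    # the real and imaginary parts); the diagonal is real Gaussian with
    # variance sigma^2.
    m = M.shape[0]
    H = np.zeros((m, m), dtype=complex)
    iu = np.triu_indices(m, k=1)
    n_off = len(iu[0])
    H[iu] = (rng.normal(0.0, sigma / np.sqrt(2), n_off)
             + 1j * rng.normal(0.0, sigma / np.sqrt(2), n_off))
    H += H.conj().T
    H[np.diag_indices(m)] = rng.normal(0.0, sigma, m)
    return M + H

def symmetric_sample(M, sigma, rng):
    # Real symmetric perturbation of M: each entry is a real Gaussian
    # with variance sigma^2/2.
    m = M.shape[0]
    S = np.zeros((m, m))
    iu = np.triu_indices(m, k=1)
    S[iu] = rng.normal(0.0, sigma / np.sqrt(2), len(iu[0]))
    S += S.T
    S[np.diag_indices(m)] = rng.normal(0.0, sigma / np.sqrt(2), m)
    return M + S

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    m, sigma, trials = 40, 1.0, 200      # arbitrary illustration values
    M = np.zeros((m, m))                 # mean matrix (here: zero)
    eig_A = np.concatenate([np.linalg.eigvalsh(hermitian_sample(M, sigma, rng))
                            for _ in range(trials)])
    eig_B = np.concatenate([np.linalg.eigvalsh(symmetric_sample(M, sigma, rng))
                            for _ in range(trials)])
    # Compare the two empirical eigenvalue histograms on a common grid.
    bins = np.linspace(min(eig_A.min(), eig_B.min()),
                       max(eig_A.max(), eig_B.max()), 60)
    hist_A, _ = np.histogram(eig_A, bins=bins, density=True)
    hist_B, _ = np.histogram(eig_B, bins=bins, density=True)
    print(np.max(np.abs(hist_A - hist_B)))  # small iff the densities agree
```

Since both densities are normalized histograms here, any multiplicative constant relating them would show up as a mismatch on this grid rather than as $K \ne 1$.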