Spectrum of a random matrix with distinct distributions per row


Let $r(t)$ be a Gaussian function with maximum $r_{\max} = r(\frac{n}{2})$ for some $n \in \mathbb{N}$. Let $\sigma$ denote the width of the Gaussian curve, so that larger $\sigma$ values imply a slower decay of $r(t)$ as $t$ moves away from $\frac{n}{2}$. This description is informal but should be sufficient for our purposes.

Let $\Omega$ be an $n \times n$ random binary matrix with each entry $\Omega_{ij}$ following a Bernoulli distribution with $p = r(i)\,\Delta t$, where $\Delta t$ is a small positive constant. By the density of a region $[i, j]$ of $\Omega$, with $i, j \in \mathbb{N}$ and $i < j$, we mean the total number of $1$s in rows $i, i + 1, i + 2, \ldots, j$.
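For concreteness, the construction can be simulated as follows. This is a minimal sketch: the parameter values and the helper names `gaussian_rate` and `sample_omega` are my own choices for illustration, not taken from the question.

```python
import numpy as np

def gaussian_rate(t, n, r_max, sigma):
    # Gaussian bump peaking at t = n/2, with height r_max and width sigma
    return r_max * np.exp(-((t - n / 2) ** 2) / (2 * sigma ** 2))

def sample_omega(n, r_max, sigma, dt, rng=None):
    # Entry (i, j) is Bernoulli with p = r(i) * dt; the probability depends
    # only on the row index i. Clip in case r_max * dt exceeds 1.
    rng = np.random.default_rng() if rng is None else rng
    p = np.clip(gaussian_rate(np.arange(n), n, r_max, sigma) * dt, 0.0, 1.0)
    return (rng.random((n, n)) < p[:, None]).astype(float)
```

Rows near index $n/2$ then carry the highest Bernoulli probability, so they are the densest, matching the description above.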

$\Omega$ is a random matrix, and because $r$ has its peak at $n/2$, the rows of $\Omega$ whose indices are close to this value are the densest, with the density falling off, at a rate determined by $\sigma$, as the row index moves away from $n/2$.

I ran simulations of this random matrix and found an interesting result that I don't quite understand. First, $\Omega$ always has a dominant eigenvalue $\lambda_{\max} \in \mathbb{R}$, which appears to be proportional to both $r_{\max}$ and $\sigma$. In particular, it is always the case that

$$\lambda_{\max} \approx 3r_{\max}\sigma$$

All other eigenvalues are distributed in an ellipse centered at $0$ in the complex plane. Most eigenvalues are $0$; more precisely, the kernel of $\Omega$ makes up most of the eigenspace. $\lambda_{\max}$ lies well outside the ellipse: for one particular configuration of $r_{\max}$ and $\sigma$, no other eigenvalue had real part greater than $5$, while $\lambda_{\max}$ was close to $90$.
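The outlier is easy to reproduce numerically. The sketch below (parameter values are assumed for illustration, not taken from the question) samples one instance of $\Omega$, computes its spectrum, and compares the dominant eigenvalue with the largest modulus among the remaining eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(0)
n, r_max, sigma, dt = 400, 1.0, 40.0, 0.5  # assumed values for illustration

t = np.arange(n)
p = np.clip(r_max * np.exp(-((t - n / 2) ** 2) / (2 * sigma ** 2)) * dt, 0, 1)
omega = (rng.random((n, n)) < p[:, None]).astype(float)

eig = np.linalg.eigvals(omega)
k = np.argmax(eig.real)
lam_max = eig[k]                  # dominant (real) eigenvalue
rest = np.delete(eig, k)
bulk_radius = np.abs(rest).max()  # extent of the bulk in the complex plane
print(lam_max, bulk_radius)
```

With these values the dominant eigenvalue sits far outside the bulk, and it is close to $\sum_i p_i$ (the single nonzero eigenvalue of the rank-one expectation matrix $\mathbb{E}[\Omega]$, whose rows are constant), which may be a useful starting point for the question below.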

At first, I thought this might be related to the Perron-Frobenius theorem. However, the theorem states that the eigenvector corresponding to the dominant eigenvalue of a matrix that meets its assumptions is always positive. In my simulations, the eigenvector returned for $\lambda_{\max}$ is always negative.
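One sanity check worth running here (again a sketch, with assumed parameter values): eigenvectors are only defined up to a scalar multiple, so the overall sign of the vector a numerical routine returns is arbitrary and can be fixed before the entries are inspected.

```python
import numpy as np

rng = np.random.default_rng(1)
n, r_max, sigma, dt = 400, 1.0, 40.0, 0.5  # assumed values for illustration

t = np.arange(n)
p = np.clip(r_max * np.exp(-((t - n / 2) ** 2) / (2 * sigma ** 2)) * dt, 0, 1)
omega = (rng.random((n, n)) < p[:, None]).astype(float)

vals, vecs = np.linalg.eig(omega)
v = vecs[:, np.argmax(vals.real)].real  # eigenvector of the dominant eigenvalue
v = v / v[np.abs(v).argmax()]           # fix the arbitrary sign and scale
print(v.min(), v.max())
```

After this normalization, an "all-negative" eigenvector becomes all-nonnegative, which is consistent with the Perron-Frobenius picture for nonnegative matrices.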

I am interested in studying why $\lambda_{\max}$ is related to $r_{\max}\sigma$, how this relationship can be described, and what properties its eigenvector has. The literature I have found on the topic focuses on matrices of special kinds (for example, symmetric matrices), whereas the assumptions I am allowed to make on $\Omega$ are restricted to the ones mentioned here. Furthermore, the literature I found on Bernoulli matrices considers matrices where every entry follows the same distribution. Here, every entry follows a Bernoulli distribution, but the probability of obtaining a $1$ differs from row to row.

Is there any literature describing at least something similar to this? How can I approach the study of the result described above?

Thanks in advance.