Can anyone refer me to a paper discussing how to find the distribution of the maximum eigenvalue of the following matrix, where all the $\lambda$'s are random variables?
\begin{align}
M := \begin{bmatrix}
0 & \lambda_1 & 0 & 0 & \dots & 0 & 0 \\
\lambda_1 & 0 & \lambda_2 & 0 & \dots & 0 & 0 \\
0 & \lambda_2 & 0 & \lambda_3 & \dots & 0 & 0 \\
& & & \vdots & & & \\
0 & 0 & 0 & \dots & \lambda_{n-1} & 0 & \lambda_n \\
0 & 0 & 0 & \dots & 0 & \lambda_n & 0
\end{bmatrix}.
\end{align}
Thanks
Short answer: there is currently no general approach to this problem for an arbitrary tridiagonal random matrix. You might have a look at the paper "Sturm Sequences and Random Eigenvalue Distributions" by James T. Albrecht, Cy P. Chan, and Alan Edelman for an elegant approach based on Sturm sequences. There, explicit formulas are derived for a special class of tridiagonal random matrices (the $\beta$-Hermite ensemble). You might hope to adapt their approach to your matrix, but it could be a lot of work.