I am working on the computational analysis of the eigenvalue spectra of real symmetric random matrices. At the moment my approach is fully empirical, but I would like to develop a rough model of the eigenvalue spectrum as a function of the matrix size N (for NxN matrices).
In particular, I am interested in an upper bound (as close as possible to the least upper bound) on the probability that the largest eigenvalue falls within the interval [lambda, lambda + delta_lambda).
The entries of such matrices are i.i.d. random variables (say Gaussian), up to the symmetry constraint.
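For concreteness, here is a minimal sketch of the empirical setup I have in mind: sampling the largest eigenvalue of symmetrized standard-Gaussian matrices and estimating the probability of landing in [lambda, lambda + delta_lambda). The matrix size, trial count, and the particular interval are illustrative choices, not part of the question.

```python
import numpy as np

rng = np.random.default_rng(0)

def largest_eigenvalue_samples(n, trials=1000, rng=rng):
    """Sample the largest eigenvalue of n x n real symmetric matrices
    built by symmetrizing a matrix of standard Gaussian entries."""
    out = np.empty(trials)
    for t in range(trials):
        a = rng.standard_normal((n, n))
        s = (a + a.T) / 2.0                  # symmetrize; off-diagonals ~ N(0, 1/2)
        out[t] = np.linalg.eigvalsh(s)[-1]   # eigvalsh returns eigenvalues in ascending order
    return out

samples = largest_eigenvalue_samples(50, trials=200)

# Empirical estimate of P(lambda <= lambda_max < lambda + delta_lambda):
lam, delta = 9.0, 1.0
p_hat = np.mean((samples >= lam) & (samples < lam + delta))
```

With this normalization the bulk spectrum of the 50x50 matrices concentrates on roughly [-10, 10], so the largest eigenvalue lands near 10; the interval above was picked accordingly.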
My first idea was to look at the pdf (with N as a parameter) of Tr(matrix). In fact, the trace being larger than N*lambda is a sufficient condition for the largest eigenvalue being larger than lambda: since the trace is the sum of the eigenvalues, an average above lambda forces at least one eigenvalue above lambda (pigeonhole principle). However, since this condition is not necessary, P(Tr > N*lambda) is only a lower bound on P(largest eigenvalue > lambda), so it underestimates the quantity I want to bound from above.
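The gap between the two events is easy to see numerically. The sketch below (same symmetrized-Gaussian ensemble as above; n, trials, and lambda are arbitrary illustrative values) counts how often each condition holds: the trace event implies the eigenvalue event, so its count can never exceed the other, and in practice it is far smaller.

```python
import numpy as np

rng = np.random.default_rng(1)

n, trials, lam = 30, 500, 0.5
count_trace, count_eig = 0, 0
for _ in range(trials):
    a = rng.standard_normal((n, n))
    s = (a + a.T) / 2.0                      # symmetrize the Gaussian matrix
    if np.trace(s) > n * lam:                # sufficient condition via the trace
        count_trace += 1
    if np.linalg.eigvalsh(s)[-1] > lam:      # the event we actually care about
        count_eig += 1

# Sufficiency guarantees count_trace <= count_eig in every run.
```

Here Tr(s) is a sum of n standard normals (std ~ sqrt(n) ~ 5.5), so exceeding n*lam = 15 is rare, while the largest eigenvalue sits near the semicircle edge (~7.7) and exceeds lam = 0.5 essentially always; this is exactly the underestimation described above.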
Are there alternative conditions on the largest eigenvalue whose characteristic pdf is reasonably easy to obtain and that at the same time give a good approximation (it does not need to be exact) of the least upper bound on this probability?
Thanks in advance.