There is a paper I am reading in which the following is stated without proof. Suppose $\lambda(\cdot)$ denotes the spectrum of a matrix, and that for a given matrix $A$ one defines a random variable $Z$ supported on this spectrum. Then $$\mathbb{E}\,\lambda(A(A+\nu I)^{-1})=\mathbb{E}\frac{Z}{Z+\nu}.$$
I don't know how to prove it, but I know the following fact:
For a given matrix $A$ and a given polynomial $p(x)$, we have $\lambda(p(A))=\{p(\lambda_1),p(\lambda_2),p(\lambda_3),p(\lambda_4),\dots\}$ (as a multiset).
How can I proceed to prove something general for expressions built from polynomials of a matrix, its inverses, and shifts by multiples of the identity? Or, failing that, just this special case?
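For reference, here is a quick numerical sanity check of the polynomial fact above (a sketch using NumPy; the symmetric $A$ and the particular polynomial $p(x)=x^3-2x+1$ are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
A = A + A.T                          # symmetric, so the spectrum is real

# p(x) = x^3 - 2x + 1, applied both to the matrix and to its eigenvalues
pA = A @ A @ A - 2 * A + np.eye(4)
lam_A = np.linalg.eigvalsh(A)        # spectrum of A
lam_pA = np.linalg.eigvalsh(pA)      # spectrum of p(A)

# the two multisets coincide after sorting
print(np.allclose(np.sort(lam_pA), np.sort(lam_A**3 - 2 * lam_A + 1)))  # True
```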
If $v$ is a vector in the eigenspace $E_\alpha$ of $A$, then $Av=\alpha v$ and $(A+\nu I)v=(\alpha+\nu)v$, hence $v=(\alpha+\nu)(A+\nu I)^{-1}v$ for $\nu\notin-\sigma(A)$. Therefore $$\forall\, v\in E_\alpha,\qquad A(A+\nu I)^{-1}v=\dfrac{\alpha}{\alpha+\nu}\,v.$$ In other words, on the event $\{Z=\alpha\}$ the corresponding eigenvalue of $A(A+\nu I)^{-1}$ is $\dfrac{\alpha}{\alpha+\nu}=\dfrac{Z}{Z+\nu}$. Thus $$\mathbb{E}\big(\lambda(A(A+\nu I)^{-1})\big)=\sum_{\alpha\in\sigma(A)}\dfrac{\alpha}{\alpha+\nu}\,\mathbb{P}(Z=\alpha)=\mathbb{E}\left(\frac{Z}{Z+\nu}\right).$$
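If it helps, here is a small numerical sketch of the identity. The choice $\nu=0.7$, the particular symmetric $A$, and taking $Z$ uniform on $\sigma(A)$ are all illustrative assumptions of mine, not part of the original claim:

```python
import numpy as np

rng = np.random.default_rng(1)
nu = 0.7                                      # illustrative value of nu
A = rng.standard_normal((5, 5))
A = A @ A.T                                   # symmetric PSD, so every alpha + nu > 0

M = A @ np.linalg.inv(A + nu * np.eye(5))
lam_M = np.sort(np.linalg.eigvals(M).real)    # spectrum of A (A + nu I)^{-1}
Z = np.linalg.eigvalsh(A)                     # spectrum of A; Z uniform on it

# eigenvalue by eigenvalue: lambda(A (A + nu I)^{-1}) = alpha / (alpha + nu)
print(np.allclose(lam_M, np.sort(Z / (Z + nu))))          # True
# expectations under the uniform distribution on the spectrum agree
print(np.isclose(lam_M.mean(), np.mean(Z / (Z + nu))))    # True
```

For a non-uniform distribution one would replace the plain means in the last line by the corresponding sums weighted by $\mathbb{P}(Z=\alpha)$; the eigenvalue-by-eigenvalue identity is unchanged.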