Given that the inversion map $A \mapsto A^{-1}$ is operator convex over the set of positive definite matrices, I would like to know whether it is correct to apply Jensen's inequality with the standard Löwner order, i.e.
$$ (E(X))^{-1} \leq E(X^{-1}) $$
Does the following equation also hold?
$$\lambda_{max}(E(X)) \leq E(\lambda_{max}(X))$$ where $\lambda_{max}$ stands for the maximum eigenvalue of a matrix.
**Question 1**
Here is a positive answer to the first question. Let $\mathcal{P}_n$ be the set of all symmetric and positive definite $n \times n$ matrices. Note that if $A \in \mathcal{P}_n$ then $A^{-1}$ exists and $A^{-1} \in \mathcal{P}_n$. For $A$ and $B$ in $\mathcal{P}_n$ we write $A\leq B$ if and only if $B-A$ is symmetric and positive semi-definite.
Claim:
If $X$ is a random matrix taking values in $\mathcal{P}_n$ and if $E[X]$ and $E[X^{-1}]$ are finite, then $E[X]^{-1}$ exists and $$ E[X^{-1}] \geq E[X]^{-1}$$
Proof:
Let's use the following facts:
Fact 1: If $A$ and $B$ are in $\mathcal{P}_n$ then for all $\theta \in [0,1]$ we have $\theta A + (1-\theta) B \in \mathcal{P}_n$ and $$ (\theta A + (1-\theta) B)^{-1} \leq \theta A^{-1} + (1-\theta)B^{-1}$$
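As a quick numerical sanity check of Fact 1 (not part of the proof), we can draw two random symmetric positive definite matrices and verify that the difference between the convex combination of inverses and the inverse of the convex combination is positive semi-definite, i.e. has nonnegative smallest eigenvalue up to floating-point error:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_spd(n):
    # M @ M.T + I is symmetric positive definite almost surely
    M = rng.standard_normal((n, n))
    return M @ M.T + np.eye(n)

n, theta = 4, 0.3
A, B = random_spd(n), random_spd(n)

lhs = np.linalg.inv(theta * A + (1 - theta) * B)
rhs = theta * np.linalg.inv(A) + (1 - theta) * np.linalg.inv(B)

# Loewner order: rhs - lhs should be positive semi-definite,
# i.e. its smallest eigenvalue is >= 0 (up to rounding).
gap = np.linalg.eigvalsh(rhs - lhs).min()
print(gap >= -1e-10)
```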
Fact 2: Fix a positive integer $m$. If $\mathcal{Y} \subseteq \mathbb{R}^m$ and $Y$ is a random vector that takes values in $\mathcal{Y}$ and has finite expectation $E[Y]$, then $E[Y] \in Conv(\mathcal{Y})$, the convex hull of $\mathcal{Y}$.
Now define $\mathcal{Y}$ as the set of all pairs of $n\times n$ matrices $(R, S)$ such that $R \in \mathcal{P}_n$, $S \in \mathcal{P}_n$, and $S \geq R^{-1}$. From Fact 1 it is easy to show that $\mathcal{Y}$ is a convex set. [Indeed, if $(R, S)$ and $(A, B)$ are two elements of $\mathcal{Y}$, then for any $\theta \in [0,1]$ we have $\theta R + (1-\theta)A \in \mathcal{P}_n$, and since $S \geq R^{-1}$ and $B \geq A^{-1}$ we get $$ (\theta S + (1-\theta)B) \geq (\theta R^{-1} + (1-\theta)A^{-1}) \geq (\theta R + (1-\theta)A)^{-1} $$ where the second inequality holds by Fact 1.]
The set $\mathcal{Y}$ can be viewed as a convex subset of $\mathbb{R}^{2n^2}$. Now notice that if $X$ is a random matrix in $\mathcal{P}_n$ then $(X, X^{-1}) \in \mathcal{Y}$, and so, assuming the expectations $E[X]$ and $E[X^{-1}]$ are finite, Fact 2 gives $$ E[(X, X^{-1})] \in Conv(\mathcal{Y}) = \mathcal{Y}$$ where equality holds because $\mathcal{Y}$ is a convex set. That is, $$ (E[X], E[X^{-1}]) \in \mathcal{Y}$$ and by definition of $\mathcal{Y}$ this means that $E[X] \in \mathcal{P}_n$ and $E[X^{-1}] \geq E[X]^{-1}$. $\Box$
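The claim can also be checked numerically: since the inequality of Fact 1 extends to finite convex combinations, the sample average version $\left(\frac{1}{N}\sum_i X_i\right)^{-1} \leq \frac{1}{N}\sum_i X_i^{-1}$ holds deterministically, so a Monte Carlo estimate must satisfy it up to rounding. A minimal sketch with NumPy:

```python
import numpy as np

rng = np.random.default_rng(1)
n, samples = 3, 2000

# Draw random symmetric positive definite matrices
Xs = []
for _ in range(samples):
    M = rng.standard_normal((n, n))
    Xs.append(M @ M.T + np.eye(n))  # SPD almost surely

EX = np.mean(Xs, axis=0)                                  # empirical E[X]
EXinv = np.mean([np.linalg.inv(X) for X in Xs], axis=0)   # empirical E[X^{-1}]

# E[X^{-1}] - E[X]^{-1} should be positive semi-definite
diff = EXinv - np.linalg.inv(EX)
print(np.linalg.eigvalsh(diff).min() >= -1e-10)
```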
**Question 2**
Let $\|y\|=\sqrt{y_1^2 + \dots + y_n^2}$. For any random square matrix $A$, the first inequality below holds because a supremum of expectations is at most the expectation of the supremum, and the second follows from Jensen's inequality applied to the convex map $A \mapsto \|Ax\|$:
\begin{align} E\left[ \sup_{\|x\|\leq 1} \| Ax\| \right] &\geq \sup_{\|x\|\leq 1} E[\|Ax\|] \\ &\geq \sup_{\|x\|\leq 1} \|E[A]x\| \end{align} So the expected largest singular value of the random matrix $A$ is greater than or equal to the largest singular value of $E[A]$. If $A$ is a random symmetric positive semi-definite matrix, then so is $E[A]$, and for such matrices the largest singular value equals the largest eigenvalue, which answers the second question affirmatively.
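This, too, can be sanity-checked numerically. Since $\lambda_{max}$ is a convex function of a symmetric matrix, the finite-sample Jensen inequality $\lambda_{max}\left(\frac{1}{N}\sum_i X_i\right) \leq \frac{1}{N}\sum_i \lambda_{max}(X_i)$ holds for any sample, so the check below must pass up to rounding:

```python
import numpy as np

rng = np.random.default_rng(2)
n, samples = 3, 1000

# Draw random symmetric positive semi-definite matrices
Xs = []
for _ in range(samples):
    M = rng.standard_normal((n, n))
    Xs.append(M @ M.T)  # symmetric PSD

# lambda_max of the empirical mean vs. empirical mean of lambda_max
lam_max_of_mean = np.linalg.eigvalsh(np.mean(Xs, axis=0)).max()
mean_of_lam_max = np.mean([np.linalg.eigvalsh(X).max() for X in Xs])

print(lam_max_of_mean <= mean_of_lam_max)
```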