Is maximizing $x^H A^{-1} x$ over $x$ equivalent to minimizing $x^H A x$ for symmetric positive definite $A$?
Here $x$ is an $N \times 1$ variable vector and $A$ is a given $N \times N$ square matrix. The vector $x$ is constrained to have unit norm, $\|x\|_2^2 = 1$, so that the maximum does not become unbounded.
Perhaps $A$ could be decomposed as $A = PDP^{-1}$ (in fact, since $A$ is symmetric, $A = PDP^T$ with $P$ orthogonal), so that $A^{-1} = PD^{-1}P^{-1}$, and this decomposition could be relevant to settling the question.
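A quick numerical sanity check of the conjecture (a sketch using NumPy, with a randomly generated SPD matrix as a stand-in for a general $A$): by the eigendecomposition above, the unit vector minimizing $x^H A x$ is the eigenvector of $A$ with smallest eigenvalue, and that same vector is the eigenvector of $A^{-1}$ with largest eigenvalue, hence it also maximizes $x^H A^{-1} x$.

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B @ B.T + 4 * np.eye(4)        # symmetric positive definite by construction

w, V = np.linalg.eigh(A)           # eigenvalues in ascending order
v_min = V[:, 0]                    # unit eigenvector for the smallest eigenvalue

# v_min attains the minimum of x^T A x on the unit sphere ...
print(v_min @ A @ v_min, "==", w[0])
# ... and simultaneously the maximum of x^T A^{-1} x, namely 1/lambda_min
print(v_min @ np.linalg.inv(A) @ v_min, "==", 1.0 / w[0])

# compare against random unit vectors: none should beat v_min on either objective
for _ in range(1000):
    x = rng.standard_normal(4)
    x /= np.linalg.norm(x)
    assert x @ A @ x >= w[0] - 1e-10
    assert x @ np.linalg.inv(A) @ x <= 1.0 / w[0] + 1e-10
```

So for the real symmetric case the two problems share the same optimizer, which supports the equivalence being asked about.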