I have to compute $v^TH^{-1}v$ where $H$ is a very big matrix (a Hessian, hence symmetric and positive-definite). It occurred to me that I can approximate this by assuming $v$ is an eigenvector of $H$, giving $v^TH^{-1}v\approx\frac{|v|^4}{v^THv}$ and bypassing the matrix inversion. This uses the fact that if $v$ is an eigenvector of $H$ with eigenvalue $\lambda$, then it is an eigenvector of $H^{-1}$ with eigenvalue $1/\lambda$, so: $v^TH^{-1}v=\frac{|v|^2}{\lambda}=\frac{|v|^2}{(v/|v|)^TH(v/|v|)}=\frac{|v|^4}{v^THv}$.
Is this approximation reasonable? If so, does it have a name? When is it good/bad compared to assuming a diagonal $H$?
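For concreteness, here is a small numerical sketch of the proposed estimate against the exact value, using a random symmetric positive-definite matrix as a stand-in for the Hessian (the matrix and vector here are illustrative, not from the original question). One thing worth noting: by the Cauchy–Schwarz inequality, $|v|^4=(v^TH^{1/2}H^{-1/2}v)^2\le(v^THv)(v^TH^{-1}v)$, so the estimate is always a lower bound on the true value.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Random symmetric positive-definite matrix (stand-in for a Hessian).
A = rng.standard_normal((n, n))
H = A @ A.T + n * np.eye(n)

v = rng.standard_normal(n)

# Exact value via a linear solve (no explicit inverse needed).
exact = v @ np.linalg.solve(H, v)

# Proposed approximation: |v|^4 / (v^T H v).
approx = np.linalg.norm(v) ** 4 / (v @ H @ v)

print(exact, approx)
```

Note that even the "exact" computation above uses `np.linalg.solve` rather than forming $H^{-1}$, which is already much cheaper and more stable than an explicit inversion when a single quadratic form is needed.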
You can write $v=\sum_{i} a_i v_i$ where the $v_i$ are orthonormal eigenvectors of $H$ with corresponding eigenvalues $\lambda_i$. Then $Hv_i=\lambda_i v_i$, and applying $H^{-1}$ to both sides and dividing by $\lambda_i$ gives $H^{-1}v_i=\frac{1}{\lambda_i} v_i$. Orthonormality means $v_i^T v_j=\mathbf{1}(i=j)$. Putting all that together, \begin{align*} v^T H^{-1} v &=\sum_j\sum_i a_i a_j v_j^TH^{-1} v_i\\ &=\sum_j\sum_i \frac{a_i a_j}{\lambda_i} v_j^T v_i\\ &=\sum_i \frac{a_i^2}{\lambda_i}, \end{align*} which may help you do your estimation. In particular, if you keep only one eigenvector, you keep exactly one term of this sum; you may want to keep the one for which $\frac{a_i^2}{\lambda_i}$ is the biggest (this is in the spirit of what @user619894 is suggesting, but puts a value on the notion of being "close"). The more terms you include, the better the estimate gets.
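The expansion above can be checked numerically. This sketch (again with a random SPD matrix, an assumption for illustration) computes the coefficients $a_i=v_i^Tv$, verifies that the full sum $\sum_i a_i^2/\lambda_i$ equals $v^TH^{-1}v$, and forms the one-term estimate that keeps the largest $a_i^2/\lambda_i$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50

A = rng.standard_normal((n, n))
H = A @ A.T + n * np.eye(n)   # symmetric positive-definite
v = rng.standard_normal(n)

# Orthonormal eigenbasis of H: H = V diag(lam) V^T.
lam, V = np.linalg.eigh(H)

# Coefficients a_i = v_i^T v of v in that basis.
a = V.T @ v

# The terms a_i^2 / lambda_i from the derivation.
terms = a**2 / lam

exact = v @ np.linalg.solve(H, v)

# One-term estimate: keep the term with the biggest a_i^2 / lambda_i.
one_term = terms.max()

print(terms.sum(), one_term, exact)
```

Since every term is nonnegative, any truncation of the sum is a lower bound on $v^TH^{-1}v$, and adding terms in decreasing order of $a_i^2/\lambda_i$ converges monotonically to the exact value.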