I need to show that the approximated Hessian used in the Levenberg-Marquardt algorithm is guaranteed to be invertible, whereas the one used in the Gauss-Newton algorithm need not be.
Levenberg-Marquardt uses $$H^{-1} \approx (J^TJ + \mu I)^{-1}$$ Gauss-Newton uses $$H^{-1} \approx (J^TJ)^{-1}$$
I am familiar with basic linear algebra and came up with the following ideas so far: $J^TJ$ is certainly symmetric, and all of its diagonal elements are $\geq 0$ (each is a sum of squares of entries of $J$). Now, if we add the identity matrix scaled by a small factor $\mu>0$ to $J^TJ$, all diagonal elements become $>0$.
I think this somehow points to positive definiteness of $J^TJ+\mu I$, since positive definiteness guarantees invertibility. However, the criterion for positive definiteness of a matrix $A$ is $x^TAx > 0$ for all $x \neq 0$, and I just can't see a way to prove this. I tried it with small $2\times2$ matrices on paper, but reached no conclusion.
I also thought about using the fact that a symmetric matrix is invertible if all of its eigenvalues are $\neq 0$ (since it then has full rank), but I had difficulty showing that as well.
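To convince myself numerically, I ran a quick check with NumPy (not a proof, of course; the particular $J$ below is just a made-up rank-deficient example so that $J^TJ$ is singular):

```python
import numpy as np

# Quick numerical check: build a rank-deficient J so that J^T J is
# singular, then look at what adding mu*I does to the eigenvalues.
rng = np.random.default_rng(0)
col = rng.standard_normal((3, 1))
J = np.hstack([col, 2 * col])      # rank 1, so the 2x2 matrix J^T J is singular

JtJ = J.T @ J
mu = 1e-3
damped = JtJ + mu * np.eye(2)

print(np.linalg.eigvalsh(JtJ))     # smallest eigenvalue is 0 (up to rounding)
print(np.linalg.eigvalsh(damped))  # every eigenvalue shifted up by mu, all > 0
```

In every random trial, each eigenvalue of $J^TJ+\mu I$ is exactly the corresponding eigenvalue of $J^TJ$ plus $\mu$, so the damped matrix always comes out invertible, but I can't see how to turn this observation into a proof.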
Any help or pointers in the right direction would be greatly appreciated!