In linear regression, we have $\theta = \left(X^\top X\right)^{-1} X^\top y$.
In ridge regression, we have $\theta = \left(\lambda I + X^\top X\right)^{-1} X^\top y$.
I learnt somewhere that while $X^\top X$ is not guaranteed to be invertible, $\lambda I + X^\top X$ (for $\lambda > 0$) is guaranteed to be invertible.
Is this true? If so, why?
Hint: Can you write the eigenvalues of $\lambda I + X^\top X$ in terms of the eigenvalues of $X^\top X$?
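To see the hint in action, here is a small NumPy sketch (the matrix sizes and $\lambda = 0.1$ are arbitrary choices for illustration). It builds a design matrix with more columns than rows, so $X^\top X$ is necessarily singular, then checks that every eigenvalue of $\lambda I + X^\top X$ is at least $\lambda > 0$, which makes the ridge system solvable:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 3, 5  # fewer rows than columns => X^T X (d x d) has rank <= n < d
X = rng.standard_normal((n, d))
y = rng.standard_normal(n)

XtX = X.T @ X
# X^T X is symmetric positive semidefinite: eigenvalues are all >= 0,
# and here at least d - n of them are (numerically) zero.
eigs = np.linalg.eigvalsh(XtX)

lam = 0.1
ridge = lam * np.eye(d) + XtX
# Each eigenvalue of lam*I + X^T X is lam + eig_i >= lam > 0,
# so the matrix is invertible and the ridge solution is well defined.
ridge_eigs = np.linalg.eigvalsh(ridge)

theta = np.linalg.solve(ridge, X.T @ y)
print("rank of X^T X:", np.linalg.matrix_rank(XtX))
print("min eigenvalue of lam*I + X^T X:", ridge_eigs.min())
```

Running this shows `X^\top X` has rank at most 3 while the regularized matrix has its smallest eigenvalue bounded below by $\lambda$, exactly the shift the hint describes.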