Let $A$ be an invertible $n\times n$ matrix. If $$Av = \lambda v \qquad (1)$$ for some nonzero vector $v$ and scalar $\lambda$, then $\lambda$ is an eigenvalue of $A$ and $v$ is a corresponding eigenvector. Equation $(1)$ may be rewritten as $$Av = \lambda v = \lambda Iv \quad\Leftrightarrow\quad Av - \lambda I v = 0 \quad\Leftrightarrow\quad (A - \lambda I)v = 0,$$ where $I$ is the identity matrix. Since $v \neq 0$, the last equation forces $\det(A - \lambda I) = 0$, so the eigenvalues are the roots of the characteristic polynomial, which exist by the fundamental theorem of algebra.
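As a quick numerical sketch of that last step (with a hypothetical $2\times 2$ matrix), the roots of $\det(A - \lambda I) = \lambda^2 - \operatorname{tr}(A)\lambda + \det(A)$ match the eigenvalues returned by a standard eigenvalue routine:

```python
import numpy as np

# Hypothetical 2x2 example (not from the question above).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Characteristic polynomial of a 2x2 matrix:
# lambda^2 - trace(A)*lambda + det(A)
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]
roots = np.sort(np.roots(coeffs))

# Compare with numpy's eigenvalue routine.
eigvals = np.sort(np.linalg.eigvals(A))
print(roots)    # [2. 5.]
print(eigvals)  # [2. 5.]
```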
On the other hand, since $A$ is assumed invertible, it is also possible to write $$Av = \lambda v \quad\Leftrightarrow\quad A^{-1}Av = \lambda A^{-1} v \quad\Leftrightarrow\quad Iv - \lambda A^{-1} v = 0 \quad\Leftrightarrow\quad (I - \lambda A^{-1})v = 0.$$
Now, since $(A - \lambda I)v = 0$ and $(I - \lambda A^{-1})v = 0$, it must be that $$A - \lambda I = I - \lambda A^{-1},$$ which would connect the matrix $A$ to its inverse via the eigenvalues of $A$.
I did some calculations, and it seems this works for some invertible matrices $A$ but not all. Am I making a mistake somewhere?
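For concreteness, here is the kind of check I ran (a hypothetical matrix, chosen so the eigenpair is easy to verify by hand). Both equations annihilate the eigenvector, yet the two matrices differ:

```python
import numpy as np

# Hypothetical example: A has eigenvalue 5 with eigenvector (1, 1).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
lam = 5.0
v = np.array([1.0, 1.0])

Ainv = np.linalg.inv(A)
M1 = A - lam * np.eye(2)       # from (A - lambda I) v = 0
M2 = np.eye(2) - lam * Ainv    # from (I - lambda A^{-1}) v = 0

print(M1 @ v)                  # [0. 0.] -- both annihilate v ...
print(M2 @ v)                  # [0. 0.]
print(np.allclose(M1, M2))     # False  -- ... but the matrices are not equal
```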
Your last step is wrong. If you have two matrices $A, B$ and a nonzero vector $v$ for which $Av = Bv = 0$, it does not follow that $A = B$. What is true is that if $v$ is an eigenvector of $A$ associated to the eigenvalue $\lambda$ and $A$ is invertible (so $\lambda \neq 0$), then $v$ is also an eigenvector of $A^{-1}$ associated to the eigenvalue $\lambda^{-1}$, as follows from:
$$ Av = \lambda v \implies v = A^{-1}Av = \lambda A^{-1}v \implies A^{-1}v = \lambda^{-1} v. $$
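Both points are easy to check numerically (with hypothetical matrices chosen here just for illustration): two different matrices can annihilate the same nonzero vector, and the eigenvalues of $M^{-1}$ are the reciprocals of those of $M$.

```python
import numpy as np

# Two different matrices annihilating the same nonzero vector:
v = np.array([1.0, 1.0])
A = np.array([[1.0, -1.0],
              [0.0,  0.0]])
B = np.array([[2.0, -2.0],
              [3.0, -3.0]])
print(A @ v, B @ v)            # both zero vectors, yet A != B

# Eigenvalues of the inverse are reciprocals (hypothetical invertible M):
M = np.array([[4.0, 1.0],
              [2.0, 3.0]])
ev = np.sort(np.linalg.eigvals(M))
ev_inv = np.sort(np.linalg.eigvals(np.linalg.inv(M)))
print(ev)                      # [2. 5.]
print(ev_inv)                  # [0.2 0.5]
```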