Eigenvalues: what information do they carry?


I have been wondering about this for the last couple of days and cannot seem to understand why this is the case.

I know about eigenvalues and eigenvectors, and that they can be computed for a matrix. But why are they useful? In my computer vision class there is a method called Shi-Tomasi corner detection, in which you have a Harris matrix

\begin{equation} \begin{bmatrix} (I_{x})^2 & I_{xy}\\ I_{xy} & (I_{y})^2 \end{bmatrix} \end{equation}

Here, e.g., $I_x$ means the image differentiated with respect to the x direction. A corner is defined as a large variation in both the x and y directions, but why are the eigenvalues helpful here? What do they tell us about this?
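To see the connection numerically, here is a minimal sketch (with made-up matrix entries, not real image derivatives) of how the smaller eigenvalue of the structure matrix separates flat regions, edges, and corners — this minimum-eigenvalue criterion is exactly the Shi-Tomasi score:

```python
import numpy as np

# Hypothetical 2x2 structure matrices. In a real detector the entries
# are windowed sums of Ix^2, Ix*Iy, Iy^2; these values are illustrative.
flat   = np.array([[0.1, 0.0], [0.0, 0.1]])   # little variation anywhere
edge   = np.array([[9.0, 0.0], [0.0, 0.1]])   # strong variation in x only
corner = np.array([[9.0, 1.0], [1.0, 8.0]])   # strong variation in x and y

def shi_tomasi_score(M):
    # Shi-Tomasi score = the smaller eigenvalue of M.
    # eigvalsh is for symmetric matrices and returns eigenvalues in
    # ascending order, so index 0 is the minimum.
    return np.linalg.eigvalsh(M)[0]

for name, M in [("flat", flat), ("edge", edge), ("corner", corner)]:
    print(name, shi_tomasi_score(M))
```

Only the corner matrix has *both* eigenvalues large, so its minimum eigenvalue is large; an edge has one large and one small eigenvalue, and its score stays small.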

I may have the solution but would like to have it confirmed somehow. For the eigenvalues it holds that \begin{equation} det(\lambda\begin{bmatrix}1 & 0 \\ 0& 1\end{bmatrix} - \begin{bmatrix} (I_{x})^2 & I_{xy}\\ I_{xy} & (I_{y})^2 \end{bmatrix}) = 0 \end{equation}

So..

\begin{equation} det(\begin{bmatrix}\lambda & 0 \\ 0& \lambda\end{bmatrix} - \begin{bmatrix} (I_{x})^2 & I_{xy}\\ I_{xy} & (I_{y})^2 \end{bmatrix}) = 0 \end{equation}

\begin{equation} det(\begin{bmatrix}\lambda-(I_{x})^2 & 0-(I_{xy}) \\ 0-(I_{xy}) & \lambda -(I_{y})^2\end{bmatrix}) =0 \end{equation}

\begin{equation} (\lambda-(I_{x})^2)(\lambda-(I_{y})^2)-(0-I_{xy})^2 = 0 \end{equation} \begin{equation} (\lambda-(I_{x})^2)(\lambda-(I_{y})^2)-(I_{xy})^2 = 0 \end{equation}

but how does this help? It is a second-order equation in $\lambda$ — what does solving it actually tell me?
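The second-order equation is the point: its two roots *are* the two eigenvalues. Expanding the product above gives $\lambda^2 - ((I_x)^2 + (I_y)^2)\lambda + (I_x)^2(I_y)^2 - (I_{xy})^2 = 0$, which the quadratic formula solves directly. A small sketch with made-up derivative values (not from a real image) shows the roots agree with a standard eigensolver:

```python
import numpy as np

# Hypothetical windowed derivative sums (illustrative values only).
Ix2, Iy2, Ixy = 9.0, 8.0, 1.0

# Characteristic polynomial: lambda^2 - trace*lambda + det = 0,
# where trace = Ix2 + Iy2 (sum of eigenvalues)
# and   det   = Ix2*Iy2 - Ixy^2 (product of eigenvalues).
trace = Ix2 + Iy2
det   = Ix2 * Iy2 - Ixy**2

# Quadratic formula yields both eigenvalues.
disc = np.sqrt(trace**2 - 4 * det)
lam1 = (trace + disc) / 2   # larger eigenvalue
lam2 = (trace - disc) / 2   # smaller eigenvalue

# Cross-check against numpy's symmetric eigensolver.
M = np.array([[Ix2, Ixy], [Ixy, Iy2]])
print(lam2, lam1, np.linalg.eigvalsh(M))
```

If both roots come out large, the intensity varies strongly in every direction and the point is a corner; one large and one small root means an edge; two small roots mean a flat region.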