A square matrix $XX^T$ formed from a column vector $X$ always has $X$ as an eigenvector


I was asked the following in an interview:

Given a column vector $X$ of order $n\times1$, when we form the matrix $XX^T$ (of order $n\times n$), give at least one eigenvector of $XX^T$ without any calculation, just by looking.

Observing that all the columns of $XX^T$ are scalar multiples of the column vector $X$, the column space of $XX^T$ is spanned by the single vector $X$, so the rank of $XX^T$ is $1$. Hence the row echelon form of $XX^T$ has only one non-zero row, and $XX^T$ has $n-1$ eigenvalues equal to $0$ and one non-zero eigenvalue.

With the interviewers' help, I was able to conclude from the trace of the matrix that the non-zero eigenvalue is $\sum_i x_i^2$, the sum of the squares of the entries of $X$.
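As an illustrative numeric check (not part of the original argument), the sample vector below confirms that the single non-zero eigenvalue of $XX^T$ equals both the trace and the sum of squares of the entries of $X$:

```python
import numpy as np

# Sample column vector; any non-zero X would do.
X = np.array([[1.0], [2.0], [3.0]])   # n x 1
A = X @ X.T                            # n x n, rank 1

eigvals = np.linalg.eigvalsh(A)        # A is symmetric, so eigvalsh applies
nonzero = eigvals[np.argmax(np.abs(eigvals))]

print(np.isclose(nonzero, np.trace(A)))   # non-zero eigenvalue = trace
print(np.isclose(nonzero, np.sum(X**2)))  # trace = sum of squared entries
```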

But I could not find an eigenvector during the interview, and from their surprised looks it was evident that the eigenvector was hiding in plain sight and I could not see it!

Later, by working through examples, I found that $X$ itself is an eigenvector.

I could prove it by brute force: take a general $n\times1$ column vector, construct $XX^T$ (all of whose columns are multiples of $X$), and compute the product $XX^T X$. Each entry of the product is the corresponding entry of $X$ scaled by the same factor $\sum_i x_i^2$, which shows that $X$ is the eigenvector corresponding to the non-zero eigenvalue.
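The brute-force observation can be sketched numerically: for a sample $X$, the product $XX^T X$ is exactly $(\sum_i x_i^2)\,X$.

```python
import numpy as np

# Sample vector for the check; the identity holds for any X.
X = np.array([[2.0], [-1.0], [4.0]])
A = X @ X.T
lam = float(X.T @ X)            # candidate non-zero eigenvalue, sum of squares

# A X should equal lam * X, i.e. X is an eigenvector for eigenvalue lam.
print(np.allclose(A @ X, lam * X))   # True
```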

But is there a CLEAN method, say a geometric interpretation in terms of vector spaces or a property of $XX^T$, that is more conceptual than working through a general example?


There are 2 answers below.

Best answer:

Note that for any vector $v$, $(XX^T)v=X(X^Tv)$, and $X^Tv$ is just the inner product of $X$ with $v$. So $XX^T$ represents the following linear transformation: given a vector $v$, take its inner product with $X$ to get a scalar, and then multiply that scalar by $X$. This makes it immediately clear that $XX^T$ has rank $1$ and that its image is the subspace spanned by $X$, and so $X$ is (up to scalar multiples) the only eigenvector with a nonzero eigenvalue.

Geometrically, if $X$ is a unit vector, then this is just the orthogonal projection onto the span of $X$: if you pick an orthonormal basis with $X$ as one of the basis vectors, then the inner product $X^Tv$ represents the coefficient of $X$ when you write $v$ in terms of this basis, and then multiplying that by $X$ gives just the $X$ term of $v$. In general, you can think of $X$ as a unit vector scaled by some scalar $c$, and then $XX^T$ will just be the orthogonal projection onto the span of $X$ scaled by $c^2$ (since you scaled both $X$ and $X^T$ by $c$ from a unit vector).
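The geometric picture above can be checked numerically: for any non-zero $X$, the matrix $XX^T/(X^TX)$ is the orthogonal projection onto $\operatorname{span}(X)$, i.e. it is idempotent, fixes $X$, and annihilates anything orthogonal to $X$ (a sketch with a sample vector):

```python
import numpy as np

X = np.array([[3.0], [0.0], [4.0]])
P = (X @ X.T) / float(X.T @ X)       # projection onto span(X)

v_orth = np.array([[0.0], [1.0], [0.0]])   # orthogonal to X

print(np.allclose(P @ P, P))         # idempotent: projecting twice = once
print(np.allclose(P @ X, X))         # fixes span(X)
print(np.allclose(P @ v_orth, 0))    # kills the orthogonal complement
```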

Another answer:

By matrix associativity, $(\color{red}{X}\color{blue}{X^T})\color{limegreen}{X}=\color{red}{X}(\color{blue}{X^T}\color{limegreen}{X})=(\color{blue}{X}\cdot\color{limegreen}{X})\color{red}{X}$. Or in Einstein notation, $\color{red}{X_i}\color{blue}{X_j}\color{limegreen}{X_j}=(\color{blue}{X_j}\color{limegreen}{X_j})\color{red}{X_i}$.
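The Einstein-notation identity $X_iX_jX_j=(X_jX_j)X_i$ can be verified directly with `np.einsum` on a sample vector:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])        # treat X as a 1-D array of components

lhs = np.einsum('i,j,j->i', x, x, x) # X_i X_j X_j, summed over j
rhs = np.dot(x, x) * x               # (X . X) X

print(np.allclose(lhs, rhs))         # True
```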