Proving $\dim(E_0) \geq n - k$


I found a question on an old exam which I am not really able to wrap my head around. The question states:

Given $k<n$, let $v_1, v_2, \ldots, v_k \in \mathbb{R}^n$ be non-zero vectors that are pairwise orthogonal with respect to the standard dot product on $\mathbb{R}^n$, where we consider the vectors of $\mathbb{R}^n$ as column vectors. Let $\lambda_1, \lambda_2, \ldots, \lambda_k \in \mathbb{R}$ and $$A = \lambda_1 v_1 v_1^T + \lambda_2 v_2 v_2^T + \cdots + \lambda_k v_k v_k^T \in \mathbb{R}^{n \times n}.$$

Part A asks to prove that $v_1, v_2, \ldots, v_k$ are eigenvectors of $A$. I managed to prove this by showing that $$Av_i = \left(\sum_{j=1}^k\lambda_jv_jv_j^T\right)v_i=\sum_{j=1}^k\lambda_jv_j(v_j^Tv_i)=\lambda_iv_i\| v_i\|^2=\left(\lambda_i\| v_i\|^2\right)v_i$$ for all $i = 1, 2, \ldots, k$, where the third equality holds because $v_j^Tv_i = 0$ for $j \ne i$.
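As a quick sanity check (separate from the proof), the identity $Av_i = (\lambda_i\|v_i\|^2)v_i$ can be verified numerically. The vectors and coefficients below are a made-up example, not from the exam.

```python
import numpy as np

# Hypothetical example with n = 4, k = 2: two orthogonal non-zero vectors in R^4.
v1 = np.array([1.0, 1.0, 0.0, 0.0])
v2 = np.array([1.0, -1.0, 0.0, 0.0])
lam1, lam2 = 3.0, -2.0

# A = lam1 * v1 v1^T + lam2 * v2 v2^T
A = lam1 * np.outer(v1, v1) + lam2 * np.outer(v2, v2)

# Part A: A v_i = (lam_i * ||v_i||^2) v_i, since v_j^T v_i = 0 for j != i.
assert np.allclose(A @ v1, lam1 * v1.dot(v1) * v1)
assert np.allclose(A @ v2, lam2 * v2.dot(v2) * v2)
```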

Part B asks to prove that $\dim(E_0) \geq n - k$, where $E_0$ is the eigenspace for the eigenvalue $0$. I don't really have a direct idea of how to get started with this part.

Part C asks to prove that $A$ is diagonalisable and to give an orthogonal basis of $\mathbb{R}^n$ consisting of eigenvectors of $A$. Again I am not really sure how to get started with this.

Any help is appreciated.



BEST ANSWER

For Part B, you can find an orthogonal system $$v_{k+1},v_{k+2},\ldots, v_n\ ,$$ each vector of which is orthogonal to all of $v_1,v_2,\ldots, v_k$ (e.g. by extending to a basis of $\mathbb{R}^n$ and applying Gram-Schmidt). Then $Av_{i}=0$ for $k<i\le n$, since every term $\lambda_j v_j(v_j^Tv_i)$ vanishes. These $n-k$ vectors are linearly independent and lie in $E_0$, so $\dim E_0\ge n-k$.
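Continuing the made-up example from above: any vector orthogonal to $v_1,\ldots,v_k$ is sent to zero by $A$, which yields $n-k$ independent vectors in $E_0$.

```python
import numpy as np

# Same hypothetical example: n = 4, k = 2.
v1 = np.array([1.0, 1.0, 0.0, 0.0])
v2 = np.array([1.0, -1.0, 0.0, 0.0])
A = 3.0 * np.outer(v1, v1) - 2.0 * np.outer(v2, v2)

# e3 and e4 are orthogonal to v1 and v2, so A e3 = A e4 = 0:
# each term lam_j * v_j * (v_j^T e_i) in A e_i vanishes.
e3 = np.array([0.0, 0.0, 1.0, 0.0])
e4 = np.array([0.0, 0.0, 0.0, 1.0])
assert np.allclose(A @ e3, 0.0)
assert np.allclose(A @ e4, 0.0)

# Hence dim E_0 >= n - k = 2; here rank(A) = k = 2, so equality holds.
assert np.linalg.matrix_rank(A) == 2
```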

Combining Parts A and B, we see immediately that there exists an orthogonal basis of $\mathbb{R}^n$ consisting of eigenvectors of $A$.

Note: the spectral decomposition theorem says that every real symmetric matrix can be written in the same form as $A$ (with the $v_i$ orthonormal).


Part B: observe that "the eigenspace for the eigenvalue $0$" is just a fancy way of saying "null space". Both are the set of all vectors $v$ for which $Av=0$. I'm assuming you're familiar with computing null spaces?
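One concrete way to compute the null space numerically (again with the made-up vectors from above) is the SVD: the right singular vectors whose singular value is (numerically) zero span $\ker A = E_0$.

```python
import numpy as np

# Same hypothetical example: n = 4, k = 2.
v1 = np.array([1.0, 1.0, 0.0, 0.0])
v2 = np.array([1.0, -1.0, 0.0, 0.0])
A = 3.0 * np.outer(v1, v1) - 2.0 * np.outer(v2, v2)

# Rows of Vt with ~zero singular value form an orthonormal basis
# of the null space of A, i.e. of the eigenspace E_0.
_, s, Vt = np.linalg.svd(A)
null_basis = Vt[s < 1e-10]

assert null_basis.shape[0] == 2          # dim E_0 = n - k = 2
assert np.allclose(A @ null_basis.T, 0)  # each basis vector lies in ker A
```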

Part C: a matrix is diagonalizable if the sum of dimensions of its eigenspaces equals the number of columns. For example, a $3\times 3$ matrix will be diagonalizable if it has two eigenvalues, and the corresponding dimensions of the eigenspaces are $1$ and $2$. If we know that a matrix is diagonalizable, diagonalizing it amounts to finding all of the eigenvalue/eigenvector pairs. You can then use Gram-Schmidt to get an orthogonal basis.
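For a symmetric matrix like $A$, `numpy.linalg.eigh` already returns an orthonormal eigenvector basis, so Gram-Schmidt is only needed if you build the eigenvectors by hand. A sketch with the same made-up example as above:

```python
import numpy as np

# Same hypothetical example: n = 4, k = 2.
v1 = np.array([1.0, 1.0, 0.0, 0.0])
v2 = np.array([1.0, -1.0, 0.0, 0.0])
A = 3.0 * np.outer(v1, v1) - 2.0 * np.outer(v2, v2)

# A is symmetric, so eigh returns real eigenvalues w (ascending) and an
# orthogonal matrix Q of eigenvectors with A = Q diag(w) Q^T.
w, Q = np.linalg.eigh(A)
assert np.allclose(Q @ np.diag(w) @ Q.T, A)  # A is diagonalized
assert np.allclose(Q.T @ Q, np.eye(4))       # columns are orthonormal

# Eigenvalues: 0 (twice), lam2*||v2||^2 = -4, and lam1*||v1||^2 = 6.
assert np.allclose(w, [-4.0, 0.0, 0.0, 6.0])
```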