The matrix is a scalar multiple of the identity transformation


Let $B \in GL(d, \mathbb{R})$. Assume that $\{v_k\}_{k \in \mathbb{N}}$ is a sequence of eigenvectors (not necessarily linearly independent) of $B$, and that $\lambda_k$ is the eigenvalue corresponding to $v_k$. Assume that $\{v_1, \ldots, v_N\}$ forms a basis of $\mathbb{R}^d$. Is it true that $B$ must be a scalar multiple of the identity transformation?

My proof: for any $l>N$, $v_l$ can be written as $$v_l = c_1v_1 +\ldots + c_Nv_N.$$

Applying $B$ to both sides and using the fact that each $v_k$ is an eigenvector of $B$ with eigenvalue $\lambda_k$, we get $\lambda_l v_l = \sum\limits_{i=1}^N c_i\lambda_iv_i$. Multiplying the expansion of $v_l$ by $\lambda_l$ and equating coefficients gives $\lambda_l = \lambda_1 = \ldots =\lambda_N$. This implies what I want.

I am a bit worried because some $c_i$ can be zero, and that breaks the argument.


No, this is not true; you only get that $B$ is diagonalizable, i.e. there exists an invertible $P$ such that $P B P^{-1}$ is a diagonal matrix. For example, the matrix $$\begin{pmatrix} 1 & 0 \\ 0 & 2 \end{pmatrix}$$ is not a scalar multiple of the identity.
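A quick numeric sanity check of this counterexample (a sketch; the eigenvector sequence is assumed to simply repeat the two standard basis vectors, which the hypotheses allow):

```python
import numpy as np

# B = diag(1, 2) is invertible, and the sequence v_1 = e1, v_2 = e2,
# v_3 = e1, v_4 = e2, ... consists of eigenvectors of B whose first
# N = 2 terms form a basis of R^2 -- yet B is not a scalar multiple of I.
B = np.diag([1.0, 2.0])

e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])
eigenpairs = [(e1, 1.0), (e2, 2.0)] * 3  # repeating sequence (v_k, lambda_k)

# Each v_k is an eigenvector of B with eigenvalue lambda_k.
for v, lam in eigenpairs:
    assert np.allclose(B @ v, lam * v)

# The first two vectors form a basis of R^2.
assert np.linalg.matrix_rank(np.column_stack([e1, e2])) == 2

# Yet B is not c*I for any scalar c.
assert not any(np.allclose(B, c * np.eye(2)) for c in np.diag(B))
print("hypotheses hold, but B is not a scalar multiple of I")
```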

I don't see why you think that $\forall l,\ \lambda_l v_l = \sum_i c_i \lambda_i v_i \implies \forall i,j,\ \lambda_i = \lambda_j$. Equating coefficients only gives $c_i(\lambda_i - \lambda_l) = 0$ for each $i$, i.e. $c_i = 0$ whenever $\lambda_i \neq \lambda_l$.
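Written out in full (using only the linear independence of $v_1, \ldots, v_N$), the coefficient comparison is:

```latex
\lambda_l v_l = B v_l = \sum_{i=1}^{N} c_i \lambda_i v_i ,
\qquad \text{while also} \qquad
\lambda_l v_l = \lambda_l \sum_{i=1}^{N} c_i v_i = \sum_{i=1}^{N} c_i \lambda_l v_i .
```

Subtracting and invoking linear independence yields $c_i(\lambda_i - \lambda_l) = 0$ for every $i$; nothing forces the eigenvalues to agree at indices where $c_i = 0$.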

Note also that $N = d$, since a basis of $\mathbb{R}^d$ has exactly $d$ elements.