A trick in Linear algebra


Let $V$ denote a finite-dimensional vector space with inner product $\langle\cdot, \cdot\rangle$, and let $\alpha_1,\alpha_2,\cdots,\alpha_r,\beta_1,\beta_2,\cdots,\beta_r\in V$. Suppose there exists a nonzero $\alpha\in V$ such that $$ \sum_{i=1}^r \langle\alpha,\alpha_i\rangle \beta_i=0. $$ Then prove the following, very symmetric, result:

There exists a nonzero $\beta\in V$ such that $$ \sum_{i=1}^r \langle\beta,\beta_i\rangle\alpha_i=0. $$

Instinctively this seems true, since $\alpha$ and $\beta$ play symmetric roles in the problem. I suspect a proof by contradiction might work, but I have no idea how to carry it out.

Thanks in advance for any help!



Accepted answer:

I assume that the inner product is linear in the first argument. This loses no generality: if it is instead linear in the second argument, the two equations above can be conjugated with respect to a chosen basis and the same argument applies.

The map $v \mapsto \sum_{i = 1}^{r} \langle v, \alpha_{i} \rangle \beta_{i}$ is a linear operator on $V$, which can be written as $T = \sum_{i = 1}^{r} \beta_{i} \alpha_{i}^{*}$, where $\alpha_{i}^{*} = \langle \cdot, \alpha_{i}\rangle$. Its adjoint is $T^{*} = \sum_{i = 1}^{r} \alpha_{i} \beta_{i}^{*}$, that is, $T^{*}v = \sum_{i=1}^{r} \langle v, \beta_{i}\rangle \alpha_{i}$. Since $\operatorname{rank} T = \operatorname{rank} T^{*}$ and $V$ is finite dimensional, the rank-nullity theorem gives $\dim\ker T = \dim\ker T^{*}$, so $T$ has a non-zero kernel if and only if $T^{*}$ does. By hypothesis $\alpha$ is a nonzero element of $\ker T$, so any nonzero $\beta \in \ker T^{*}$ satisfies the desired equation.
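As a numerical sanity check of this argument, here is a sketch in Python over $\mathbb{R}^n$ (where the adjoint of a real matrix is its transpose). The dimensions, the random construction of the $\alpha_i$, $\beta_i$, and the adjustment forcing $\alpha$ into $\ker T$ are all illustrative choices, not part of the original problem.

```python
import numpy as np

rng = np.random.default_rng(42)
n, r = 4, 6  # illustrative: dim V and number of vector pairs

A = rng.standard_normal((n, r))   # columns are the alpha_i
B0 = rng.standard_normal((n, r))
alpha = rng.standard_normal(n)    # the given nonzero alpha

# Adjust the beta_i so that sum_i <alpha, alpha_i> beta_i = 0,
# i.e. subtract from each column of B0 its component along w = A^T alpha.
w = A.T @ alpha                          # w_i = <alpha, alpha_i>
B = B0 - np.outer(B0 @ w, w) / (w @ w)   # now B @ w = 0, columns are beta_i

T = B @ A.T       # T v     = sum_i <v, alpha_i> beta_i
T_adj = A @ B.T   # T^* v   = sum_i <v, beta_i> alpha_i  (= T transposed)

# alpha lies in ker T by construction
assert np.allclose(T @ alpha, 0)

# rank(T) == rank(T^*), and both are < n since ker T is nontrivial
assert np.linalg.matrix_rank(T) == np.linalg.matrix_rank(T_adj) < n

# Exhibit a nonzero beta in ker T^*: a right-singular vector of T^*
# belonging to its zero singular value.
beta = np.linalg.svd(T_adj)[2][-1]
assert np.allclose(T_adj @ beta, 0)
```

Since $T$ and $T^{*}$ are transposes of each other here, the rank equality is immediate, and the SVD produces an explicit nonzero $\beta$ in $\ker T^{*}$.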