Let $(x_\alpha )_{\alpha \in A}$ be a basis for a vector space E and consider a vector $ a= \sum_\alpha \xi^{\alpha}x_\alpha$


Suppose that for some $\beta \in A$, $\xi^{\beta} \neq 0$. Show that the vectors $(x_\alpha )_{\alpha \neq \beta}$, together with $a$, again form a basis for $E$.


The first thing we observe is that if $\xi_{\beta}\neq 0$, then solving $a=\sum_{\alpha}\xi_{\alpha}x_{\alpha}$ for $x_{\beta}$ gives \begin{equation*} x_{\beta}=\frac{1}{\xi_{\beta}}a-\sum_{\alpha\neq\beta}\frac{\xi_{\alpha}}{\xi_{\beta}}x_{\alpha}. \end{equation*}
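Since all the sums involved are finite, this identity can be sanity-checked numerically in a finite-dimensional example. The following sketch (using NumPy; the basis, the coefficients, and the index `beta` are hypothetical example data) recovers $x_\beta$ from $a$ and the remaining basis vectors:

```python
import numpy as np

# Check x_beta = a/xi_beta - sum_{alpha != beta} (xi_alpha/xi_beta) x_alpha
# in R^3, with a basis given by the rows of x (hypothetical example data).
x = np.array([[1.0, 1, 0],
              [0, 1, 1],
              [1, 0, 1]])          # rows form a basis of R^3 (det = 2)
xi = np.array([2.0, -1.0, 3.0])    # coefficients of a
beta = 1                           # xi[beta] = -1 != 0
a = xi @ x                         # a = sum_alpha xi_alpha x_alpha

recovered = a / xi[beta] - sum(
    (xi[al] / xi[beta]) * x[al] for al in range(3) if al != beta
)
assert np.allclose(recovered, x[beta])
```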

We need to prove that $\gamma=\{a\}\cup(x_{\alpha})_{\alpha\neq\beta}$ is a basis for $E$. First we show that every vector of $E$ is a linear combination of elements of $\gamma$. Let $v\in E$. Since $(x_{\alpha})_{\alpha\in A}$ is a basis, we have \begin{equation*} v=\sum_{\alpha}v_{\alpha}x_{\alpha}. \end{equation*} Substituting the expression for $x_{\beta}$ from the first equation, we obtain \begin{equation*} v=\frac{v_{\beta}}{\xi_{\beta}}a+\sum_{\alpha\neq\beta}\left( v_{\alpha}-\frac{v_{\beta}\xi_{\alpha}}{\xi_{\beta}} \right) x_{\alpha}. \end{equation*} Thus, $\operatorname{span}(\gamma)=E$.

Now we prove that $\gamma$ is linearly independent. Take a linear combination equal to zero: \begin{equation*} 0=\lambda a+\sum_{\alpha\neq\beta}\lambda_{\alpha}x_{\alpha}. \end{equation*} Substituting $a=\sum_{\alpha}\xi_{\alpha}x_{\alpha}$ we get \begin{eqnarray*} 0&=&\lambda\sum_{\alpha}\xi_{\alpha}x_{\alpha}+\sum_{\alpha\neq\beta}\lambda_{\alpha}x_{\alpha},\\ &=&\lambda\xi_{\beta}x_{\beta}+\sum_{\alpha\neq\beta}(\lambda\xi_{\alpha}+\lambda_{\alpha})x_{\alpha}. \end{eqnarray*} Since $(x_{\alpha})_{\alpha\in A}$ is linearly independent and $\xi_{\beta}\neq0$, the coefficient of $x_{\beta}$ forces $\lambda=0$, and then $\lambda\xi_{\alpha}+\lambda_{\alpha}=0$ gives $\lambda_{\alpha}=0$ for all $\alpha\neq\beta$.

Thus, $\gamma$ is a basis.
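As a concluding sanity check, here is a small numerical sketch (NumPy, in $\mathbb{R}^3$, with hypothetical example data) confirming both parts of the argument: replacing $x_\beta$ by $a$ yields an invertible matrix of rows, i.e. a basis, and the coefficient of $a$ in the expansion of an arbitrary $v$ equals $v_\beta/\xi_\beta$:

```python
import numpy as np

x = np.array([[1.0, 1, 0],
              [0, 1, 1],
              [1, 0, 1]])          # rows form a basis of R^3 (hypothetical)
xi = np.array([2.0, -1.0, 3.0])
beta = 1                           # xi[beta] != 0
a = xi @ x                         # a = sum_alpha xi_alpha x_alpha

# Replace x_beta by a: the new family is a basis iff the matrix of
# its rows has nonzero determinant.
gamma = x.copy()
gamma[beta] = a
assert abs(np.linalg.det(gamma)) > 1e-12

# Express an arbitrary v in the new family and check that the
# coefficient of a equals v_beta / xi_beta, as in the span argument.
v = np.array([4.0, -2.0, 7.0])
coeffs = np.linalg.solve(gamma.T, v)   # gamma.T @ coeffs = v
v_coords = np.linalg.solve(x.T, v)     # coordinates v_alpha in the old basis
assert np.isclose(coeffs[beta], v_coords[beta] / xi[beta])
```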