For part (b), I don't understand: given a vector $v$, if we already know a basis of an eigenspace, $\{[1,0,0,1],[0,1,1,0]\}$, how can we know that $\{v,[1,0,0,1]\}$ is also a basis of the eigenspace? And what about $\{v,[0,1,1,0]\}$?
Find an orthonormal basis for the eigenspace of a matrix containing a specific vector
1.4k Views Asked by Bumbble Comm https://math.techqa.club/user/bumbble-comm/detail
There are 2 best solutions below
Since $\{[1,0,0,1],[0,1,1,0]\}$ is a basis, so is $\{[1,1,1,1],[1,0,0,1]\}$.
Let $r=[1,1,1,1]$ and $s=[1,0,0,1]$. Then $$\Big\{r,\; s-\tfrac{r\cdot s}{r\cdot r}\,r\Big\} = \{[1,1,1,1],[1/2,-1/2,-1/2,1/2]\}$$ is an orthogonal basis by the Gram-Schmidt procedure. Normalizing the first vector (the second already has unit length), we obtain the desired orthonormal basis containing $v$, viz. $$\{[1/2,1/2,1/2,1/2],[1/2,-1/2,-1/2,1/2]\}$$ The other part is done similarly.
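To make the arithmetic concrete, here is a minimal sketch in plain Python (no libraries; the helper names `dot` and `normalize` are mine, not part of the answer) that carries out the same Gram-Schmidt step and normalization on $r$ and $s$:

```python
import math

def dot(a, b):
    """Standard dot product of two lists of numbers."""
    return sum(x * y for x, y in zip(a, b))

def normalize(a):
    """Scale a vector to unit length."""
    n = math.sqrt(dot(a, a))
    return [x / n for x in a]

r = [1, 1, 1, 1]  # the given vector v
s = [1, 0, 0, 1]  # one of the original basis vectors

# Gram-Schmidt: subtract from s its projection onto r
coeff = dot(r, s) / dot(r, r)          # = 2/4 = 0.5
t = [si - coeff * ri for si, ri in zip(s, r)]

q1, q2 = normalize(r), normalize(t)
print(q1)           # [0.5, 0.5, 0.5, 0.5]
print(q2)           # [0.5, -0.5, -0.5, 0.5]
print(dot(q1, q2))  # 0.0  -- the pair is orthonormal
```

Since all the entries are dyadic rationals, the floating-point results here happen to be exact.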

We know that the eigenspace $E$ containing the eigenvector $\mathbf{v}$ is $$ E = \text{Span}\Bigg \{ \begin{bmatrix} 1 \\ 0 \\ 0 \\ 1 \end{bmatrix}, \begin{bmatrix} 0 \\ 1 \\ 1 \\ 0 \end{bmatrix} \Bigg \}. $$
We now want to find a vector, say $\mathbf{u}$, such that $\{ \mathbf{u}, \mathbf{v} \}$ is a basis for $E$. This is the case precisely when $\mathbf{u} \in E$ and $\mathbf{u}$ is linearly independent of $\mathbf{v}$.
If we let $\mathbf{u} = (1, 0, 0, 1)^{T}$, then since there is no $c \in \mathbb{R}$ such that $c \mathbf{v} = \mathbf{u}$, we can conclude that $\mathbf{u}$ is linearly independent of $\mathbf{v}$. Since it is also clear that $\mathbf{u} \in E$, it follows that $\{ \mathbf{u}, \mathbf{v} \}$ is a basis for the eigenspace $E$.
As you suggest, $\{ \mathbf{u}, \mathbf{v} \}$ is also a basis when we let $\mathbf{u} = (0, 1, 1, 0)^{T}$, since it is clearly an element of $E$ and clearly linearly independent of $\mathbf{v}$. In fact, infinitely many vectors satisfy the criteria for $\mathbf{u}$: we can pick any element of the eigenspace $E$ that is linearly independent of $\mathbf{v}$.
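The independence check above can also be verified programmatically. For two vectors, linear dependence just means one is a scalar multiple of the other, so a minimal sketch in plain Python (the helper name `is_multiple` is mine, not from the answer) suffices:

```python
from fractions import Fraction as F

def is_multiple(u, v):
    """True if u = c*v for some scalar c, i.e. u and v are linearly DEPENDENT.
    (For exactly two vectors, dependence is equivalent to one being a
    multiple of the other.)"""
    for ui, vi in zip(u, v):
        if vi != 0:
            c = F(ui, vi)  # candidate scalar fixed by first nonzero entry
            break
    else:
        return True  # v is the zero vector, so u and v are dependent
    return all(F(ui) == c * vi for ui, vi in zip(u, v))

v = [1, 1, 1, 1]
for u in ([1, 0, 0, 1], [0, 1, 1, 0]):
    # Each u clearly lies in E (it is one of the spanning vectors),
    # so only independence from v needs checking.
    print(u, "independent of v:", not is_multiple(u, v))  # True in both cases
```

Exact rational arithmetic via `fractions.Fraction` avoids any floating-point comparison issues.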