Finding the dual basis of a vector space consisting of matrices


I know this is a very basic question, but I am having trouble with an example a friend showed me.

Let $$\beta = \{\begin{pmatrix} 1 & 0 \\ 0 & 0\end{pmatrix}, \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} \}$$

Its dual space has the basis $$\beta^* = \{\begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}, \begin{pmatrix} 0 & 0 \\ 1 & 0\end{pmatrix}\}$$

I know the definition of dual space (and its basis), but if I do

$$e^1(e_2)= \begin{pmatrix} 1 & 0 \\ 0 & 0\end{pmatrix} \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} \neq 0$$

which I thought had to equal $\delta_{12}=0$, since a dual basis satisfies $e^i(e_j)=\delta_{ij}$.

I know my problem is my understanding of how to apply the functionals to the vectors.

There are 2 answers below.

BEST ANSWER

The dual basis is a set of linear forms, so we must assign a number to each matrix. Here the two linear forms $$e^1\left(\begin{pmatrix}a_{11}&a_{12}\\ a_{21}&a_{22}\end{pmatrix}\right)=a_{11}+a_{21}\text{ and } e^2\left(\begin{pmatrix}a_{11}&a_{12}\\ a_{21}&a_{22}\end{pmatrix}\right)=a_{12}+a_{22} $$ make up the dual basis; indeed $e^j(e_k)=\delta_{jk}$.
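As a quick numerical sketch (not part of the answer; the function and variable names here are mine), one can implement these two forms on 2×2 matrices stored as tuples of rows and check $e^j(e_k)=\delta_{jk}$ on the basis $\beta$:

```python
def e1(m):
    # e^1(M) = a11 + a21 (sum of the first column)
    return m[0][0] + m[1][0]

def e2(m):
    # e^2(M) = a12 + a22 (sum of the second column)
    return m[0][1] + m[1][1]

E11 = ((1, 0), (0, 0))  # first element of beta
E12 = ((0, 1), (0, 0))  # second element of beta

print(e1(E11), e1(E12))  # 1 0
print(e2(E11), e2(E12))  # 0 1
```

The forms take a matrix to a number, which is exactly what the asker's matrix multiplication fails to do.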

SECOND ANSWER

Preliminary note: all vector spaces considered are finite dimensional. The base field is $\mathbb{R}$.

It depends on what you consider the dual space of a vector space $V$. By definition, it is $V^*=\operatorname{Hom}(V,\mathbb{R})$ and, with this definition, certainly that set of matrices is not the dual basis.

On the other hand, if you have a nondegenerate bilinear map $\alpha\colon W\times V\to\mathbb{R}$, we can identify $W$ with $V^*$ in a unique way.

Nondegenerate means that

  • if $\alpha(w,v)=0$ for all $w\in W$, then $v=0$;
  • if $\alpha(w,v)=0$ for all $v\in V$, then $w=0$.

If $W=V^*$, such a nondegenerate bilinear map exists: $\alpha(f,v)=f(v)$. Suppose instead a nondegenerate bilinear map $\alpha\colon W\times V\to\mathbb{R}$ is given. Then we can define a map $$ \hat{\alpha}\colon W\to V^* \qquad \hat{\alpha}(w)\colon v\mapsto \alpha(w,v) $$ Clearly, for every $w\in W$, $\hat{\alpha}(w)$ is a linear map $V\to \mathbb{R}$, so it belongs to $V^*$. Moreover, $\hat{\alpha}\colon W\to V^*$ is easily seen to be linear.
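In code, $\hat{\alpha}$ is just currying $\alpha$ in its first argument. A minimal Python sketch (all names here are illustrative, not from the answer), using the canonical pairing $\alpha(f,v)=f(v)$ with $V=\mathbb{R}^2$:

```python
def alpha_hat(alpha, w):
    # alpha_hat(w) is the functional v |-> alpha(w, v)
    return lambda v: alpha(w, v)

# Canonical pairing on W = V*: alpha(f, v) = f(v).
canonical = lambda f, v: f(v)

f = lambda v: v[0] + 2 * v[1]   # a sample functional on R^2
g = alpha_hat(canonical, f)     # g agrees with f, as the identification predicts
print(g((3, 4)))                # 11
```

For the canonical pairing, $\hat{\alpha}$ is the identity on $V^*$, which is why that case is the trivial one.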

This map is injective: indeed, $\hat{\alpha}(w)=0$ implies $\alpha(w,v)=0$ for all $v\in V$; by nondegeneracy, $w=0$. It remains to show that $\hat{\alpha}$ is surjective, which will follow from the fact that $\dim W=\dim V$. Injectivity of $\hat{\alpha}$ implies $\dim W\le\dim V^*=\dim V$. On the other hand, the situation is symmetric, so we can prove in the same way that $\dim V\le\dim W^*=\dim W$. Hence $\dim W=\dim V$ and so $\hat{\alpha}$ is surjective.

If $\{w_1,\dots,w_n\}$ and $\{v_1,\dots,v_n\}$ are bases for $W$ and $V$, we can say that they are dual to each other if, for all $i$ and $j$, $$ \alpha(w_i,v_j)=\begin{cases} 1 & i=j\\[4px] 0 & i\ne j \end{cases} $$ It is easy to prove that, in this case, $\{\hat{\alpha}(w_1),\dots,\hat{\alpha}(w_n)\}$ is indeed the dual basis for $\{v_1,\dots,v_n\}$ in $V^*$.

Now all that is needed is to find a nondegenerate bilinear map $$ \alpha\colon\left\langle \begin{pmatrix} 1 & 0 \\ 0 & 0\end{pmatrix}, \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} \right\rangle \times \left\langle \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}, \begin{pmatrix} 0 & 0 \\ 1 & 0\end{pmatrix} \right\rangle\to \mathbb{R} $$ Define $$ \alpha(A,B)=\operatorname{trace}(AB) $$ and prove this is indeed nondegenerate on these subspaces. Also $$ \alpha\left( \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}, \begin{pmatrix} 1 & 0 \\ 0 & 0\end{pmatrix} \right)= \operatorname{trace}\begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}=1 $$ and $$ \alpha\left( \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}, \begin{pmatrix} 0 & 0 \\ 1 & 0\end{pmatrix} \right)= \operatorname{trace}\begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}=0 $$ Carry out the remaining two checks in the same way. Therefore your friend is right, but talking about the dual basis is not really correct: it is the dual basis only under that particular identification.
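The checks above can be tabulated numerically. A small sketch (mine, not from the answer; matrices as tuples of rows) computing $\alpha(A,B)=\operatorname{trace}(AB)$ over the two bases, where the resulting table should be the identity, i.e. $\alpha(w_i,v_j)=\delta_{ij}$:

```python
def matmul(a, b):
    # product of two 2x2 matrices
    return tuple(
        tuple(sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2))
        for i in range(2)
    )

def alpha(a, b):
    # alpha(A, B) = trace(AB)
    p = matmul(a, b)
    return p[0][0] + p[1][1]

beta = [((1, 0), (0, 0)), ((0, 1), (0, 0))]   # the basis from the question
dual = [((1, 0), (0, 0)), ((0, 0), (1, 0))]   # the friend's matrices

table = [[alpha(w, v) for v in dual] for w in beta]
print(table)  # [[1, 0], [0, 1]]
```

Note that the transpose variant $\operatorname{trace}(A^TB)$ would not work here: it pairs each matrix with itself, so under it $\beta$ would be its own dual.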