Can we show it without involving that $V=V^{**}$ are canonically isomorphic?


My text proves the following

Theorem. Let $V$ be a vector space over $F$ and $B=\{ v_1, \ldots , v_n \}$ a basis of $V$. Then there is exactly one basis $B^*=\{ f_1, \ldots , f_n \}$ of $V^*$ with the property $f_i(v_j)=\delta_{ij}$.
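For concreteness (an illustration of the theorem, not from the text): take $V=F^2$ with the standard basis $B=\{e_1,e_2\}$. The dual basis $B^*$ consists of the coordinate functionals
$$f_1(x_1,x_2)=x_1,\qquad f_2(x_1,x_2)=x_2,$$
and indeed $f_i(e_j)=\delta_{ij}$.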

And then gives the following

Exercise. Let $V$ be a vector space over $F$ and $B^*=\{ f_1, \ldots , f_n \}$ a basis of $V^*$. Show that there is exactly one basis $B=\{ v_1, \ldots , v_n \}$ of $V$ with the property $f_i(v_j)=\delta_{ij}$.

In other sources, I've read about $V$ and $V^{**}$ being canonically isomorphic. But my text doesn't discuss it at all. So I think it's quite unlikely that the author expects the readers to discover and use it on their own.

Can we show the exercise straight from the theorem without involving the $V=V^{**}$ business?

Here is what I've tried:

Let $\{ w_1, \ldots , w_n \}$ be a basis of $V$. By the theorem there is exactly one basis $\{ g_1, \ldots , g_n \}$ of $V^*$ with the property $g_i(w_j)=\delta_{ij}$. Let $A=(a_{kl})$ be the change-of-basis matrix, so that $f_k=a_{k1}g_1+\ldots+a_{kn}g_n$. I was hoping to define $\{ v_1, \ldots , v_n \}$ from $\{ w_1, \ldots , w_n \}$ using $A$ and then to show that $f_i(v_j)=\delta_{ij}$, but it didn't work out.
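To experiment with this idea numerically, here is a small NumPy sketch (my own illustration, assuming $F=\mathbb{R}$, $n=3$, and representing functionals as row vectors; taking the candidate $v_j$ to be the $j$-th column of $A^{-1}$ is an assumption of the sketch, not something from the text):

```python
import numpy as np

# Numerical sketch over F = R with n = 3; all concrete numbers are illustrative.
# Take w_1, w_2, w_3 = standard basis of R^3, so its dual basis g_i consists of
# the coordinate functionals, i.e. the rows of the identity matrix. A functional
# then acts on a column vector by left multiplication with its row vector.

A = np.array([[2., 1., 0.],    # f_k = sum_l a_{kl} g_l, so the rows of A
              [1., 3., 1.],    # are exactly the functionals f_1, f_2, f_3
              [0., 1., 1.]])   # (A is invertible, so they form a basis of V*)

# Candidate: v_j = sum_m (A^{-1})_{mj} w_m, i.e. the columns of A^{-1}.
V = np.linalg.inv(A)

# Check the defining property f_i(v_j) = delta_ij:
print(np.allclose(A @ V, np.eye(3)))   # True
```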

For the sake of not leaving this question unanswered, here is one possible approach. Let $H_i = {\rm span\,}\{f_1,\dotsc,\widehat{f_i},\dotsc,f_n\}$ (where $\widehat{f_i}$ means that $f_i$ is omitted). Then clearly $\dim H_i = n-1$. Now consider $V_i := \{x\in V\,|\, f(x) = 0 \text{ for all $f\in H_i$}\}$.

Lemma: Let $V$ be a finite dimensional vector space and $x\in V$. If $f(x) = 0$ for all $f\in V^*$, then $x=0$.
Proof: Assume $x\neq 0$. Then $x$ can be completed to a basis $x, x_2,\dotsc,x_n$ of $V$, and there is a (unique) linear map $g\colon V\rightarrow F$ with $g(x)=1$ and $g(x_i)=0$ for $i=2,\dotsc,n$, obtained by extending these values linearly. But then $g\in V^*$ with $g(x)=1\neq 0$, contradicting the hypothesis that $f(x)=0$ for all $f\in V^*$.

Claim: $\dim V_i = 1$ and $x\in V_i\setminus\{0\}$ implies $f_i(x)\neq 0$.
Indeed, $V_i\neq\{0\}$: the linear map $V\rightarrow F^{n-1}$, $x\mapsto (f_j(x))_{j\neq i}$, has kernel $V_i$, so $\dim V_i\geq n-(n-1)=1$. Now let $x\in V_i$ with $x\neq 0$. Then $f_i(x)\neq 0$ [since otherwise $f_j(x) = 0$ for all $j=1,\dotsc,n$ and hence $f(x) = 0$ for all $f\in V^*$, because $f_1,\dotsc,f_n$ is a basis of $V^*$; but then the lemma implies $x=0$, contradicting $x\neq 0$]. Let $y\in V_i$ be arbitrary and pick $\lambda\in F$ such that $f_i(y) = \lambda f_i(x)$ (possible since $f_i(x)\neq 0$). Then $f_j(\lambda x - y) = 0$ for all $j=1,\dotsc,n$, hence $\lambda x - y = 0$ by the lemma, i.e. $y = \lambda x$. Since $y\in V_i$ was arbitrary, $\{x\}$ is a basis of $V_i$ and $\dim V_i = 1$.

The claim shows that $f_i\big|_{V_i}\colon V_i\rightarrow F$ is an isomorphism. Let $x_i\in V_i$ be the unique element with $f_i(x_i) = 1$. Note that $f_j(x_i)=0$ for $j\neq i$, because $f_j\in H_i$ and $x_i\in V_i$. Moreover, $x_i$ is the unique element of $V$ with $f_j(x_i) = \delta_{ij}$ for all $j=1,\dotsc,n$: if $y\in V$ also has this property, then $f_j(x_i-y)=0$ for all $j$, hence $x_i - y = 0$ by the lemma.

It remains to show that $x_1,\dotsc,x_n$ is a basis of $V$.
Suppose $\sum_{i=1}^n\lambda_ix_i = 0$ for some $\lambda_i\in F$. Then $$ 0 = f_j(0) = f_j\left(\sum_{i=1}^n\lambda_ix_i\right) = \sum_{i=1}^n\lambda_if_j(x_i) = \lambda_j,$$ hence $\lambda_1=\dotsb=\lambda_n = 0$, i.e. $x_1,\dotsc,x_n$ are linearly independent. Now let $y\in V$ be arbitrary and define $\lambda_i := f_i(y)$. Then $$f_j\left(y-\sum_{i=1}^n\lambda_ix_i\right) = f_j(y) - \sum_{i=1}^n\lambda_i f_j(x_i) = \lambda_j-\lambda_j=0$$ for all $j$, hence $y - \sum_{i=1}^n\lambda_ix_i = 0$ by the lemma, i.e. $x_1,\dotsc,x_n$ spans $V$.
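The construction above can also be checked numerically. Here is a small NumPy sketch (my own illustration, assuming $F=\mathbb{R}$ and representing the functionals $f_1,\dotsc,f_n$ as the rows of a matrix; the concrete matrix and the helper `null_space` are assumptions of the sketch):

```python
import numpy as np

# Numerical sketch of the construction: V = R^3, functionals f_i = rows of Fm.
Fm = np.array([[2., 1., 0.],
               [1., 3., 1.],
               [0., 1., 1.]])   # invertible, so the rows form a basis of V*
n = Fm.shape[0]

def null_space(M):
    # one-dimensional kernel of an (n-1) x n matrix of full row rank:
    # the last right-singular vector of the SVD spans it
    _, _, vt = np.linalg.svd(M)
    return vt[-1]

X = np.zeros((n, n))
for i in range(n):
    H_i = np.delete(Fm, i, axis=0)   # rows spanning H_i (f_i omitted)
    x = null_space(H_i)              # a nonzero element of V_i
    x = x / (Fm[i] @ x)              # normalize so that f_i(x_i) = 1
    X[:, i] = x                      # x_i as the i-th column

# Check f_j(x_i) = delta_ij:
print(np.allclose(Fm @ X, np.eye(n)))   # True
```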