Change of basis question from an exercise about finite-dimensional Banach spaces and their duals


The exercise in question is from Hunter's Applied Analysis, specifically exercise 5.2.

We are given an $n$-dimensional space $X$, with bases $\{e_1,\ldots,e_n\}$ and $\{\tilde{e}_1,\ldots,\tilde{e}_n\}$. The bases are related by a matrix $(L_{ij})$ with inverse $(\tilde{L}_{ij})$, such that $$ \tilde{e}_i = \sum_{j=1}^n L_{ij}e_j;\qquad e_i = \sum_{j=1}^n \tilde{L}_{ij}\tilde{e}_j;\qquad \delta_{ik} = \sum_{j=1}^n L_{ij}\tilde{L}_{jk}, $$ where $\delta_{ik}$ is the Kronecker delta.
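These three relations are easy to sanity-check numerically. Here is a small sketch of my own (not from Hunter's text), using NumPy with the standard basis of $\mathbb{R}^n$ playing the role of $\{e_i\}$ and a random invertible matrix as $(L_{ij})$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3

# Rows of E are the basis vectors e_i (here: the standard basis).
E = np.eye(n)

# A random matrix is almost surely invertible; it plays the role of (L_ij).
L = rng.standard_normal((n, n))
Lt = np.linalg.inv(L)          # (L~_ij), the inverse matrix

# e~_i = sum_j L_ij e_j  ->  rows of E_tilde are the new basis vectors
E_tilde = L @ E

# e_i = sum_j L~_ij e~_j recovers the original basis
assert np.allclose(Lt @ E_tilde, E)

# delta_ik = sum_j L_ij L~_jk, i.e. L L~ = I
assert np.allclose(L @ Lt, np.eye(n))
```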

Additionally we are given the dual bases $\{\omega_1,\ldots,\omega_n\}$ and $\{\tilde{\omega}_1,\ldots,\tilde{\omega}_n\}$ of $X^*$.

To prove:

a) If $x=\sum x_ie_i = \sum\tilde{x}_i\tilde{e}_i\in X$, then $$\tilde{x}_i = \tilde{L}_{ij}x_j.$$

b) If $\phi=\sum\phi_i\omega_i = \sum\tilde{\phi}_i\tilde{\omega}_i\in X^*$, then $$\tilde{\phi}_i=L_{ji}\phi_j.$$

I am still stuck on a) (and would not appreciate answers for b) just yet, though perhaps small hints), and what I managed to derive is that $$ \sum_{i=1}^nx_ie_i = \sum_{i=1}^nx_i\sum_{j=1}^n\tilde{L}_{ij}\tilde{e}_j = \sum_{j=1}^n\tilde{e}_j\sum_{i=1}^n\tilde{L}_{ij}x_i=\sum_{j=1}^n\tilde{e}_j\tilde{x}_j, $$ which implies that $\tilde{x}_i = \sum_{j=1}^n\tilde{L}_{ji}x_j$. This should make sense, and every textbook and online source I searched led me to a similar answer, which contradicts a). Notice that even the indices are switched, and nothing is stated about $j$ in the question. Am I missing something here, or is there simply an error in the exercise?
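To convince myself, I also checked my formula $\tilde{x}_i = \sum_j \tilde{L}_{ji}x_j$ (i.e. $\tilde{x} = \tilde{L}^T x$ in matrix form) numerically, with a random invertible matrix of my own choosing; the transposed index order reproduces the same vector, while the index order as printed in the book generally does not:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
L = rng.standard_normal((n, n))   # random invertible change-of-basis matrix
Lt = np.linalg.inv(L)

E = np.eye(n)                     # rows are e_i (standard basis)
E_tilde = L @ E                   # rows are e~_i

x = rng.standard_normal(n)        # coefficients x_i in the basis {e_i}
v = x @ E                         # the vector itself

# Derived formula: x~_i = sum_j L~_ji x_j, i.e. x~ = L~^T x
x_tilde = Lt.T @ x
assert np.allclose(x_tilde @ E_tilde, v)   # same vector, expressed in {e~_i}

# The book's index order, x~_i = sum_j L~_ij x_j (i.e. x~ = L~ x),
# does not reproduce v for a generic (non-symmetric) L.
x_book = Lt @ x
assert not np.allclose(x_book @ E_tilde, v)
```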

Best answer:

I don't think you're missing anything here, and I think Kavi is right... or the book made an error.

In particular, you can prove that the formula $\tilde{x}_j=\tilde{L}_{ij}x_j$, read literally without a summation, does not necessarily follow from the assumptions.

Your derivation shows that $$\sum_{j=1}^n \tilde{x}_j\tilde{e}_j=\sum_{j=1}^n \tilde{e}_j\left[ \sum^n_{i=1} \tilde{L}_{ij} x_i \right],$$ hence $\tilde{x}_j=\sum^n_{i=1} \tilde{L}_{ij} x_i$. So requiring $\tilde{x}_j=\tilde{L}_{ij}x_j$ (no summation) for all $i$ and $j$ would force $\tilde{x}_j=0$ for every $j$, hence $x=0$.

But we have the simple counterexample of $X=\mathbb{R}$, $e_1=\tilde{e}_1=1$, $L=[1]$, and $x=1$, for which $\tilde{x}_1=1\neq 0$, showing that this doesn't necessarily happen.

I think it's safe to assume, as Kavi did, that the book is using the Einstein summation convention.
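For a minimal illustration of that reading (my own hypothetical data, not Hunter's), `np.einsum` makes the summation over the repeated index explicit:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3
L = rng.standard_normal((n, n))   # random invertible change-of-basis matrix
Lt = np.linalg.inv(L)
x = rng.standard_normal(n)

# Einstein convention: x~_i = L~_ji x_j means "sum over the repeated index j",
# which is exactly the transpose action x~ = L~^T x.
x_tilde = np.einsum('ji,j->i', Lt, x)
assert np.allclose(x_tilde, Lt.T @ x)
```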

I'm going to respect your wishes and not give an answer (or a hint) for part (b).