Let $V$ be a finite-dimensional vector space and $T:V\to V^{*}$ an isomorphism of vector spaces. Can we say that there exists a basis of $V$ such that $T$ maps that basis to its dual basis? I have no idea whether the answer is positive or negative. I didn't find this question in any book; it just came to my mind suddenly, and I couldn't find any explanation. Any help will really be appreciated.
A question on isomorphism between a vector space and its dual space
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail). There are 2 solutions below.
$\def\diag{\operatorname{diag}}$ As observed previously, there are two reasonable interpretations of the original question. Let $V$ be an $n$--dimensional vector space over a field $k$. For arbitrary invertible $T: V \to V^*$ is it true that there is a basis $\{v_i\}$ of $V$ such that, with $\{v_i^*\}$ the dual basis of $V^*$,
(a) $T(v_i) = v_i^*$ for all $i$, or such that
(b) $(T(v_1), T(v_2), \dots, T(v_n))$ is a permutation of $(v_1^*, v_2^*, \dots, v_n^*)$?
The answer to both questions turns out to be "no" in general, so that takes care of the question as originally posed. One might also ask for necessary and sufficient conditions for the existence of such a basis. Here one can give a pretty satisfactory answer for (a), but I have not succeeded in doing so for (b).
Let's identify $V^*$ with $V$, by choosing some ordered basis of $V$, say $(b_1, \dots, b_n)$ and defining the standard non-degenerate bilinear form $\langle \sum_i \alpha_i b_i, \sum_j \beta_j b_j\rangle = \sum_i \alpha_i \beta_i$. Using the basis, identify both $V$ and $V^*$ with $k^n$, with the standard basis $(e_1, \dots, e_n)$, and the pairing of $V$ and $V^*$ given by $\langle x, y \rangle = \sum x_i y_i$. Then $T : V \to V^*$ can be identified with the matrix of $T$ with respect to the standard basis, $T = (\langle T(e_i), e_j \rangle)_{i, j}$. We have a new non-degenerate bilinear form on $V$ defined using $T$, namely $(( x, y)) = \langle T(x), y \rangle$. Now our question concerns possible conditions on $k$ and $T$ for the existence of a basis $(v_1, \dots, v_n)$ of $V$ such that
(a) $((v_i, v_j)) = \langle T(v_i), v_j \rangle = \delta(i, j)$, or such that
(b) the matrix $((v_i, v_j)) = \langle T(v_i), v_j \rangle$ is a permutation matrix.
Let's address the first question (a). The condition (a) requires that the bilinear form $((\ , \ ))$ be symmetric, or equivalently that the matrix $T$ be symmetric. So this is a necessary condition. Now we are looking for an orthonormal basis $(v_1, \dots, v_n)$ for this symmetric bilinear form. So question (a) translates into: does an arbitrary non-degenerate symmetric bilinear form on $k^n$ admit an orthonormal basis? The answer is "yes" if $k$ is quadratically closed, that is, if every non-zero element has a square root, because a variant of Gram-Schmidt works, but it requires taking square roots. The answer is "no" in general. For example, as is well known, non-degenerate symmetric bilinear forms on $\mathbb R^n$ are classified by signature (Sylvester's law of inertia). The classification of non-degenerate symmetric bilinear forms over arbitrary fields is very complicated.
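The Gram-Schmidt variant mentioned above can be sketched concretely over the quadratically closed field $\mathbb C$. Here is a minimal numpy illustration (the function name `orthonormal_basis` and the tolerance handling are my own, not part of the original answer): given a non-degenerate symmetric complex matrix $T$, it produces $U$ with $U^t T U = I$, taking complex square roots where needed.

```python
import numpy as np

def orthonormal_basis(T, tol=1e-9):
    """For a non-degenerate symmetric complex T, so B(x, y) = x^T T y is a
    symmetric bilinear form on C^n, return U whose columns v_i satisfy
    B(v_i, v_j) = delta(i, j), i.e. U.T @ T @ U = I.  Relies on C being
    quadratically closed: every B(x, x) has a complex square root."""
    T = np.asarray(T, dtype=complex)
    n = T.shape[0]
    B = lambda x, y: x @ T @ y              # bilinear, NOT Hermitian: no conjugation
    work = list(np.eye(n, dtype=complex))   # basis of the remaining complement
    vs = []
    for _ in range(n):
        # find an anisotropic vector x, i.e. one with B(x, x) != 0
        x = None
        for i, w in enumerate(work):
            if abs(B(w, w)) > tol:
                x = work.pop(i)
                break
        if x is None:
            # every remaining vector is isotropic; by non-degeneracy some pair
            # has B(u, w) != 0, and then B(u + w, u + w) = 2 B(u, w) != 0
            for i in range(len(work)):
                for j in range(i + 1, len(work)):
                    if abs(B(work[i], work[j])) > tol:
                        u, w = work[i], work[j]
                        x = u + w
                        del work[j], work[i]
                        work.append(u - w)  # span{u, w} = span{u + w, u - w}
                        break
                if x is not None:
                    break
        v = x / np.sqrt(B(x, x))            # complex square root always exists
        vs.append(v)
        # project the rest onto the B-orthogonal complement of v
        work = [w - B(w, v) * v for w in work]
    return np.column_stack(vs)

T1 = np.array([[0, 1], [1, 0]])             # all standard basis vectors isotropic
U1 = orthonormal_basis(T1)
assert np.allclose(U1.T @ T1 @ U1, np.eye(2))
```

Note that the same matrix $\operatorname{diag}(1, -1)$, which has no orthonormal basis over $\mathbb R$, acquires one over $\mathbb C$ because $\sqrt{-1}$ is available.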
To summarize: the answer to (a) is "no" if the bilinear form $((\ , \ ))$, or equivalently the matrix $T$, is not symmetric. If the bilinear form is symmetric and the field is quadratically closed, then the answer to (a) is "yes". If the bilinear form is symmetric and the field is not quadratically closed, the answer to (a) is "no" in general.
It is also instructive to express conditions for (a) to be valid as conditions on $T$. Let $(v_1, \dots, v_n)$ be an ordered basis of $V$ and let $U$ be the matrix whose columns are $v_1, v_2, \dots, v_n$, so $v_i = U e_i$. Then condition (a) is $(\langle T U e_i, U e_j\rangle)_{i, j} = (\delta(i, j))_{i, j}$, or $U^t T U = I$. Thus (a) holds if and only if there is an invertible $W = U^{-1}$ such that $T = W^t W$. In case the field $k$ is the field $\mathbb R$ of real numbers, the condition is that $T$ is positive definite.
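Over $\mathbb R$ with $T$ positive definite, a factorization $T = W^t W$ can be produced concretely by a Cholesky decomposition; the columns of $U = W^{-1}$ are then the desired basis. A quick numpy check (my own illustration, not part of the original answer):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
T = A @ A.T + 3 * np.eye(3)    # symmetric positive definite by construction

L = np.linalg.cholesky(T)      # lower triangular with T = L @ L.T
W = L.T                        # so T = W.T @ W
U = np.linalg.inv(W)           # columns of U form the basis sought in (a)

assert np.allclose(U.T @ T @ U, np.eye(3))
```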
Now let's consider question (b). Supposing (a) fails, when does (b) still hold? I don't have much to say about this. Suppose (b) holds for some basis $(v_1, \dots, v_n)$ and some non-identity permutation matrix $\pi$, and let $U$ be the matrix whose columns are $v_1, v_2, \dots, v_n$, so $v_i = U e_i$. Then the condition in (b) is $(\langle T U e_i, U e_j\rangle)_{i, j} = \pi$, or $U^t T U = \pi$. Thus (b) holds if and only if there is an invertible matrix $W = U^{-1}$ and a permutation matrix $\pi$ such that $T = W^t \pi W$. This is not an especially satisfying answer. No doubt one can do better.
{\bf An example and a counterexample for (b)}. Let $k = \mathbb R$. Let $v_1 = (1/\sqrt{2}) \begin{pmatrix}1\\1\end{pmatrix}$, and $v_2 = (1/\sqrt{2}) \begin{pmatrix}1\\-1\end{pmatrix}$. Then $(v_1, v_2)$ is an ordered orthonormal basis of $\mathbb R^2$ with respect to the standard bilinear form $\langle x, y \rangle = x_1 y_1 + x_2 y_2$, and the matrix $U = (v_1, v_2)$ is real symmetric and orthogonal, $U = U^t = U^{-1}$.
Let $T = \diag(1, -1)$. Then $T v_1 = v_2$ and $T v_2 = v_1$, so $U^t T U = \pi = \begin{pmatrix}0 & 1\\ 1 & 0\end{pmatrix}$. Thus $T$ satisfies (b) with the non-trivial permutation $\pi$. Note in particular that $\pi$ is symmetric but neither positive nor negative definite (it has eigenvalues $1, -1$).
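This example is easy to verify numerically; a short numpy check (my own illustration):

```python
import numpy as np

# the rotated orthonormal basis from the example
v1 = np.array([1.0, 1.0]) / np.sqrt(2)
v2 = np.array([1.0, -1.0]) / np.sqrt(2)
U = np.column_stack([v1, v2])          # U = U.T = U^{-1}: symmetric orthogonal

T = np.diag([1.0, -1.0])
pi = np.array([[0.0, 1.0], [1.0, 0.0]])

# the Gram matrix ((v_i, v_j)) = <T v_i, v_j> is the swap permutation
assert np.allclose(U.T @ T @ U, pi)

# equivalently, with W = U^{-1} = U, we recover T = W.T @ pi @ W
assert np.allclose(T, U.T @ pi @ U)
```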
Now let $S = - I$, minus the identity matrix. Can $S$ satisfy (b) over the real numbers? The answer is "no", for the following reason. $S$ is not positive definite, so it cannot satisfy (a). If it satisfies (b), it must do so with the only non-identity $2 \times 2$ permutation matrix, namely $\pi = \begin{pmatrix}0 & 1\\ 1 & 0\end{pmatrix}$. That is, there must be some invertible matrix $W$ such that $\pi = W^t S W = - W^t W$. But this says that $\pi$ is negative definite, a contradiction.
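The obstruction can also be illustrated numerically (a spot check on random samples, not a proof): $\pi$ has a positive eigenvalue, while $-W^t W$ has only negative eigenvalues for every invertible $W$, so the two can never be equal.

```python
import numpy as np

pi = np.array([[0.0, 1.0], [1.0, 0.0]])
assert np.linalg.eigvalsh(pi).max() > 0        # pi has eigenvalue +1

# -W.T @ W is negative definite whenever W is invertible:
rng = np.random.default_rng(1)
for _ in range(1000):
    W = rng.standard_normal((2, 2))
    if abs(np.linalg.det(W)) > 1e-6:
        assert np.linalg.eigvalsh(-W.T @ W).max() < 0
```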
Thus the answer to question (b) is "no" in general.
Please check your understanding of the problem: the asker starts with a given isomorphism and asks for a basis $\{x_i\}$ such that $\{Tx_i\}$ is its dual basis.
The answer is yes. The map $T$ defines an inner product, $$(x,y)=(Ty)(x).$$ So all that is required is an orthonormal basis for this inner product, a well-known construction.