Is every basis of a finite-dimensional vector space orthonormal with respect to some inner product?


Given a real or complex vector space $V$ and a (finite) basis $B$ of it, does there always exist an inner product on $V$ such that $B$ is an orthonormal basis with respect to it?

The question is equivalent to asking: is there always a positive definite symmetric (Hermitian, in the complex case) matrix $A$ such that if $B=\{v_1,\ldots,v_n\}$, then

$$v_i^tAv_j=\delta_{ij}\;?$$


There are 4 solutions below.


Let's do this for the real case. A vector $x$ can be written as $$ x=\sum_i x^i v_i $$ where $x^i$ are real numbers. Then for another vector $y=\sum_i y^iv_i$ we can define an inner product via the formula

$$ \langle x,y\rangle=\sum_i x^i y^i. $$ Equivalently, one may define the product only on the basis vectors, by $\langle v_i, v_j\rangle = \delta_{ij}$, and extend it bilinearly.
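A minimal numerical sketch of this construction, for a hypothetical basis of $\mathbb{R}^2$: the coordinates $x^i$ of a vector in the basis are obtained by solving a linear system, and the product of two vectors is the dot product of their coordinate vectors.

```python
import numpy as np

# Hypothetical basis of R^2: the columns of P (any invertible P works).
P = np.array([[1.0, 1.0],
              [0.0, 2.0]])

def inner(x, y):
    """<x, y> = sum_i x^i y^i, where x^i, y^i are coordinates in the basis."""
    cx = np.linalg.solve(P, x)  # coordinates of x: solve P @ cx = x
    cy = np.linalg.solve(P, y)
    return cx @ cy

v1, v2 = P[:, 0], P[:, 1]
assert np.isclose(inner(v1, v1), 1.0)  # <v_1, v_1> = 1
assert np.isclose(inner(v1, v2), 0.0)  # <v_1, v_2> = 0
assert np.isclose(inner(v2, v2), 1.0)  # <v_2, v_2> = 1
```

The basis vectors come out orthonormal by construction, since their coordinate vectors are the standard unit vectors.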


Let $\mathbb{K}$ be $\mathbb{R}$ or $\mathbb{C}$. On the $\mathbb{K}$-linear space $V$, an inner product is a form $\varphi \, : \, V \times V \, \longrightarrow \, \mathbb{K}$ which is symmetric positive definite and bilinear when $\mathbb{K}=\mathbb{R}$ (Hermitian positive definite and sesquilinear when $\mathbb{K}=\mathbb{C}$). For simplicity I treat the bilinear case; the sesquilinear case is analogous. Let $B = \big( \varepsilon_{1},\ldots, \varepsilon_{n}\big)$ be a basis of $V$ and let $(x,y) \in V^{2}$. We can write:

$$ x = \sum_{i=1}^{n} x_{i} \varepsilon_{i} \quad \mathrm{and} \quad y=\sum_{j=1}^{n} y_{j}\varepsilon_{j} $$

where $(x_{1},\ldots,x_{n}) \in \mathbb{K}^{n}$ and $(y_{1},\ldots,y_{n}) \in \mathbb{K}^{n}$. By bilinearity of $\varphi$, we have:

$$ \begin{align*} \varphi(x,y) &= {} \varphi \Big( \sum_{i=1}^{n} x_{i} \varepsilon_{i} \, ,\sum_{j=1}^{n} y_{j}\varepsilon_{j} \Big) \\[1mm] &= \sum_{i,j=1}^{n} x_{i}y_{j} \varphi \big( \varepsilon_{i},\varepsilon_{j} \big) \end{align*} $$

As a consequence, in order to define $\varphi$, it suffices to specify the values $\varphi(\varepsilon_{i},\varepsilon_{j})$ for all $i$ and $j$. Choosing $\varphi(\varepsilon_{i},\varepsilon_{j}) = \delta_{ij}$ for all $i,j$ gives $\varphi(x,x)=\sum_{i=1}^{n} x_{i}^{2} > 0$ for $x \neq 0$, so you obtain an inner product on $V$ for which $B$ is an orthonormal basis.
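The same choice works over $\Bbb C$ if the form is extended sesquilinearly, i.e. with a conjugate on the second coordinate. A sketch, using a hypothetical basis of $\Bbb C^2$ (the columns of `P`):

```python
import numpy as np

# Hypothetical basis eps_1, eps_2 of C^2: the columns of P.
P = np.array([[1.0 + 0j, 1j],
              [0.0 + 0j, 2.0 + 0j]])

def phi(x, y):
    """phi(x, y) = sum_i c_x[i] * conj(c_y[i]), with c_x, c_y the
    coordinates of x, y in the basis (sesquilinear extension of
    phi(eps_i, eps_j) = delta_ij)."""
    cx = np.linalg.solve(P, x)
    cy = np.linalg.solve(P, y)
    return cx @ np.conj(cy)

eps1, eps2 = P[:, 0], P[:, 1]
assert np.isclose(phi(eps1, eps1), 1.0)  # orthonormal by construction
assert np.isclose(phi(eps1, eps2), 0.0)

x = np.array([3.0 - 1j, 2.0 + 2j])
assert phi(x, x).real > 0                # positive definiteness
assert np.isclose(phi(x, x).imag, 0.0)   # phi(x, x) is real
```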


I use your equivalent formulation, i.e. finding a symmetric matrix $A$ such that if $B=\{v_1,\ldots,v_n\}$, then

$$v_i^tAv_j=\delta_{ij}.$$

We know $A$ has $(n^{2}+n)/2$ unknown parameters (the same as the number of equations we have), and it is easy to check that if such an $A$ exists, it is automatically positive definite. So it suffices to show that this system has a solution. I verify it (by direct calculation rather than a full proof) for $V=\mathbb{R}^{2}$, and I think it can be proven for other dimensions similarly.

Suppose $v_1=[a_1,a_2]^t,v_2=[b_1,b_2]^t,\mathbf{A}=\left[\begin{array}{*{20}{c}} {x}_{1}&{x}_{2}\\ {x}_{2}&{x}_{3} \end{array}\right]$

then we obtain
$$\left[\begin{array}{*{20}{c}} {a}_{1}^2&2{a}_{1}{a}_{2}&{a}_{2}^2\\ {a}_{1}{b}_{1}&{a}_{1}{b}_{2}+{a}_{2}{b}_{1}&{a}_{2}{b}_{2}\\{b}_{1}^2&2{b}_{1}{b}_{2}&{b}_{2}^2 \end{array}\right]\left[\begin{array}{*{20}{c}} {x}_{1}\\ {x}_{2}\\{x}_{3} \end{array}\right]=\left[\begin{array}{*{20}{c}} 1\\ 0\\1 \end{array}\right]$$

It is not difficult to show that the left $3 \times 3$ matrix is invertible (because $v_1$ and $v_2$ are linearly independent), so the system always has exactly one solution. The only thing that remains is to show that in dimension $n$ the corresponding $(n^{2}+n)/2 \times (n^{2}+n)/2$ matrix is invertible.
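A numerical sketch of this calculation for a concrete (hypothetical) choice of $v_1, v_2$: build the $3 \times 3$ system above, solve for $x_1, x_2, x_3$, and check the orthonormality conditions. As a sanity check, note that the condition $v_i^t A v_j = \delta_{ij}$ says $P^t A P = I$ for $P = [v_1 \; v_2]$, i.e. $A = (PP^t)^{-1}$, and the solved $A$ agrees with that closed form.

```python
import numpy as np

# Hypothetical basis of R^2 (v_1, v_2 linearly independent).
a1, a2 = 1.0, 0.0   # v_1
b1, b2 = 1.0, 2.0   # v_2

# The 3x3 system from the answer above: M @ (x1, x2, x3) = (1, 0, 1).
M = np.array([[a1**2, 2*a1*a2,       a2**2],
              [a1*b1, a1*b2 + a2*b1, a2*b2],
              [b1**2, 2*b1*b2,       b2**2]])
x1, x2, x3 = np.linalg.solve(M, np.array([1.0, 0.0, 1.0]))
A = np.array([[x1, x2],
              [x2, x3]])

v1, v2 = np.array([a1, a2]), np.array([b1, b2])
assert np.isclose(v1 @ A @ v1, 1.0)  # v_1^t A v_1 = 1
assert np.isclose(v1 @ A @ v2, 0.0)  # v_1^t A v_2 = 0
assert np.isclose(v2 @ A @ v2, 1.0)  # v_2^t A v_2 = 1

# Closed-form check: P^t A P = I  <=>  A = (P P^t)^{-1}.
P = np.column_stack([v1, v2])
assert np.allclose(A, np.linalg.inv(P @ P.T))
```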


Another answer with a slightly different approach: what you're asking about here is a statement on bilinear forms (or, in the complex case, sesquilinear forms). For convenience, I'll stick to the complex case, but all of this can, with care, be handled in the case of a real vector space.

Note that an inner product $(\cdot,\cdot):V \times V \to \Bbb C$ on a complex vector space $V$ is simply a sesquilinear form with the additional properties that:

  • $(x,x) \geq 0$ for all $x$
  • $(x,y) = \overline{(y,x)}$ (over $\Bbb C$ this follows from the first property; in the real case, $(x,y) = (y,x)$ must be assumed)
  • $(x,x) = 0 \iff x = 0$

From there, we use the fact that every sesquilinear form can be written as $$ (x,y) = x^* Ay $$ for some complex matrix $A$, whose entries are $$ a_{ij} = (e_i, e_j). $$ By the above, the three requirements on our sesquilinear form are precisely the statement that $A$ is (Hermitian) positive definite.
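A quick numerical check of those three properties for the form $(x,y) = x^*Ay$, using a hypothetical Hermitian positive definite $A = B^*B$ with $B$ invertible:

```python
import numpy as np

rng = np.random.default_rng(0)

# A = B^* B is Hermitian and (for invertible B) positive definite.
B = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
A = B.conj().T @ B

def form(x, y):
    return np.conj(x) @ A @ y  # (x, y) = x^* A y

x = rng.standard_normal(3) + 1j * rng.standard_normal(3)
y = rng.standard_normal(3) + 1j * rng.standard_normal(3)

# (x, x) >= 0 and real, since x^* B^* B x = |Bx|^2.
assert form(x, x).real > 0 and np.isclose(form(x, x).imag, 0.0)
# Conjugate symmetry: (x, y) = conj((y, x)).
assert np.isclose(form(x, y), np.conj(form(y, x)))
# (x, x) = 0 only at x = 0.
assert np.isclose(form(np.zeros(3), np.zeros(3)), 0.0)
```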