Two inner products being equal up to a scalar


I would appreciate a hint on the following problem:

Let $V$ be a finite-dimensional vector space over $F$, and let $\langle\cdot,\cdot\rangle_1$ and $\langle\cdot,\cdot\rangle_2$ be two inner products on $V$ such that $$ \forall \ w,v \in V \ \Big(\langle v,w\rangle_1=0 \implies \langle v,w\rangle_2=0\Big). $$ Show that $$ \exists \ c \in F \ \ \forall \ w,v \in V \ \Big(\langle v,w\rangle_1=c \langle v,w\rangle_2\Big). $$
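
To get a feel for the statement, consider the toy case $V=\mathbb{R}^2$ with $\langle x,y\rangle_1=x^{\mathsf T}y$ and $\langle x,y\rangle_2=x^{\mathsf T}Ay$ for some symmetric positive definite matrix $A$ (just a concrete instance, not the general setting): the claim is then that the hypothesis forces $A$ to be a positive multiple of the identity matrix.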

I have tried choosing an orthonormal basis with respect to $\langle \cdot,\cdot\rangle_1$, hoping that the matrices representing the two inner products would differ by a constant factor, but it seems to lead nowhere.


Accepted answer:

Since you are in finite dimension, you can do this by induction on the dimension.

In dimension $\leq 1$ the inner product structures are easy enough to classify as a standard inner product multiplied by some real $c>0$, giving the result. In dimension $>1$ you can fix any nonzero vector $v$, and induction will give you the result for its orthogonal complement $H$, which is the same hyperplane for both inner product structures: by hypothesis the $1$-orthogonal complement of $v$ is contained in its $2$-orthogonal complement, and both have codimension $1$, so they coincide. In particular induction gives a (positive real) constant $c$ valid on $H$.

Choosing a vector $h\in H$ with $\langle h,h\rangle_1=\langle v,v\rangle_1$ (which is easily found), one has $$ \langle h+v,h-v\rangle_1=\langle h,h\rangle_1-\langle v,v\rangle_1=0, $$ so using the hypothesis also $$ 0= \langle h+v,h-v\rangle_2 = \langle h,h\rangle_2-\langle v,v\rangle_2, $$ and therefore $\langle v,v\rangle_2=\langle h,h\rangle_2=c\langle h,h\rangle_1=c\langle v,v\rangle_1$. Now complete by sesquilinearity.
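
To spell out that last step (assuming, say, the convention that the inner products are linear in the first argument and conjugate-linear in the second): any $x,y\in V$ decompose as $x=h+\lambda v$ and $y=h'+\mu v$ with $h,h'\in H$ and scalars $\lambda,\mu$, and since $h,h'\perp v$ for both inner products, $$ \langle x,y\rangle_2=\langle h,h'\rangle_2+\lambda\overline{\mu}\,\langle v,v\rangle_2=c\langle h,h'\rangle_1+c\,\lambda\overline{\mu}\,\langle v,v\rangle_1=c\langle x,y\rangle_1. $$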

By the way, note that $\langle x+y,x-y\rangle=\langle x,x\rangle-\langle y,y\rangle$ does not hold in general for complex (so conjugate-symmetric) inner products, but it holds for $x=h$ and $y=v$ since $h\perp v$.
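
Indeed, with the same convention, $$ \langle x+y,x-y\rangle=\langle x,x\rangle-\langle x,y\rangle+\langle y,x\rangle-\langle y,y\rangle, $$ and the two cross terms add up to $\langle y,x\rangle-\langle x,y\rangle=2i\operatorname{Im}\langle y,x\rangle$, which need not vanish in general but does vanish when $x\perp y$.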

Another answer:

Let $\mathcal{B} := \{ v_1, v_2, \dots, v_n \}$ be a $1$-orthonormal basis for $V$, i.e., $\mathcal{B}$ is orthonormal with respect to $\langle \cdot, \cdot \rangle_1$. By the hypothesis, it is also $2$-orthogonal (but not necessarily $2$-normalized).

Now, consider two arbitrary vectors

$$x := \sum_{i=1}^n \alpha_i v_i, \quad y := \sum_{i=1}^n \beta_i v_i.$$

We compute $\langle x, y \rangle_1$ and $\langle x, y \rangle_2$ (working over $F = \mathbb{R}$; over $\mathbb{C}$ the same computation goes through with each $\beta_i$ replaced by $\overline{\beta_i}$):

\begin{align*} \langle x, y \rangle_1 &= \left\langle \sum_{i=1}^n \alpha_i v_i, \sum_{i=1}^n \beta_i v_i \right\rangle_1 = \sum_{i=1}^n \alpha_i \beta_i \langle v_i, v_i \rangle_1 = \sum_{i=1}^n \alpha_i \beta_i, \\ \langle x, y \rangle_2 &= \left\langle \sum_{i=1}^n \alpha_i v_i, \sum_{i=1}^n \beta_i v_i \right\rangle_2 = \sum_{i=1}^n \alpha_i \beta_i \langle v_i, v_i \rangle_2. \end{align*}

Let us now fix $u := \sum_{i=1}^n v_i$ and consider the vectors

$$w_{ij} := v_i - v_j = \sum_{k=1}^n \beta^{(ij)}_k v_k, \quad \text{for $i < j$},$$

where

$$\beta^{(ij)}_k = \begin{cases} 1, & k = i, \\ -1, & k = j, \\ 0, & \text{otherwise}. \end{cases}$$

Obviously, for this choice of $u$, we have $\alpha_k = 1$ for all $k$. Notice that $\langle u, w_{ij} \rangle_1 = 1 - 1 = 0$ for all $i < j$. Therefore,

$$\langle u, w_{ij} \rangle_2 = 0$$

for all $i < j$. Let us expand this last equality:

$$0 = \langle u, w_{ij} \rangle_2 = \sum_{k=1}^n \alpha_k \beta^{(ij)}_k \langle v_k, v_k \rangle_2 = \langle v_i, v_i \rangle_2 - \langle v_j, v_j \rangle_2.$$

In other words, for all $i, j$ (we took $i < j$, but by symmetry that is irrelevant), we have

$$\langle v_i, v_i \rangle_2 = \langle v_j, v_j \rangle_2,$$

which means that $c := \langle v_k, v_k \rangle_2$ is a well-defined constant, independent of the choice of $k \in \{1,2,\dots,n\}$. Therefore,

$$\langle x, y \rangle_2 = \sum_{i=1}^n \alpha_i \beta_i \langle v_i, v_i \rangle_2 = \sum_{i=1}^n \alpha_i \beta_i c = c \sum_{i=1}^n \alpha_i \beta_i = c \langle x, y \rangle_1.$$
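
As a quick numerical sanity check of this argument (a minimal sketch, assuming $F=\mathbb{R}$, that the basis above is the standard basis of $\mathbb{R}^n$, and that $\langle x,y\rangle_1=x^{\mathsf T}y$ and $\langle x,y\rangle_2=x^{\mathsf T}Dy$ for a diagonal positive definite $D$; all names below are only for this illustration):

```python
import numpy as np

# Sketch: <x,y>_1 = x.y and <x,y>_2 = x.D.y with D diagonal positive definite,
# so the standard basis is 1-orthonormal and 2-orthogonal, as in the answer.
n = 4
D = np.diag([2.0, 5.0, 5.0, 5.0])        # the <v_i, v_i>_2 are not all equal

u = np.ones(n)                            # u = v_1 + ... + v_n
w12 = np.array([1.0, -1.0, 0.0, 0.0])     # w_12 = v_1 - v_2

print(u @ w12)       # 0.0  -> u and w_12 are 1-orthogonal
print(u @ D @ w12)   # 2.0 - 5.0 = -3.0  -> not 2-orthogonal: hypothesis fails

# So the hypothesis forces all the <v_i, v_i>_2 to coincide (D = c * I),
# and then <x,y>_2 = c * <x,y>_1 for every pair x, y.
```

In other words, the only way the implication can hold for these particular vectors $u$ and $w_{ij}$ is that the diagonal entries of $D$ are all equal, which is exactly the conclusion above.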

Another answer:

Presumably $F=\mathbb{R}$ or $\mathbb{C}$. For every nonzero vector $v\in V$, define $c_v=\dfrac{\langle v,v\rangle_2}{\langle v,v\rangle_1}$. Now, for any $w\in V$ and any nonzero $v\in V$, let $x=w-\dfrac{\langle w,v\rangle_1}{\langle v,v\rangle_1}v$. Then $\langle x,v\rangle_1=0$ (here we adopt the convention that the inner product is linear in the first argument). Hence $0=\langle x,v\rangle_2$, and in turn $$\langle w,v\rangle_2\equiv c_v\langle w,v\rangle_1\tag{1}$$ for all nonzero $v,w\in V$.

Therefore $$ c_v\langle w,v\rangle_1=\langle w,v\rangle_2=\overline{\langle v,w\rangle_2}=\overline{c_w\langle v,w\rangle_1}=c_w\langle w,v\rangle_1\quad\forall v,w\neq0.\tag{2} $$

Now, for any $v,w\neq0$, there exists some $y\in\{w+tv:t\in\mathbb{R}\}$ such that $\langle w,y\rangle_1,\ \langle y,v\rangle_1\ne0$. By $(2)$, we have $c_v\langle y,v\rangle_1=c_y\langle y,v\rangle_1$ and $c_y\langle w,y\rangle_1=c_w\langle w,y\rangle_1$. Hence $c_v=c_y=c_w$ for all $v,w\neq0$, i.e. all the $c_v$'s are equal to some common constant $c>0$. So the result follows from $(1)$.
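
To make two of the steps explicit: $(1)$ comes from expanding $$ 0=\langle x,v\rangle_2=\langle w,v\rangle_2-\frac{\langle w,v\rangle_1}{\langle v,v\rangle_1}\langle v,v\rangle_2=\langle w,v\rangle_2-c_v\langle w,v\rangle_1, $$ and a suitable $y$ exists because $\langle w,w+tv\rangle_1=\langle w,w\rangle_1+t\langle w,v\rangle_1$ and $\langle w+tv,v\rangle_1=\langle w,v\rangle_1+t\langle v,v\rangle_1$ are affine in the real parameter $t$ and each vanishes for at most one value of $t$, so all but at most two choices of $t$ work.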