Inner Products on $\mathbb C^d$ Are Equivalent


Let $A$ and $B$ be $d\times d$ complex matrices such that $Ax\cdot x>0$ and $Bx\cdot x>0$ for every nonzero $x\in\mathbb C^d$, let $\left\langle x,y\right\rangle_A=Ax\cdot y$, and let $\left\langle x,y\right\rangle_B=Bx\cdot y$.

I showed that $\left\langle\cdot,\cdot\right\rangle_A$ and $\left\langle\cdot,\cdot\right\rangle_B$ are inner products on $\mathbb C^d$.

Let $\left\|x\right\|_A=\sqrt{\left\langle x,x\right\rangle_A}$ and let $\left\|x\right\|_B=\sqrt{\left\langle x,x\right\rangle_B}$.

I want to show that $\left\|\cdot\right\|_A$ and $\left\|\cdot\right\|_B$ are equivalent.

I believe that the right approach is to show $\left\|\cdot\right\|_A$ is equivalent to $\left\|\cdot\right\|_I$, where $I$ is the identity matrix, but I have not had much success.


Best answer

The approach you propose is promising.

Try the special case where $A$ is diagonal. (What can you say about the diagonal entries?) Can you show the claim ($\|\cdot\|_A$ equivalent to $\|\cdot\|_I$) in this case?
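For reference, the diagonal case works out as follows (assuming diagonal entries $a_1,\dots,a_d$, which must satisfy $a_i=Ae_i\cdot e_i>0$):

$$\|x\|_A^2=\sum_{i=1}^d a_i|x_i|^2,\qquad\text{so}\qquad \Big(\min_i a_i\Big)\|x\|_I^2 \;\le\; \|x\|_A^2 \;\le\; \Big(\max_i a_i\Big)\|x\|_I^2,$$

and taking square roots yields the equivalence constants $\sqrt{\min_i a_i}$ and $\sqrt{\max_i a_i}$.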

For the general situation, note that $A$ is Hermitian and positive definite (over $\mathbb C^d$, the condition $Ax\cdot x>0$ for all nonzero $x$ already forces $A=A^*$), so $A$ is diagonal with respect to some orthonormal basis, which essentially reduces the problem to the special case above.
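This reduction can be sanity-checked numerically. The sketch below (using NumPy; the random matrix $A$ and the tolerance are arbitrary choices, not part of the argument) verifies that the extreme eigenvalues of a Hermitian positive-definite $A$ bound $\langle x,x\rangle_A$ between $\lambda_{\min}\|x\|_I^2$ and $\lambda_{\max}\|x\|_I^2$:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 5

# A random Hermitian positive-definite matrix: M*M is positive
# semidefinite, and adding I pushes the spectrum above 1.
M = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
A = M.conj().T @ M + np.eye(d)

# eigvalsh returns the (real) eigenvalues of a Hermitian matrix in ascending order.
eigs = np.linalg.eigvalsh(A)
lo, hi = eigs[0], eigs[-1]

for _ in range(1000):
    x = rng.standard_normal(d) + 1j * rng.standard_normal(d)
    q = np.vdot(x, A @ x).real   # <x, x>_A = x* A x (real because A is Hermitian)
    n2 = np.vdot(x, x).real      # <x, x>_I = ||x||_I^2
    assert lo * n2 - 1e-9 <= q <= hi * n2 + 1e-9
```

Note that `np.vdot` conjugates its first argument, so `np.vdot(x, A @ x)` computes $x^*Ax$.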

Second answer

In this case the result is trivial because all norms on a finite-dimensional space are equivalent.

What you are probably interested in is the version for Hilbert spaces:

Let $\big(H, \langle\cdot, \cdot\rangle\big)$ be a Hilbert space and $A : H \to H$ an invertible, self-adjoint, positive linear map, i.e. $\langle Ax, x\rangle \ge 0$ for all $x \in H$. Define the positive-semidefinite sesquilinear form $\langle x,y\rangle_A = \langle Ax, y\rangle$ for all $x,y \in H$. Then $\|\cdot\|_A$ and $\|\cdot\|$ are equivalent seminorms. In particular, $\|\cdot\|_A$ is a norm and $\langle\cdot, \cdot\rangle_A$ is an inner product on $H$.

Proof. (Below, CSB stands for the Cauchy–Schwarz–Bunyakovsky inequality.)

$$\|x\|_A^2 = \langle x, x\rangle_A = \left|\langle Ax, x\rangle\right| \stackrel{CSB}\le \|Ax\|\|x\| \le \|A\|\|x\|^2$$

so $\|\cdot\|_A \le \sqrt{\|A\|}\|\cdot\|$.

For the converse inequality, notice that $A^{-1}$ is also bounded, by the Bounded Inverse Theorem. First, $$\|A^{-1}x\|_A^2 = \langle A^{-1}x, A^{-1}x\rangle_A= \langle AA^{-1}x, A^{-1}x\rangle = \langle x, A^{-1}x\rangle \stackrel{CSB}{\le} \|x\|\|A^{-1}x\| \le \|A^{-1}\|\|x\|^2,$$ and therefore $$\|x\|^2 = \langle x, x\rangle = \left|\langle AA^{-1}x,x\rangle\right| = \left|\langle A^{-1}x,x\rangle_A\right| \stackrel{CSB}\le \|A^{-1}x\|_A\|x\|_A \le \sqrt{\|A^{-1}\|}\|x\|\|x\|_A$$

so $\|\cdot\| \le \sqrt{\|A^{-1}\|}\|\cdot\|_A$.

Therefore, $$\frac{1}{\sqrt{\|A^{-1}\|}}\|\cdot\| \le \|\cdot\|_A \le \sqrt{\|A\|}\|\cdot\|$$ so $\|\cdot\|_A \sim \|\cdot\|$.
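On $\mathbb C^d$ these two constants can be checked numerically. The sketch below (assuming NumPy; the random positive-definite $A$ is an arbitrary choice) verifies $\frac{1}{\sqrt{\|A^{-1}\|}}\|x\| \le \|x\|_A \le \sqrt{\|A\|}\|x\|$ on random vectors:

```python
import numpy as np

rng = np.random.default_rng(1)
d = 4

# Random Hermitian positive-definite A, so the theorem's hypotheses hold.
M = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
A = M.conj().T @ M + np.eye(d)

op_A = np.linalg.norm(A, 2)                    # operator norm ||A|| (largest singular value)
op_Ainv = np.linalg.norm(np.linalg.inv(A), 2)  # ||A^{-1}||

for _ in range(1000):
    x = rng.standard_normal(d) + 1j * rng.standard_normal(d)
    norm_x = np.sqrt(np.vdot(x, x).real)       # ||x||
    norm_xA = np.sqrt(np.vdot(x, A @ x).real)  # ||x||_A = sqrt(<Ax, x>)
    assert norm_x / np.sqrt(op_Ainv) <= norm_xA + 1e-9
    assert norm_xA <= np.sqrt(op_A) * norm_x + 1e-9
```

For Hermitian positive-definite $A$ the constants are sharp: $\|A\| = \lambda_{\max}$ and $\|A^{-1}\| = 1/\lambda_{\min}$, recovering the eigenvalue bounds from the finite-dimensional argument.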

Note that we did not have to assume strict positivity, $\langle Ax,x\rangle > 0$ for all $x\ne 0$; we assumed only that $A$ is self-adjoint, positive, and invertible. Self-adjointness makes $\langle \cdot, \cdot\rangle_A$ conjugate-symmetric, and positivity is what licenses the application of CSB to the form $\langle \cdot, \cdot\rangle_A$ in the second chain of inequalities, since Cauchy–Schwarz holds for every positive-semidefinite sesquilinear form.

In the case of $\mathbb{C}^d$, the assumption $\langle Ax,x\rangle > 0$ for all $x\ne 0$ implies that $A$ is injective. Since the space is finite-dimensional, $A$ is also surjective and hence invertible.