Matrix representing inner product?


Is it true that, given an inner product space $(V, \left\langle -,- \right\rangle)$, there's a matrix $A$ such that $ \left\langle u,v \right\rangle =\left\langle Au,Av \right\rangle _{\mathbb{R}^n}$? I thought of taking $A$ to be the Gramian, but I'm not sure this holds...

Edit. Okay, I understand I wrote nonsense; I tried to generalize an exercise before solving it. I'm supposed to prove that on $V=\mathbb{R}^2$, given some inner product, there's an invertible matrix $A$ such that $ \left\langle u,v \right\rangle =\left\langle Au,Av \right\rangle _{\mathbb{R}^2}$, where the inner product on the right is the standard one. How can I prove this?



Best answer

The expression $Au$ doesn't make sense as $u$ is an abstract vector and $A$ is a matrix. The Gramian matrix $G(\beta)$ is the matrix representing the inner product (as a quadratic form) with respect to the basis $\beta$. Then we have

$$ \left <v, w \right>_V = [v]_{\beta}^T G(\beta) [w]_{\beta} = \left< G(\beta) [v]_{\beta}, [w]_{\beta} \right>_{\mathbb{R}^n} = \left< [v]_{\beta}, G(\beta) [w]_{\beta} \right>_{\mathbb{R}^n} $$

where $\left< \cdot, \cdot \right>_{\mathbb{R}^n}$ is the standard inner product on $\mathbb{R}^n$. Since $G(\beta)$ is self-adjoint and positive-definite, it has a unique self-adjoint, positive-definite square root, so you can also write

$$ \left< v, w \right>_V = \left< \sqrt{G(\beta)}[v]_{\beta}, \sqrt{G(\beta)}[w]_{\beta} \right>_{\mathbb{R}^n} $$

which is closer in spirit to what you wrote (with $A = \sqrt{G(\beta)}$).
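As a numerical sanity check, here is a minimal sketch of this identity on $\mathbb{R}^2$. The Gramian $G$ and the vectors are arbitrary choices, not from the question; the square root is computed via the spectral theorem.

```python
import numpy as np

# A made-up Gramian for an inner product on R^2 (symmetric,
# positive definite); any such matrix would do.
G = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# Symmetric positive-definite square root via the spectral theorem:
# G = Q diag(lam) Q^T  =>  sqrt(G) = Q diag(sqrt(lam)) Q^T.
lam, Q = np.linalg.eigh(G)
sqrtG = Q @ np.diag(np.sqrt(lam)) @ Q.T

v = np.array([1.0, -2.0])
w = np.array([0.5, 4.0])

lhs = v @ G @ w                  # <v, w>_V  =  [v]^T G [w]
rhs = (sqrtG @ v) @ (sqrtG @ w)  # <sqrt(G) v, sqrt(G) w>  (standard)

print(np.isclose(lhs, rhs))  # True
```

The equality holds because $\sqrt{G}^T \sqrt{G} = G$, which is exactly the computation in the displayed formula.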

Another answer

Any two (symmetric, positive definite) inner products $\langle\cdot, \cdot\rangle$, $\langle\langle\cdot, \cdot\rangle\rangle$ over $\mathbb{R}^n$ are equivalent:

Let $B = \{b_1, \ldots, b_n\}$ be any basis of $\mathbb{R}^n$ that is orthonormal with respect to $\langle\cdot,\cdot\rangle$ and $C = \{c_1, \ldots, c_n\}$ be any basis of $\mathbb{R}^n$ orthonormal with respect to $\langle\langle\cdot,\cdot\rangle\rangle$.

Choose the linear map $\phi : \mathbb{R}^n \to \mathbb{R}^n$ such that $\phi(b_i) = c_i$ for $i = 1, \ldots, n$. Let $A$ be the transformation matrix of $\phi$. Then $$\langle b_i,b_j\rangle = \delta_{ij} = \langle\langle c_i, c_j\rangle\rangle = \langle\langle\phi(b_i), \phi(b_j)\rangle\rangle = \langle\langle Ab_i, Ab_j\rangle\rangle,$$

so by bilinearity we have for any vectors $x, y \in \mathbb{R}^n$ that

$$\langle x, y\rangle = \langle\langle Ax, Ay\rangle\rangle.$$

This shows your claim.
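The construction above can be sketched numerically for the original exercise ($\langle\langle\cdot,\cdot\rangle\rangle$ the standard product on $\mathbb{R}^2$). The Gramian $G$ below is a made-up example defining $\langle x, y\rangle = x^T G y$; Gram-Schmidt with respect to it produces the basis $\{b_1, b_2\}$, and $A$ is the map sending $b_i$ to the standard basis vector $e_i$.

```python
import numpy as np

# A made-up inner product on R^2 given by <x, y> = x^T G y
# (G symmetric positive definite).
G = np.array([[2.0, 1.0],
              [1.0, 3.0]])

def ip(x, y):
    """The given inner product."""
    return x @ G @ y

# Gram-Schmidt on the standard basis, using the given inner product,
# yields a basis {b_1, b_2} orthonormal with respect to <., .>.
basis = []
for e in np.eye(2):
    for b in basis:
        e = e - ip(e, b) * b
    basis.append(e / np.sqrt(ip(e, e)))
B = np.column_stack(basis)

# A sends b_i to the standard basis vector e_i, so A = B^{-1};
# it is invertible because B is a basis matrix.
A = np.linalg.inv(B)

x = np.array([1.0, -2.0])
y = np.array([0.5, 4.0])
print(np.isclose(ip(x, y), (A @ x) @ (A @ y)))  # True
```

By construction $B^T G B = I$, so $G = A^T A$ and $\langle x, y\rangle = x^T A^T A y = \langle Ax, Ay\rangle_{\mathbb{R}^2}$, matching the bilinearity argument above.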