Other inner products for $\mathbb{R}^n$


For $\mathbb{R}^n$, the standard inner product is the dot product, defined as $ \langle v,\,w\rangle = \sum_i v_i \cdot w_i $. I am aware that any positively weighted version, namely $ \langle v,\,w\rangle = \sum_i\lambda_i\cdot v_i \cdot w_i $ with each $\lambda_i > 0$, will still satisfy the four inner-product axioms.

Is there any inner product for $\mathbb{R}^n$ that is not just a scaled version of the standard dot product?

I tried for $\mathbb{R}^2$ with $ \langle v,\,w\rangle = v_1 \cdot w_2 + v_2 \cdot w_1 $ but that is not positive definite.
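The failure can be checked directly (illustrative code, not part of the original question): evaluating the attempted form at $v = (1, -1)$ gives a negative value.

```python
# The attempted form <v, w> = v1*w2 + v2*w1 on R^2.
def form(v, w):
    return v[0] * w[1] + v[1] * w[0]

v = (1.0, -1.0)
print(form(v, v))  # -2.0, so <v, v> < 0 and the form is not positive definite
```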

There are 7 answers below.


For any invertible linear transformation $A$ you can define the inner product $\langle v,w\rangle_A=\langle Av,Aw\rangle$, where $\langle\cdot,\cdot\rangle$ denotes the standard inner product. I suspect there are no other inner products; this is motivated by the fact that all inner products on a finite-dimensional space are known to induce equivalent norms.
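A quick numerical sketch of this construction (the random matrix is an arbitrary choice, not from the answer): $\langle v,w\rangle_A$ has Gram matrix $A^\top A$, which is symmetric with positive eigenvalues whenever $A$ is invertible.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(3, 3))               # generically invertible
assert abs(np.linalg.det(A)) > 1e-8

def ip(v, w):
    return (A @ v) @ (A @ w)              # <Av, Aw> with the standard dot product

G = A.T @ A                               # Gram matrix of the new inner product
assert np.allclose(G, G.T)                # symmetric
assert np.all(np.linalg.eigvalsh(G) > 0)  # positive definite

v, w = rng.normal(size=3), rng.normal(size=3)
assert np.isclose(ip(v, w), v @ G @ w)    # <v, w>_A = v^T (A^T A) w
```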


Inner products $p(x,y)$ on $\mathbb R^n$ have the form $$ p(x,y) = \sum_{j=1}^n \sum_{k=1}^n a_{jk} x_j y_k $$ where the matrix $A = [a_{jk}]$ is positive definite. Choosing an orthonormal basis of eigenvectors for the matrix $A$ (possible since $A$ is symmetric), and expanding according to this new basis instead of the original one, the inner product becomes diagonal: $$ p(x,y) = \sum_{j=1}^n b_j x_j y_j $$ where the $b_j>0$ are the eigenvalues of $A$.
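This diagonalization can be sketched numerically (the matrix below is an arbitrary positive definite choice, not from the answer): passing to the orthonormal eigenbasis of $A$ turns $p(x,y)$ into a weighted dot product with the eigenvalues as weights.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])          # positive definite: eigenvalues 1 and 3
b, Q = np.linalg.eigh(A)            # A = Q diag(b) Q^T, columns of Q orthonormal
assert np.all(b > 0)

x = np.array([1.0, -2.0])
y = np.array([0.5, 3.0])
p = x @ A @ y                       # p(x, y) in the original coordinates

xp, yp = Q.T @ x, Q.T @ y           # coordinates in the eigenbasis
assert np.isclose(p, np.sum(b * xp * yp))   # p(x, y) = sum_j b_j x'_j y'_j
```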


Yes, for $n > 1$. For any $n \times n$ matrix $A$, $$\phantom{(\ast)} \qquad \langle {\bf x}, {\bf y}\rangle := {\bf y}^{\top} A {\bf x} \qquad (\ast)$$ defines a bilinear form on $\Bbb R^n$. If $A$ is symmetric, then so is the bilinear form, i.e., $$\langle {\bf y}, {\bf x}\rangle = \langle {\bf x}, {\bf y}\rangle ,$$ and in that case all of the eigenvalues of $A$ are real.

We can show that for symmetric $A$ the bilinear form $(\ast)$ is in fact an inner product iff the eigenvalues of $A$ are all positive, so to establish the existence of an inner product not of the form $\langle {\bf x}, {\bf y}\rangle = \sum_{i = 1}^n \lambda_i x_i y_i$ it's enough to find a symmetric matrix $A$ that is not diagonal but whose eigenvalues are all positive.

A simple example is $$\pmatrix{1&\epsilon\\\epsilon&1\\&&1\\&&&\ddots\\&&&&1} ,$$ which corresponds to the bilinear form $$\langle {\bf x}, {\bf y}\rangle = x_1 y_1 + \cdots + x_n y_n + \epsilon (x_1 y_2 + x_2 y_1) . $$ The eigenvalues of this matrix are $1 - \epsilon$, $1 + \epsilon$, and $1$ (with multiplicity $n - 2$), so via $(\ast)$ this bilinear form is an inner product iff $|\epsilon| < 1$.
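The eigenvalue claim is easy to verify numerically (the choices $n = 4$, $\epsilon = 1/2$ below are illustrative):

```python
import numpy as np

n, eps = 4, 0.5
A = np.eye(n)                       # identity plus an off-diagonal perturbation
A[0, 1] = A[1, 0] = eps
vals = np.sort(np.linalg.eigvalsh(A))
assert np.allclose(vals, [1 - eps, 1.0, 1.0, 1 + eps])
assert np.all(vals > 0)             # an inner product, since |eps| < 1
```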

Remark Conversely, all inner products can be written as $(\ast)$ for some symmetric matrix $A$, and we can recover $A$ from the inner product by setting $$A_{ij} = \langle {\bf e}_i, {\bf e}_j \rangle$$ for the standard basis $({\bf e}_i)$.

On the other hand, given any inner product on $\Bbb R^n$, applying the Gram-Schmidt Process produces an orthonormal basis $({\bf f}_i)$, so the matrix representation of the inner product with respect to that basis is the identity matrix, $I_n$. In this sense, all inner products on $\Bbb R^n$ are equivalent.
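A sketch of this last point (the matrix $A$ and helper names below are illustrative): running Gram-Schmidt with respect to $\langle x, y\rangle_A = x^\top A y$ produces a basis in which the Gram matrix of the inner product is the identity.

```python
import numpy as np

A = np.array([[1.0, 0.5],
              [0.5, 1.0]])          # an arbitrary positive definite choice

def ip(x, y):
    return x @ A @ y                # the custom inner product <x, y>_A

basis = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
ortho = []
for v in basis:
    for f in ortho:
        v = v - ip(v, f) * f        # subtract the <.,.>_A-projection onto f
    ortho.append(v / np.sqrt(ip(v, v)))

G = np.array([[ip(f, g) for g in ortho] for f in ortho])
assert np.allclose(G, np.eye(2))    # the form is I_n in the new basis
```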


I agree with SmileyCraft. In finite-dimensional vector spaces, bilinear forms, like linear transformations, are determined by the values they take on a given basis: $$\left \langle x,y \right \rangle=\sum_{i,j=1}^{n}x_iy_j\left \langle e_i,e_j \right \rangle.$$ You can arrive at this representation without difficulty, which then proves your suspicion.


Technically, you need positive $\lambda_i$. More generally, if we use $\sum_{ij}\lambda_{ij}v_iw_j$, the matrix $\lambda$ can without loss of generality be replaced by its symmetric part $(\lambda+\lambda^T)/2$, and it has to be positive definite. (Yes, this matrix property has the same name; it essentially means the matrix has only positive eigenvalues.) With an appropriate change of basis we can then diagonalize this matrix, which recovers the case you knew about. As for the example you tried, it failed because its matrix is $\lambda=\left(\begin{array}{cc} 0 & 1\\ 1 & 0 \end{array}\right)$ (already symmetric, so no averaging is needed), which has $-1$ as an eigenvalue.
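The eigenvalue computation for the failed example can be confirmed directly (illustrative check):

```python
import numpy as np

# Matrix of the attempted form v1*w2 + v2*w1 on R^2.
lam = np.array([[0.0, 1.0],
                [1.0, 0.0]])
vals = np.sort(np.linalg.eigvalsh(lam))
assert np.allclose(vals, [-1.0, 1.0])   # -1 is an eigenvalue, so not positive definite
```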


It is well known (and easy to prove) that any two inner product spaces of the same finite dimension are isometrically isomorphic. Hence $ \langle x, y \rangle'$ is an inner product on $\mathbb R^{n}$ iff there is a vector space isomorphism $T: \mathbb R^{n} \to \mathbb R^{n}$ such that $\langle x, y \rangle' =\langle Tx, Ty \rangle$ for all $x,y$, where $\langle \cdot, \cdot \rangle$ is the standard inner product.


In general, for any $n\times n$ matrix $A=(a_{i,j})_{i,j=1,\ldots,n}$ the expression $\sum_{i,j=1}^na_{i,j}x_iy_j$ defines a bilinear form, which will be symmetric if and only if $A$ is. Obtaining a positive definite symmetric bilinear form is a more subtle condition that leads to inequalities for the coefficients of $A$ (and the matrices that satisfy the condition are naturally called positive definite). For $2\times2$ symmetric matrices the positive definite condition is $a_{1,1}>0$, $a_{2,2}>0$ together with $a_{1,1}a_{2,2}-a_{1,2}^2>0$ (so $\det(A)>0$). For a concrete example, the symmetric matrix $$ A=\pmatrix{1&\frac12\\\frac12&1} \quad\text{gives an inner product with } \langle v,\,w\rangle = v_1w_1 +\frac12(v_1w_2+v_2w_1) + v_2w_2\,. $$
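The $2\times2$ criterion and the concrete example can be checked in a few lines (illustrative sketch; the test vector is an arbitrary choice):

```python
import numpy as np

A = np.array([[1.0, 0.5],
              [0.5, 1.0]])
# 2x2 positive definite criterion: a11 > 0, a22 > 0, det(A) > 0.
assert A[0, 0] > 0 and A[1, 1] > 0
assert np.linalg.det(A) > 0

# Hence <v, w> = v^T A w is an inner product; for example:
v = np.array([1.0, -1.0])
assert (v @ A @ v) > 0              # equals 1, consistent with positivity
```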

In higher dimensions the condition is more complicated (by Sylvester's criterion, all leading principal minors of $A$ must be positive), but in any case one does get many different inner products on $\Bbb R^n$ in this way. They do turn out to be all equivalent in the sense that they give rise to the same structure theory, but they are not equal.