Matrix-vector multiplication expressed as inner products with elements of an orthonormal basis


Let $A$ be a symmetric $n\times n$ matrix and $x$ a row vector of size $n$. I've encountered the following identity in some lecture notes and am having a hard time understanding it:

$$xA = \sum_{i=1}^{n}\left\langle x,u_{i}\right\rangle Au_{i}$$

Here $u_1, \ldots , u_n$ are elements of an orthonormal basis of the space. Why is this true?

(This is one step of a proof; the motivation is that $A$ is diagonalizable with respect to the $u_i$, which factors into the subsequent analysis.)

Best Answer

As stated, this is not true; the correct identity is the following: $$Ax = \sum_{i=1}^n\langle x,u_i\rangle Au_i$$ where $x$ is an $n\times 1$ column vector.
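Before the proofs, here is a quick numerical sanity check of the corrected identity, sketched in NumPy. The random seed and the QR-based construction of the orthonormal basis are my own choices for illustration; any orthonormal basis works, and the identity in fact holds for any square $A$, not only symmetric ones.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Random symmetric matrix A and column vector x
M = rng.standard_normal((n, n))
A = (M + M.T) / 2
x = rng.standard_normal(n)

# Orthonormal basis u_1, ..., u_n: the columns of Q from a QR factorization
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))

# Right-hand side: sum_i <x, u_i> * (A u_i)
rhs = sum((x @ Q[:, i]) * (A @ Q[:, i]) for i in range(n))

# Left-hand side Ax agrees with the sum
assert np.allclose(A @ x, rhs)
```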

Proof 1

Write $A = (\vec a_1\ \ \vec a_2\ \ \cdots\ \ \vec a_n)$ in terms of its columns and $x = \pmatrix{x_1\\x_2\\\vdots\\x_n}$. Then $$Ax = \sum_{i=1}^n x_i\vec a_i.$$ This proves the identity for the standard basis $u_i = e_i$: there $\langle x, e_i\rangle = x_i$ and $Ae_i = \vec a_i$, so the right-hand side is exactly $\sum_{i=1}^n x_i\vec a_i$ as well.
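The column view used in Proof 1 can be checked numerically. This is a sketch of my own (random test data, NumPy conventions): $Ax$ equals the linear combination of the columns of $A$ with coefficients $x_i$, which coincides with the identity when $u_i = e_i$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
A = rng.standard_normal((n, n))
x = rng.standard_normal(n)

# Ax as a linear combination of the columns of A
combo = sum(x[i] * A[:, i] for i in range(n))
assert np.allclose(A @ x, combo)

# With the standard basis e_i: <x, e_i> = x_i and A e_i = column i of A
e = np.eye(n)
rhs = sum((x @ e[:, i]) * (A @ e[:, i]) for i in range(n))
assert np.allclose(A @ x, rhs)
```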

Proof 2

We know that the identity matrix can be written as $I = \sum_{i=1}^nu_iu_i^T$ (the completeness relation for an orthonormal basis). Hence, using that each $u_i^Tx = \langle x,u_i\rangle$ is a scalar, $$Ax = A\sum_{i=1}^nu_iu_i^T x = \sum_{i=1}^n(Au_i)(u_i^Tx) = \sum_{i=1}^n\langle x,u_i\rangle Au_i.$$

This completes the proof.
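Proof 2 hinges on the completeness relation $I = \sum_i u_iu_i^T$, which can also be verified numerically. Again a sketch with my own random data; the orthonormal columns come from a QR factorization.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4

# Columns of Q form an orthonormal basis of R^n
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))

# Completeness relation: sum_i u_i u_i^T = I
outer_sum = sum(np.outer(Q[:, i], Q[:, i]) for i in range(n))
assert np.allclose(outer_sum, np.eye(n))
```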