Geometric intuition on $\langle x, A^\top y\rangle = \langle y, Ax\rangle$


I am looking for a geometric intuition for $\langle x, A^\top y\rangle = \langle y, Ax\rangle$. This can be proven algebraically by expanding the expression into basic sums and products and reordering a few terms, but that does not offer any insight.

Semantically, this equality states that the scaled projection (dot product) of $x$ onto a linear combination of the rows of $A$ weighted by $y$ equals the scaled projection of $y$ onto a linear combination of the columns of $A$ weighted by $x$. But I fail to see this in an intuitive geometric way. Do you know of any insightful interpretation?
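The row/column reading above is easy to sanity-check numerically. A minimal sketch with NumPy (the matrix and vectors are arbitrary random choices of mine, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))   # arbitrary 3x4 real matrix
x = rng.standard_normal(4)
y = rng.standard_normal(3)

# <x, A^T y>: dot x with the y-weighted combination of A's rows
lhs = x @ (A.T @ y)
# <y, A x>: dot y with the x-weighted combination of A's columns
rhs = y @ (A @ x)

assert np.isclose(lhs, rhs)
```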

If this equality has a name, that would also be useful. Right now I cannot really research it without a name.

I came across the equation here: Eigenvectors of real symmetric matrices are orthogonal



Use the singular value decomposition $A=P\Sigma Q$ where $P$ and $Q$ are orthogonal and $\Sigma$ is diagonal; then $A^T = Q^T \Sigma^T P^T$ is an SVD of the transpose with the same singular values. Regard $A$ as acting on $V$ and $A^T$ as acting on the isometric dual space $V^*$. The SVDs of $A$ and $A^T$ make it clear that the "geometry" of the action of one is the same as the action of its transpose partner on the dual. The equality $\langle x,A^Ty\rangle=\langle y,Ax\rangle$ simply unravels this via the inner product, which of course provides the isometry between the two spaces.
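The "same geometry" claim can be made concrete with NumPy: the SVD factors of $A^T$ are the mirror image of those of $A$, with identical singular values. (A square matrix is used here purely to keep the diagonal factor simple.)

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))

# SVD of A: A = U @ diag(s) @ Vt, with U and Vt orthogonal
U, s, Vt = np.linalg.svd(A)

# The transpose has the "mirror" decomposition A^T = V @ diag(s) @ U^T:
# the singular values (the geometry of the stretching) are unchanged,
# only the roles of the two orthogonal factors are exchanged.
assert np.allclose(A, U @ np.diag(s) @ Vt)
assert np.allclose(A.T, Vt.T @ np.diag(s) @ U.T)
```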


The OP is interested in the geometric intuition behind

$$\tag 0 \langle x, A^\top y\rangle = \langle y, Ax\rangle$$

and asked for the name of this property.

To truly appreciate the beauty of $\text{(0)}$ you have to pass into the realm of abstract linear algebra. To do that, one can begin by ruminating on the following snippet taken from this historical summary,

The first modern and more precise definition of a vector space was introduced by Peano in 1888; by 1900, a theory of linear transformations of finite-dimensional vector spaces had emerged. Linear algebra took its modern form in the first half of the twentieth century, when many ideas and methods of previous centuries were generalized as abstract algebra.

(emphasis mine)

I tried to find the name of the property described in $\text{(0)}$ and have come to the conclusion that whenever you see it, you must simply 'drop your jaw' in amazement. If you want to use it, simply precede the use by writing

$$\quad \text{Since ...}$$

OK, I have to leave the OP with something better than the above. If you want to give $\text{(0)}$ a name, call it Theorem AIP and refer to this link.

Below I copy the theorem and the proof. As an exercise, the OP can restate all the definitions/theorems/proofs by replacing the complex numbers with the real numbers, where for $x \in \Bbb R$, $\,\bar x = x$. While working on this, the OP can attempt to attach geometric intuitions to the ideas and concepts as they are analyzed.

Theorem AIP: Adjoint and Inner Product. Suppose that $A$ is an $m \times n$ matrix and $\vec x \in {\Bbb C}^n$, $\vec y \in {\Bbb C}^m$. Then

$$\tag 1 \langle A \vec x, \vec y\rangle = \langle \vec x, A^{*} \vec y\rangle$$

Proof. (The original answer reproduced the proof as an image; the standard computation, using the convention that the inner product $\langle u, v\rangle = \sum_i \overline{u_i}\, v_i$ is conjugate-linear in its first argument, runs as follows.)

$$\langle A \vec x, \vec y\rangle = \sum_{k=1}^{m} \overline{(A\vec x)_k}\, y_k = \sum_{k=1}^{m} \sum_{j=1}^{n} \overline{A_{kj}}\, \overline{x_j}\, y_k = \sum_{j=1}^{n} \overline{x_j} \sum_{k=1}^{m} \overline{A_{kj}}\, y_k = \sum_{j=1}^{n} \overline{x_j}\, (A^{*} \vec y)_j = \langle \vec x, A^{*} \vec y\rangle$$
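Theorem AIP itself can be verified numerically in the complex setting. A minimal NumPy sketch, assuming the same convention that the inner product conjugates its first argument (which is exactly what `np.vdot` does; the random data is my own choice):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 4)) + 1j * rng.standard_normal((3, 4))
x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
y = rng.standard_normal(3) + 1j * rng.standard_normal(3)

def inner(u, v):
    # <u, v> = sum_i conj(u_i) * v_i  (conjugate-linear in the first slot)
    return np.vdot(u, v)

A_star = A.conj().T  # the adjoint (conjugate transpose) of A

lhs = inner(A @ x, y)       # <A x, y>
rhs = inner(x, A_star @ y)  # <x, A* y>
assert np.isclose(lhs, rhs)
```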


I examined the proof of Theorem AIP and found myself wondering why, intuitively,

$$\tag 2 (AB)^t = B^t A^t$$

and found this stack link

Why, intuitively, is the order reversed when taking the transpose of the product?

That led me to write up the following answer, where the result is proved and, with that intuition in hand, comes as no surprise.
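The reversal in $\text{(2)}$ is itself a one-line numerical check; the shapes alone force the order to flip (an $m \times n$ times $n \times p$ product transposes to $p \times m$):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((2, 3))
B = rng.standard_normal((3, 4))

# Transposing a product reverses the order of the factors:
# (AB)^T is 4x2, and only B^T @ A^T (4x3 times 3x2) has that shape.
assert (A @ B).T.shape == (4, 2)
assert np.allclose((A @ B).T, B.T @ A.T)
```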