Rank one orthogonal projector matrix.


My text is covering projector matrices while building up to Householder triangularization. The main topic of discussion is orthogonal projector matrices that satisfy

\begin{align} P &= P^2 \tag{1} \\ P &= P^* \tag{2} \end{align}

It turns out that we can form a rank one orthogonal projector from any unit vector $q \in \mathbb{C}^m$ (that is, $\|q\|_2 = 1$):

\begin{align} P_q = qq^* \end{align}

which is easily verified to satisfy (1) and (2). Its rank-$(m-1)$ complement is then $P_{\perp q} = I - P_q$. So far these facts and definitions all make sense to me, but I'm having a little trouble with the following: my book goes on to say that analogous projector matrices for arbitrary nonzero vectors $a$ can be written

\begin{align} P_a &= \frac{aa^*}{a^*a} \\ \\ P_{\perp a} &= I - P_a \end{align}

It's easy to verify that these formulas satisfy (1) and (2), but what is the motivation for the scaling factor $a^*a$? For example,

\begin{align} P_a v = \left(\frac{a^*v}{a^*a}\right)a \implies \| P_a v\|_2 = \frac{\left|a^*v\right|}{\|a\|_2^2} \|a\|_2 = \frac{\left|a^*v\right|}{\|a\|_2} \tag{*} \end{align}

Why the scaling factor? Is it true that $a^*v = \|a\|_2\|v\|_2\cos(a,v)$ for higher-dimensional complex vectors? In that case (*) becomes

\begin{align} \|v\|_2 \left|\cos(a,v)\right| \end{align}

In hindsight that seems to make sense: $P_a v$ lays $v$ along $a$, scaled so that the result is the orthogonal projection. I guess my true question is: is the 2-norm equivalent to the modulus in complex spaces? In particular, for $x,y \in \mathbb{C}^m$, prove

$$x^*y = \|x\|_2\|y\|_2 \cos \alpha $$
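For what it's worth, a small NumPy sketch (my own sanity check, not from the text) confirms that $P_a$ satisfies (1) and (2), has rank one, and that $\|P_a v\|_2 = |a^*v|/\|a\|_2$ as in (*):

```python
# Sanity check (not from the text): P_a = aa*/(a*a) satisfies
# (1) P = P^2 and (2) P = P*, and || P_a v ||_2 = |a* v| / ||a||_2 as in (*).
import numpy as np

rng = np.random.default_rng(0)
m = 5
a = rng.standard_normal(m) + 1j * rng.standard_normal(m)
v = rng.standard_normal(m) + 1j * rng.standard_normal(m)

P = np.outer(a, a.conj()) / (a.conj() @ a)   # rank-one projector onto span{a}
P_perp = np.eye(m) - P                       # complementary projector

assert np.allclose(P, P @ P)                 # (1) idempotent
assert np.allclose(P, P.conj().T)            # (2) Hermitian
assert np.linalg.matrix_rank(P) == 1         # rank one
assert np.linalg.matrix_rank(P_perp) == m - 1
assert np.isclose(np.linalg.norm(P @ v),
                  np.abs(a.conj() @ v) / np.linalg.norm(a))  # (*)
```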


3 Answers

BEST ANSWER

If $a\ne0$, then $$ q=\|a\|^{-1}a $$ is a norm $1$ vector generating the same subspace as $a$. Then the orthogonal projector is $$ qq^*=\|a\|^{-2}aa^*=\frac{1}{a^*a}aa^* $$
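The answer's point can be checked numerically (a sketch with NumPy, assuming a random complex vector): normalizing $a$ to $q = a/\|a\|$ and forming $qq^*$ gives exactly the same matrix as $aa^*/(a^*a)$.

```python
# Sketch: q q* with q = a / ||a|| equals aa*/(a*a) entrywise.
import numpy as np

rng = np.random.default_rng(1)
a = rng.standard_normal(4) + 1j * rng.standard_normal(4)

q = a / np.linalg.norm(a)                      # unit vector spanning the same line
P_from_q = np.outer(q, q.conj())
P_from_a = np.outer(a, a.conj()) / (a.conj() @ a)
assert np.allclose(P_from_q, P_from_a)
```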

ANSWER

Note that if we set $P = aa^*$, then $P^2=(aa^*)(aa^*)=a(a^*a)a^*$, and since $a^*a$ is a scalar we have $P=P^2$ only when $a^*a=1$; normalizing the vector ensures this is the case. Without the normalization, $P$ still maps vectors into the subspace generated by $a$, but it also stretches them, so it is not a projection. Since orthogonal projections are easy to work with and add little computational cost, we tend to prefer them.
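A quick illustration of this answer (my own sketch, not from the text): without the $1/(a^*a)$ factor, $P = aa^*$ satisfies $P^2 = (a^*a)P$, so idempotency fails whenever $a^*a \ne 1$.

```python
# Sketch: P = aa* satisfies P^2 = (a*a) P, so P = P^2 only if a*a = 1.
import numpy as np

a = np.array([2.0, 0.0, 1.0])            # a*a = 5, not a unit vector
P = np.outer(a, a)
assert np.allclose(P @ P, (a @ a) * P)   # P^2 = (a*a) P
assert not np.allclose(P, P @ P)         # (1) fails, since a*a != 1

a_unit = a / np.linalg.norm(a)           # normalize to enforce a*a = 1
Q = np.outer(a_unit, a_unit)
assert np.allclose(Q, Q @ Q)             # now (1) holds
```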

ANSWER

The text appears to be from the book *Numerical Linear Algebra* by Trefethen and Bau. Just look into the section about inner products.

By definition, the Euclidean norm is the square root of the inner product of a vector with itself, $\|x\| = \sqrt{x^*x}$, and the cosine of the angle between two real vectors can be expressed as $$\cos{(\alpha)} = \frac{(x\cdot y)}{\|x\|\|y\|}$$ For complex vectors, $x^*y$ is in general complex, so the formula is used with $\left|x^*y\right|$ (or $\operatorname{Re}(x^*y)$, depending on the convention chosen for the angle).
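This identity can be verified numerically for a simple real example (a sketch; the $45°$ pair below is my own choice):

```python
# Sketch: for real vectors, x . y = ||x|| ||y|| cos(alpha).
import numpy as np

x = np.array([1.0, 0.0])
y = np.array([1.0, 1.0])                 # 45 degrees from x

cos_alpha = (x @ y) / (np.linalg.norm(x) * np.linalg.norm(y))
assert np.isclose(cos_alpha, np.cos(np.pi / 4))
```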