Is it true that $\biggl\|I-\frac{vv^T}{v^Tv}\biggr\|=1$?


I am following a proof in the text Optimization Theory and Methods (Springer) by Wenyu Sun and Ya-Xiang Yuan. I came across the following claim, which seems to be taken as obvious: for a column vector $v$ of dimension $n\times 1$, $$\biggl\|I-\frac{vv^T}{v^Tv}\biggr\|=1,$$ where $I$ is the $n\times n$ identity matrix and $\|\cdot\|$ is a matrix norm.


I tried to verify it using the Frobenius norm, that is,

\begin{equation*} \begin{split} \biggl\|I-\frac{vv^T}{v^Tv}\biggr\|_F & = \biggl(\operatorname{tr}\biggl(\Bigl(I-\frac{vv^T}{v^Tv}\Bigr)^T\Bigl(I-\frac{vv^T}{v^Tv}\Bigr)\biggr)\biggr)^{\frac{1}{2}} \\ & = \biggl(\operatorname{tr}\Bigl(I-\frac{vv^T}{v^Tv}\Bigr)^2\biggr)^{\frac{1}{2}} \\ & = \biggl(\operatorname{tr}\Bigl(I-\frac{vv^T}{v^Tv}\Bigr)\biggr)^{\frac{1}{2}} \qquad \text{(the matrix is idempotent)} \\ & = \Bigl(\operatorname{tr}(I)-\operatorname{tr}\frac{vv^T}{v^Tv}\Bigr)^{\frac{1}{2}} \\ & = \Bigl(n-\frac{\|v\|^2}{\|v\|^2}\Bigr)^{\frac{1}{2}} \\ & = \sqrt{n-1} \end{split} \end{equation*}


So I do not see what the problem is. The text does not specify which norm is used; maybe I need to use a different matrix norm.


NOTE: The Frobenius norm of an $m\times n$ matrix $A$ is defined by \begin{equation*} \begin{split} \|A\|_F & = \biggl( \sum_{i=1}^{m}\sum_{j=1}^{n}|a_{ij}|^2\biggr)^\frac{1}{2}\\ & = \bigl(\operatorname{tr}(A^TA)\bigr)^\frac{1}{2} \end{split} \end{equation*}
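As a quick numerical sanity check (a sketch, not from the book; the dimension $n=5$ and the random seed are arbitrary choices), numpy gives $\sqrt{n-1}$ for the Frobenius norm but $1$ for the spectral norm:

```python
import numpy as np

# Hypothetical example: a random vector in R^n (n and the seed are arbitrary).
rng = np.random.default_rng(0)
n = 5
v = rng.standard_normal((n, 1))
P = np.eye(n) - (v @ v.T) / (v.T @ v)

fro = np.linalg.norm(P, "fro")  # Frobenius norm
spec = np.linalg.norm(P, 2)     # spectral norm (largest singular value)

print(fro)   # -> 2.0, i.e. sqrt(n - 1)
print(spec)  # -> 1.0
```

This suggests the book's claim refers to the spectral (operator 2-) norm, not the Frobenius norm.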

3 Answers


$P=I - \frac{v v^T}{v^T v}$ is the orthogonal projection onto $v^\perp$.

Proof: Clearly, $P$ is symmetric.

$$P^2 = \Bigl(I - \frac{v v^T}{v^T v}\Bigr)\Bigl(I - \frac{v v^T}{v^T v}\Bigr) = I - 2\,\frac{v v^T}{v^T v} + \frac{v (v^T v) v^T}{(v^T v)^2} = I - 2\,\frac{v v^T}{v^T v} + \frac{v v^T}{v^T v} = P.$$

Now show that (non-trivial) orthogonal projections have operator 2-norm equal to $1$: $$\|Px\|^2 = \langle Px, Px\rangle = \langle x, P^TPx\rangle = \langle x, PPx\rangle = \langle x, P^2x\rangle = \langle x, Px\rangle \leq \|x\|\,\|Px\|,$$ so $\|Px\| \leq \|x\|$. Equality is achieved by any vector in the range of the projection (in this case, any vector orthogonal to $v$), so the norm is exactly $1$.
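The symmetry, idempotence, and contraction properties above can be checked numerically (a hypothetical random example in numpy; the dimension and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
v = rng.standard_normal((n, 1))
P = np.eye(n) - (v @ v.T) / (v.T @ v)

# P is symmetric and idempotent, hence an orthogonal projection.
assert np.allclose(P, P.T)
assert np.allclose(P @ P, P)

# ||Px|| <= ||x|| for any x ...
x = rng.standard_normal((n, 1))
assert np.linalg.norm(P @ x) <= np.linalg.norm(x) + 1e-12

# ... with equality on the range of P: y = Px lies in the
# orthogonal complement of v, so Py = y.
y = P @ x
assert np.allclose(np.linalg.norm(P @ y), np.linalg.norm(y))
print("all projection properties verified")
```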


Let $A=I-\frac{vv^T}{v^Tv}$. Then $A$ is symmetric, so its eigenvalues are real. Moreover, $A x = x - v \frac{v^T x}{v^Tv}=\lambda x$ implies that either $x = \alpha v$ (which gives $\lambda=0$) or $v^T x=0$ (which gives $\lambda=1$). Hence its largest eigenvalue is $1$, and for a symmetric matrix that is exactly the spectral norm.
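A numerical sketch (random $v$; the dimension and seed are arbitrary assumptions) confirms the eigenvalue structure: one $0$ (for $x=\alpha v$) and $n-1$ ones (for $x\perp v$):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 6
v = rng.standard_normal((n, 1))
A = np.eye(n) - (v @ v.T) / (v.T @ v)

# A is symmetric, so eigvalsh applies; it returns eigenvalues in ascending order.
eigs = np.linalg.eigvalsh(A)
print(eigs)  # approximately [0, 1, 1, 1, 1, 1]
print(eigs.max())  # -> 1.0, the spectral norm of A
```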


I think you should check the notation of that book first. The authors are probably talking about the spectral norm, which is quite different from the Frobenius norm. For any square matrix $P$, \begin{align} \|P\|_{\text{spectral}}=\|P\|_2=\max_{\|x\|_2 = 1}\|Px\|_2=\sigma_{1} &&\{\mbox{spectral norm; $\sigma_1$ is the largest singular value}\} \\ \|P\|_{F}=\sqrt{\sum_{i,j}|P_{i,j}|^2}=\sqrt{\operatorname{trace}(P^TP)}=\sqrt{\sum_{i=1}^{r}\sigma_{i}^2} && \{\mbox{Frobenius norm; $r$ is the rank of $P$}\} \end{align}

Let $v_1=\frac{v}{\|v\|_2}$, and let $v_1,v_2,\dots,v_{n}$ form an orthonormal basis for the $n$-dimensional space, so that they are mutually orthogonal and of unit norm. Let $V=[v_1,\dots,v_n]$ be the resulting $n\times n$ orthogonal matrix. Convince yourself that $$I=VV^T=v_1v_1^T+\dots+v_nv_n^T.$$ Note that the matrix you are interested in is $$P=I-v_1v_1^T=v_2v_2^T+\dots+v_nv_n^T.$$ Now try to obtain the above results in terms of singular values.
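One way to build such a $V$ in practice (a sketch with arbitrary dimensions; not from the answer) is a QR factorization of a matrix whose first column is $v_1$. QR may flip the sign of that column, but $v_1v_1^T$ is unaffected:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4
v = rng.standard_normal((n, 1))
v1 = v / np.linalg.norm(v)

# Extend v1 to an orthonormal basis: QR of [v1 | random columns].
M = np.hstack([v1, rng.standard_normal((n, n - 1))])
V, _ = np.linalg.qr(M)  # columns of V are orthonormal; first column is +/- v1

# I = V V^T = v1 v1^T + ... + vn vn^T
assert np.allclose(V @ V.T, np.eye(n))

# P = I - v1 v1^T equals the sum of the remaining rank-one terms.
P = np.eye(n) - v1 @ v1.T
rest = sum(np.outer(V[:, i], V[:, i]) for i in range(1, n))
assert np.allclose(P, rest)

# Singular values of P: 1 with multiplicity n-1, and one 0.
print(np.linalg.svd(P, compute_uv=False))
```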