Proof that a projection minimizes the norm of error


I'm trying to prove that, given a vector $\mathbf{x}\in\mathbb{C}^N$, the vector $\hat{\mathbf{x}}$ defined as $$\hat{\mathbf{x}}=\sum_{k=0}^{K-1}\langle \mathbf{s}^{(k)},\mathbf{x}\rangle\mathbf{s}^{(k)},$$ where $\{\mathbf{s}^{(k)}\}_{k=0}^{K-1}$ is an orthonormal basis for the subspace $S$, satisfies $\operatorname{arg\,min}_{\mathbf{y}\in S}\|\mathbf{x}-\mathbf{y}\|=\hat{\mathbf{x}}$.

The inner product definition I'm using is $$\langle \mathbf{x},\mathbf{y}\rangle=\sum_{n=0}^{N-1}x^{*}_ny_n$$
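As a numerical sanity check of the claim (a sketch in NumPy; the orthonormal basis below is a random stand-in obtained by QR factorization, not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(0)
N, K = 6, 3

# Hypothetical data: an orthonormal basis s[0..K-1] for a K-dimensional
# subspace S of C^N, generated here via QR factorization.
A = rng.normal(size=(N, K)) + 1j * rng.normal(size=(N, K))
Q, _ = np.linalg.qr(A)          # columns of Q are orthonormal
s = [Q[:, k] for k in range(K)]

x = rng.normal(size=N) + 1j * rng.normal(size=N)

def inner(a, b):
    """<a, b> = sum_n a_n^* b_n (conjugate in the first slot, as above)."""
    return np.vdot(a, b)        # np.vdot conjugates its first argument

# x_hat = sum_k <s^(k), x> s^(k)
x_hat = sum(inner(sk, x) * sk for sk in s)

# The error x - x_hat is orthogonal to every basis vector of S ...
assert all(abs(inner(sk, x - x_hat)) < 1e-12 for sk in s)

# ... so x_hat beats any other y in S (spot check with random y).
for _ in range(100):
    c = rng.normal(size=K) + 1j * rng.normal(size=K)
    y = sum(ck * sk for ck, sk in zip(c, s))
    assert np.linalg.norm(x - x_hat) <= np.linalg.norm(x - y) + 1e-12
```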

The first idea I had was simply expanding $\|\mathbf{x}-\mathbf{p}\|^2$ for an arbitrary $\mathbf{p}$, since minimizing $\|\mathbf{x}-\mathbf{p}\|^2$ is equivalent to minimizing $\|\mathbf{x}-\mathbf{p}\|$: $$\|\mathbf{x}-\mathbf{p}\|^2=\langle \mathbf{x}-\mathbf{p}, \mathbf{x}-\mathbf{p} \rangle=\langle \mathbf{x},\mathbf{x}\rangle-2\operatorname{Re}(\langle \mathbf{x},\mathbf{p}\rangle)+\langle \mathbf{p},\mathbf{p} \rangle$$
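The expansion $\|\mathbf{x}-\mathbf{p}\|^2=\|\mathbf{x}\|^2-2\operatorname{Re}\langle\mathbf{x},\mathbf{p}\rangle+\|\mathbf{p}\|^2$ (note the factor of 2, since $\langle\mathbf{x},\mathbf{p}\rangle+\langle\mathbf{p},\mathbf{x}\rangle=2\operatorname{Re}\langle\mathbf{x},\mathbf{p}\rangle$) can be checked numerically on random vectors:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=4) + 1j * rng.normal(size=4)
p = rng.normal(size=4) + 1j * rng.normal(size=4)

# ||x - p||^2 = <x,x> - 2 Re<x,p> + <p,p>  (np.vdot conjugates its first arg)
lhs = np.linalg.norm(x - p) ** 2
rhs = np.vdot(x, x).real - 2 * np.vdot(x, p).real + np.vdot(p, p).real
assert abs(lhs - rhs) < 1e-12
```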

But I'm not sure how to go from here. I also tried writing $\mathbf{p}=\hat{\mathbf{x}}+\mathbf{d}$ to try to use $\langle \mathbf{x}-\hat{\mathbf{x}},\hat{\mathbf{x}}\rangle=0$, which gave $$\begin{aligned}\|\mathbf{x}-\hat{\mathbf{x}}-\mathbf{d}\|^2&=\langle\mathbf{x}-\hat{\mathbf{x}}-\mathbf{d},\mathbf{x}-\hat{\mathbf{x}}-\mathbf{d}\rangle\\&=\langle\mathbf{x}-\hat{\mathbf{x}},\mathbf{x}-\hat{\mathbf{x}}-\mathbf{d}\rangle-\langle \mathbf{d},\mathbf{x}-\hat{\mathbf{x}}-\mathbf{d}\rangle\\&=\langle\mathbf{x}-\hat{\mathbf{x}},\mathbf{x}\rangle-\langle \mathbf{x}-\hat{\mathbf{x}},\hat{\mathbf{x}}\rangle-\langle\mathbf{x}-\hat{\mathbf{x}},\mathbf{d}\rangle-\langle \mathbf{d},\mathbf{x}-\hat{\mathbf{x}}\rangle+\langle \mathbf{d},\mathbf{d}\rangle\\&=\langle \mathbf{d},\mathbf{d}\rangle+\langle \mathbf{x}-\hat{\mathbf{x}},\mathbf{x}\rangle-2\operatorname{Re}(\langle \mathbf{d}, \mathbf{x}-\hat{\mathbf{x}}\rangle),\end{aligned}$$ which doesn't seem to help either. The last thing I can think of is expressing the norm in terms of some parameter, differentiating, and using that to find the minimum, but I'm not sure how I'd do that.

Any hints/answers would be greatly appreciated, thanks!

On BEST ANSWER

We have that for arbitrary $y \in S$ $$\langle x - \hat{x},y - \hat{x} \rangle= \langle x,y \rangle - \langle x,\hat{x} \rangle-\langle \hat{x},y \rangle + \|\hat{x}\|^2$$

Using the inner product as defined in the question (conjugate-linear in the first argument), any $y \in S$ can be written $y=\sum_{k=0}^{K-1}\langle s^{(k)},y\rangle s^{(k)}$, and $\hat{x}=\sum_{k=0}^{K-1}\langle s^{(k)},x\rangle s^{(k)}$. Then, by orthonormality of the $s^{(k)}$,

$$\langle x,y \rangle=\bigg\langle x ,\sum_{k=0}^{K-1}\langle s^{(k)},y\rangle s^{(k)}\bigg\rangle = \sum_{k=0}^{K-1}\langle s^{(k)},y\rangle\langle x,s^{(k)}\rangle $$

$$\langle x,\hat{x} \rangle= \bigg\langle x ,\sum_{k=0}^{K-1}\langle s^{(k)},x\rangle s^{(k)}\bigg\rangle = \sum_{k=0}^{K-1}\langle s^{(k)},x\rangle\langle x,s^{(k)}\rangle=\sum_{k=0}^{K-1}|\langle s^{(k)},x\rangle|^2$$

$$\langle \hat{x},y \rangle= \bigg\langle \sum_{k=0}^{K-1}\langle s^{(k)},x\rangle s^{(k)}, \sum_{j=0}^{K-1}\langle s^{(j)},y\rangle s^{(j)}\bigg\rangle = \sum_{k=0}^{K-1}\langle s^{(k)},x\rangle^*\langle s^{(k)},y\rangle=\sum_{k=0}^{K-1}\langle s^{(k)},y\rangle\langle x,s^{(k)}\rangle$$

$$\|\hat{x}\|^2=\sum_{k=0}^{K-1}|\langle s^{(k)},x\rangle|^2$$

So $\langle x - \hat{x},y - \hat{x} \rangle = 0$ for every $y \in S$; that is, the error $x-\hat{x}$ is orthogonal to $S$, so $\hat{x}$ is the orthogonal projection of $x$ onto $S$. By the Pythagorean theorem, for any $y \in S$ $$\|x-y\|^2=\|(x-\hat{x})+(\hat{x}-y)\|^2=\|x-\hat{x}\|^2+\|\hat{x}-y\|^2\geq\|x-\hat{x}\|^2,$$ with equality iff $y=\hat{x}$. Hence $$\|x-\hat{x}\|=\min_{w \in S}\|x-w\|$$
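The orthogonality identity and the resulting Pythagorean split can be verified numerically (a NumPy sketch; the basis and vectors are random stand-ins, and the projection is computed in matrix form as $\hat{x}=QQ^{H}x$):

```python
import numpy as np

rng = np.random.default_rng(1)
N, K = 5, 2

# Hypothetical data: random orthonormal basis for S via QR factorization.
Q, _ = np.linalg.qr(rng.normal(size=(N, K)) + 1j * rng.normal(size=(N, K)))
x = rng.normal(size=N) + 1j * rng.normal(size=N)

x_hat = Q @ (Q.conj().T @ x)    # orthogonal projection of x onto S

for _ in range(50):
    # arbitrary y in S, as a random combination of the basis vectors
    y = Q @ (rng.normal(size=K) + 1j * rng.normal(size=K))
    # <x - x_hat, y - x_hat> = 0 ...
    assert abs(np.vdot(x - x_hat, y - x_hat)) < 1e-12
    # ... hence ||x - y||^2 = ||x - x_hat||^2 + ||x_hat - y||^2
    lhs = np.linalg.norm(x - y) ** 2
    rhs = np.linalg.norm(x - x_hat) ** 2 + np.linalg.norm(x_hat - y) ** 2
    assert abs(lhs - rhs) < 1e-10
```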