Metric projection onto nonnegative sequences


I need to find the metric projection onto the non-negative sequences in $\ell^2$. Intuitively, each element of the sequence should be sent to the maximum of itself and zero, since that should minimize the distance. How can I formally show that this is (if it is) the correct solution, using the definition (or characterizations) of the best approximation?

Best answer

Your claim is correct.

Suppose $a\in \ell^2$, $b \in \ell^2$ is the vector you conjectured was closest to $a$ ($b_i = \max(a_i, 0)$ for all $i$), and $c$ is the actual closest nonnegative vector to $a$ in $\ell^2$. I'll show that $c = b$.

Suppose, for contradiction, that $b_i \ne c_i$ for some $i$.

Case 1: $a_i \ge 0$. Then $b_i = a_i$, so the $i$th term of the squared distance $d(a, b)^2$, namely $(b_i - a_i)^2$, is zero. Let $c'$ be $c$ with $c_i$ replaced by $b_i$; note that $c'$ is still nonnegative, since $c'_i = a_i \ge 0$. Then $$ d(c, a)^2 - d(c', a)^2 = (c_i - a_i)^2 - (b_i - a_i)^2 = (c_i - a_i)^2 > 0, $$ where the inequality is strict because $c_i \ne b_i = a_i$. Hence $c$ was not the closest point (because $c'$ is strictly closer).

Case 2: $a_i < 0$. Then $b_i = 0$, and since $c_i \ge 0$ with $c_i \ne b_i$, we have $c_i > 0$. Replacing the $i$th entry of $c$ with $0$ decreases the squared distance by $(c_i - a_i)^2 - a_i^2 > 0$, so again $c$ was not the closest point. In both cases we reach a contradiction, so $c = b$.
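The coordinate-wise replacement argument above can be checked numerically. Here is a minimal sketch in Python with NumPy (the variable names are mine, and the sequences are finitely supported for simplicity): replacing any single coordinate of a nonnegative candidate $c$ with the corresponding coordinate of $b = \max(a, 0)$ never increases the distance to $a$.

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.normal(size=10)          # an arbitrary (finitely supported) sequence
b = np.maximum(a, 0.0)           # the conjectured projection
c = np.abs(rng.normal(size=10))  # any other nonnegative candidate

for i in range(len(a)):
    c_prime = c.copy()
    c_prime[i] = b[i]            # replace the i-th entry, as in the proof
    # the squared distance never increases after the replacement
    assert np.sum((c_prime - a) ** 2) <= np.sum((c - a) ** 2) + 1e-12
```

This mirrors the case analysis: if $a_i \ge 0$ the replaced term becomes zero, and if $a_i < 0$ it becomes $a_i^2$, the smallest value attainable by a nonnegative entry.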

Second answer

Similar to John Hughes's answer, but with a different presentation.

The subset of nonnegative sequences is a closed convex subset of the Hilbert space $\ell^2$, so for each $a \in \ell^2$ there exists a unique projection of $a$ onto it.

Let $a\in \ell^2$ be fixed and note that for any $b\in \ell^2$, $\|b-a \|^2 = \sum_{k=0}^\infty (b_k-a_k)^2$.

For $\alpha\in \mathbb R$, define $f_\alpha:[0,\infty)\to \mathbb R, \beta\mapsto (\beta-\alpha)^2$ and note that the minimizer of $f_\alpha$ is $\max(0,\alpha)$.

As a result,

$$\sum_{k=0}^\infty (b_k-a_k)^2 = \sum_{k=0}^\infty f_{a_k}(b_k)\geq \sum_{k=0}^\infty f_{a_k}\big(\max(0,a_k)\big).$$

This lower bound (which is independent of $b$) is attained exactly when $b_k = \max(0, a_k)$ for all $k$, proving that this $b$ is the projection of $a$.
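The lower-bound argument can also be sketched numerically (Python with NumPy; names and sizes are my own choices): no nonnegative candidate achieves a smaller squared distance to $a$ than $b = \max(a, 0)$.

```python
import numpy as np

rng = np.random.default_rng(1)
a = rng.normal(size=50)
b = np.maximum(a, 0.0)           # claimed projection onto the nonnegative cone

# the per-coordinate lower bound: sum over k of f_{a_k}(max(0, a_k))
lower = np.sum((b - a) ** 2)

# no randomly drawn nonnegative candidate beats the bound
for _ in range(1000):
    cand = np.abs(rng.normal(size=50))
    assert np.sum((cand - a) ** 2) >= lower
```

Each term $(cand_k - a_k)^2 = f_{a_k}(cand_k)$ is minimized over $[0, \infty)$ at $\max(0, a_k)$, so the bound holds term by term.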