Show that there exist two different best approximations


Let $F$ be a linear space with a norm $\lVert \cdot \rVert$ which is not strictly convex. Show that there is a function $f \in F$ and a subspace $S \subset F$ such that $f$ has two different best approximations $s_1$ and $s_2$ in $S$, i.e.,

$$\mathscr{M}(f,S) = \inf_{s \in S} \lVert s - f\rVert = \lVert s_1 - f\rVert = \lVert s_2 - f\rVert \quad \text{where } s_1 \neq s_2.$$

Just in case there are different definitions of strictly convex norms, this is the definition I mean:

Let $F$ be a linear space with a norm $\lVert \cdot \rVert$. The norm is said to be strictly convex if the set $\{ f \in F : \lVert f\rVert \leq 1 \}$ is strictly convex.
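Equivalently, if I am not mistaken, this amounts to the midpoint condition

$$\lVert f_1 \rVert = \lVert f_2 \rVert = 1,\; f_1 \neq f_2 \quad\Longrightarrow\quad \Bigl\lVert \tfrac{1}{2}(f_1 + f_2) \Bigr\rVert < 1,$$

so if the norm is not strictly convex there exist distinct unit vectors $f_1 \neq f_2$ with $\lVert f_1 + f_2 \rVert = 2$.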

I'm a bit lost on this problem, even with the following hint:

Hint: Choose suitable, different $f_1,f_2 \in F$ with $\lVert f_1 \rVert=\lVert f_2 \rVert=1$ and $\lVert f_1 + f_2 \rVert=2$. Now consider $f=\frac{1}{2}(f_1 + f_2)$ and $S=\{ \alpha(f_1 -f_2) \mid \alpha \in \Bbb R \} \subset F$.

Any help/tips appreciated. Thanks!

Edit: My approach:

$\lVert s_{\alpha} - f\rVert=\lVert \alpha(f_1-f_2) - \frac{1}{2}(f_1+f_2)\rVert \leq |\alpha| \lVert f_1 - f_2\rVert + \frac{1}{2} \lVert f_1+f_2\rVert =|\alpha| \lVert f_1 - f_2\rVert +1$.

so $\lVert s_{\alpha} - f\rVert \leq |\alpha| \lVert f_1 - f_2\rVert +1$. The right-hand side is minimized only when $\alpha=0$. How can there be different best approximations then?

Another approach:

$\lVert s_{\alpha} - f\rVert=\lVert (\frac{1}{2}-\alpha)f_1 +(\frac{1}{2} + \alpha)f_2\rVert \leq |\frac{1}{2} - \alpha|+ |\frac{1}{2} + \alpha|$, which equals $1$ for any $\alpha \in [-1/2,1/2]$. Where do I go from here?

On BEST ANSWER

Set $\zeta(\alpha) := \lVert s_{\alpha} - f\rVert$, where $f = \frac{1}{2}(f_1 + f_2)$ and $s_{\alpha} = \alpha(f_1 - f_2)$ with $f_1$ and $f_2$ as in the hint. We want to show that the following minimization problem has more than one solution:

$$ \min \zeta(\alpha) \quad \text{s.t.} \quad \alpha \in \Bbb R. $$

This problem has an optimal solution: $S$ is a one-dimensional subspace, and the distance from a point to a finite-dimensional subspace is always attained.

First, note that $\zeta : \Bbb R \to \Bbb R$ is a convex function, because it is the composition of a convex function (the norm) with the affine map $\alpha \mapsto s_{\alpha} - f$.
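Spelled out: for $\lambda \in [0,1]$ and $\alpha, \beta \in \Bbb R$, affinity of $\alpha \mapsto s_{\alpha} - f$ gives

$$\zeta(\lambda \alpha + (1-\lambda)\beta) = \bigl\lVert \lambda (s_{\alpha} - f) + (1-\lambda)(s_{\beta} - f) \bigr\rVert \leq \lambda\, \zeta(\alpha) + (1-\lambda)\, \zeta(\beta)$$

by the triangle inequality and homogeneity of the norm.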

Secondly, $\zeta(\alpha) = 1$ on the interval $[-\frac{1}{2}, \frac{1}{2}]$. This follows from the convexity of $\zeta$ together with $\zeta(-\frac{1}{2}) = \zeta(0) = \zeta(\frac{1}{2}) = 1$: convexity gives $\zeta \leq 1$ on the interval, and if $\zeta$ dipped below $1$ at some $\alpha_0 \in (-\frac{1}{2}, \frac{1}{2})$, then $0$ would lie strictly between $\alpha_0$ and one of the endpoints $\pm\frac{1}{2}$, and convexity would force $\zeta(0) < 1$, a contradiction. Hence $\zeta$ is constant equal to $1$ on $[-\frac{1}{2}, \frac{1}{2}]$.
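Indeed, the three values come straight from the choice of $f_1, f_2$ in the hint:

$$\zeta\bigl(\tfrac{1}{2}\bigr) = \lVert -f_2 \rVert = 1, \qquad \zeta\bigl(-\tfrac{1}{2}\bigr) = \lVert -f_1 \rVert = 1, \qquad \zeta(0) = \tfrac{1}{2}\lVert f_1 + f_2 \rVert = 1.$$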

Thirdly, show that any convex function from $\Bbb R$ to $\Bbb R$ which is constant on a nondegenerate interval attains its global minimum on that entire interval. (This is really easy.)
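A sketch of this last step: suppose $\zeta$ is convex and equal to a constant $c$ on $[a,b]$ with $a < b$, and that $\zeta(x) < c$ for some $x > b$ (the case $x < a$ is symmetric). Write $b = t a + (1-t)x$ with $t \in (0,1)$; then

$$\zeta(b) \leq t\,\zeta(a) + (1-t)\,\zeta(x) < c,$$

a contradiction. So $c$ is the global minimum of $\zeta$, attained on all of $[a,b]$. Applied with $[a,b] = [-\frac{1}{2}, \frac{1}{2}]$ and $c = 1$, every $s_{\alpha}$ with $\alpha \in [-\frac{1}{2}, \frac{1}{2}]$ is a best approximation of $f$ in $S$; in particular $s_{1/2} = \frac{1}{2}(f_1 - f_2) \neq -\frac{1}{2}(f_1 - f_2) = s_{-1/2}$ (because $f_1 \neq f_2$), which gives two different best approximations.

If a concrete instance helps: take $F = \Bbb R^2$ with the sup norm (which is not strictly convex), $f_1 = (1,1)$ and $f_2 = (1,-1)$, so $f = (1,0)$ and $S$ is the $y$-axis. The following quick numerical sanity check is only an illustration of this particular choice, not part of the proof:

```python
import numpy as np

# Concrete check in F = R^2 with the sup norm (not strictly convex).
f1 = np.array([1.0, 1.0])
f2 = np.array([1.0, -1.0])

def sup_norm(v):
    return np.max(np.abs(v))

# The hypotheses of the hint hold for this choice.
assert sup_norm(f1) == 1 and sup_norm(f2) == 1 and sup_norm(f1 + f2) == 2

f = 0.5 * (f1 + f2)   # f = (1, 0)

# zeta(alpha) = || alpha*(f1 - f2) - f ||_inf, sampled on a grid of alpha values.
alphas = np.linspace(-1.0, 1.0, 2001)
zeta = np.array([sup_norm(a * (f1 - f2) - f) for a in alphas])

attained = alphas[np.isclose(zeta, zeta.min())]
print(zeta.min(), attained.min(), attained.max())   # 1.0 -0.5 0.5 (up to float rounding)
```

Every $\alpha \in [-\frac{1}{2}, \frac{1}{2}]$ attains the distance $1$, exactly as the argument above predicts.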