Bounding the distance between $L_\infty$ and $L_2$ for a continuous function


Consider a set of continuous (or even differentiable) functions $f_i(x)$, all defined for $x\in [a,b]$, for $i=1,\ldots,N$.

  1. Can one define a uniform constant $c$ (which may depend on the $f_i$) such that $\|f_i\|_\infty\le c\|f_i\|_2$ for all $i=1,\ldots,N$?
  2. Likewise, can one define a uniform constant $c$ (which may depend on the $f_i$) such that $\|f_i\|_\infty\ge c\|f_i\|_2$ for all $i=1,\ldots,N$?

EDITED: (1) was changed per the note that it was false (as stated)

Thanks!


There are 2 best solutions below


Since the question has changed I edit my answer:

The answer is now even more trivial:

  1. Notice that $\|f\|_2 = 0$ iff $f=0$ almost everywhere (a.e.) iff $\|f\|_\infty=0$.

  2. For any $f_i=0$ a.e., both inequalities hold trivially, so you may assume wlog that no $f_i$ is $0$ a.e.

  3. So finally, you have to compare two sets of strictly positive numbers; hence $c_{1} = \inf_i \|f_i\|_{\infty}/\|f_i\|_2$ and $c_2 = \inf_i \|f_i\|_2 / \|f_i\|_\infty$ will do the job: $c_1$ answers (2), and $c_2^{-1}$ answers (1). Be careful: $c_1^{-1} \neq c_2$ in general ;)

This is the sharpest way to do it (i.e., with the best possible constants). If the index set is infinite, the first inequality cannot hold in full generality (e.g., triangular spikes $f_n$ of height $1$ and width $1/n$ have $\|f_n\|_\infty=1$ but $\|f_n\|_2\to 0$), whereas the second still does; see eliya's answer.
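Not part of the original answer, but here is a quick numerical sanity check of these constants for a finite family (a sketch using NumPy; the interval $[0,1]$ and the three sample functions are arbitrary choices):

```python
import numpy as np

# Illustrative finite family of continuous functions on [a, b] = [0, 1].
a, b = 0.0, 1.0
n = 10_001
x = np.linspace(a, b, n)
dx = (b - a) / (n - 1)
fs = [np.sin(np.pi * x), x**2, np.exp(x) - 1]

def sup_norm(v):
    # ||f||_inf = sup of |f| on [a, b]
    return np.max(np.abs(v))

def l2_norm(v):
    # Trapezoidal-rule approximation of (∫_a^b |f|^2 dx)^(1/2)
    w = v**2
    return np.sqrt(dx * (w.sum() - 0.5 * w[0] - 0.5 * w[-1]))

ratios = [sup_norm(f) / l2_norm(f) for f in fs]
c1 = min(ratios)                                 # answers (2): ||f_i||_inf >= c1 ||f_i||_2
c2 = min(l2_norm(f) / sup_norm(f) for f in fs)   # 1/c2 answers (1): ||f_i||_inf <= (1/c2) ||f_i||_2

for f in fs:
    assert sup_norm(f) >= c1 * l2_norm(f) - 1e-12
    assert sup_norm(f) <= (1 / c2) * l2_norm(f) + 1e-12
print(c1, 1 / c2)
```

Note that $c_1^{-1}$ and $c_2$ differ here, as the answer warns: $c_1$ is the smallest of the ratios $\|f_i\|_\infty/\|f_i\|_2$, while $c_2^{-1}$ is the largest.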


For (2):

$\|f_i\|_2=\left(\int_a^b|f_i|^2\,dx\right)^{\frac{1}{2}}\le\left(\Big(\sup_{x\in[a,b]}|f_i(x)|^2\Big)\int_a^b 1\,dx\right)^{\frac{1}{2}}=\|f_i\|_\infty\sqrt{b-a}$

Thus $\|f_i\|_\infty\ge c\|f_i\|_2$, where $c=\frac{1}{\sqrt{b-a}}$. Note that $c$ depends only on the interval, not on the $f_i$, so this works for arbitrary (even infinite) families.
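A quick numerical illustration of this bound (a sketch using NumPy; the interval $[-1,2]$ and the function $f(x)=x\cos x$ are arbitrary choices):

```python
import numpy as np

# Check ||f||_2 <= sqrt(b - a) * ||f||_inf on [a, b] = [-1, 2]
# for a sample continuous function f(x) = x cos(x).
a, b = -1.0, 2.0
n = 20_001
x = np.linspace(a, b, n)
dx = (b - a) / (n - 1)
f = x * np.cos(x)

sup_norm = np.max(np.abs(f))
w = f**2
# Trapezoidal-rule approximation of (∫_a^b |f|^2 dx)^(1/2)
l2_norm = np.sqrt(dx * (w.sum() - 0.5 * w[0] - 0.5 * w[-1]))

assert l2_norm <= np.sqrt(b - a) * sup_norm
print(l2_norm, np.sqrt(b - a) * sup_norm)
```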