Convergence with respect to a specific norm implied by convergence in measure

Let $\mathcal{M}$ be the class of measurable functions on $[a,b]$ which are finite almost everywhere (a.e.).

For $f \in \mathcal{M}$ define $\rho(f)= \int_a^b \frac{\lvert f \rvert}{\lvert f \rvert +1}\,dx.$
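For intuition, here is a quick numerical sketch (my own example, not part of the question): $\rho$ can be approximated by a midpoint Riemann sum, and for $f_n = n\cdot\chi_{[0,1/n]}$ on $[0,1]$ — a sequence converging to $0$ in measure but not in $L^1$ — one computes $\rho(f_n) = \frac{1}{n+1} \to 0$:

```python
# Approximate rho(f) = ∫_a^b |f|/(|f|+1) dx with a midpoint Riemann sum.
# (Illustrative only; the names rho and f_n are my own, not from the post.)
def rho(f, a=0.0, b=1.0, steps=200_000):
    h = (b - a) / steps
    total = 0.0
    for i in range(steps):
        x = a + (i + 0.5) * h          # midpoint of the i-th subinterval
        v = abs(f(x))
        total += v / (v + 1.0) * h
    return total

def f_n(n):
    # f_n = n on [0, 1/n] and 0 elsewhere: -> 0 in measure, yet ∫|f_n| dx = 1.
    return lambda x: float(n) if x <= 1.0 / n else 0.0

# Exact value: rho(f_n) = (1/n) * n/(n+1) = 1/(n+1), which tends to 0.
rhos = [rho(f_n(n)) for n in (1, 10, 100)]
print(rhos)
```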

Show that

$ \quad \forall \epsilon >0 \quad \lim_{n \rightarrow \infty} m(\{x \in [a,b] : |f_n(x)- f(x)| \geq \epsilon \})=0 \quad \Rightarrow \quad \rho(f_n-f) \rightarrow 0 $ as $n \rightarrow \infty$. (Note that $\rho(f_n-f)$ is a sequence of real numbers, so the conclusion is ordinary convergence, not a.e. convergence.)

Actually the converse is also true, and that direction I managed to show, but the direction above is more difficult. I tried to use the fact that convergence in measure implies the existence of a subsequence that converges almost everywhere, but I could not get anywhere from that.

How do I start the proof of the above statement, and can you give me a path to handle the rest?

On BEST ANSWER

Consider the set $E = E_n = \{ x \in [a,b] : |f_n(x) - f(x)| \ge \epsilon \}$. Since $\frac{t}{t+1} \le \min(1, t)$ for $t \ge 0$, we get $\rho(f_n - f) \le \int_{E} 1\,dm + \int_{E^c} |f_n - f| \,dm$.
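The hint above can be completed as follows (a sketch): bound the integrand by $1$ on $E_n$, and on $E_n^c$ use $\frac{t}{t+1} \le t$ together with $|f_n - f| < \epsilon$:

```latex
\begin{align*}
\rho(f_n - f)
  &\le \int_{E_n} 1 \, dm + \int_{E_n^c} |f_n - f| \, dm \\
  &\le m(E_n) + \epsilon \, (b - a).
\end{align*}
```

Convergence in measure gives $m(E_n) \to 0$, so $\limsup_{n \to \infty} \rho(f_n - f) \le \epsilon (b-a)$; since $\epsilon > 0$ was arbitrary, $\rho(f_n - f) \to 0$.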