derivative of the $L^p$-norm with respect to $p$


Let $\Omega$ be a probability space (or a finite measure space) and let $f : \Omega \to \mathbb{R}$ be a function in $L^p(\Omega)$ with $f>0$. We suppose that $\|f\|_{L^1(\Omega)}=1$.

  1. What is the derivative $\frac{d}{dp}\big|_{p=1} \|f\|_p$ at $p=1$ of the function $p \mapsto \|f\|_p$ defined on $[1,+\infty)$?

My feeling is that the answer is maybe $-\int_{\Omega} f\log f$.

  2. Same question with $\ell^p$ in place of $L^p(\Omega)$: take a sequence $(a_k)$ in $\ell^1$ with $a_k >0$ for every $k$ and $\|a\|_{\ell^1}=1$.

I believe the two cases differ, since for $q<p$ we have $L^p \subset L^q$ in the first case (finite measure) but $\ell^q \subset \ell^p$ in the second.


**Best answer**

We need to assume that $$\int_{\Omega}|\log(f(\omega))|\,f(\omega)~\mathrm d \mu(\omega) < \infty.$$ To see that this does not follow from the stated assumptions, let $\Omega := [e,\infty)$ with Lebesgue measure and set $f(x) := \frac{1}{x \log(x)^2}$; then $\|f\|_1 = 1$, while $\int_\Omega f \log f~\mathrm d\mu = -\infty$. The arguments below will show that this assumption is necessary for the derivative in question to be finite.
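As a numerical sanity check on this counterexample (a sketch I am adding, not part of the answer): the substitution $t = \log x$ turns $\int_e^M f\,\mathrm dx$ into $\int_1^{\log M} t^{-2}\,\mathrm dt$ and $\int_e^M f\log f\,\mathrm dx$ into $-\int_1^{\log M} (t + 2\log t)\,t^{-2}\,\mathrm dt$, so one can watch the mass converge to $1$ while the entropy integral keeps decreasing:

```python
import math

# Counterexample: Omega = [e, inf), f(x) = 1/(x log(x)^2).
# After t = log x:  f dx = dt/t^2  and  f log f dx = -(t + 2 log t)/t^2 dt.

def truncated(T, n=100_000):
    """Midpoint-rule values of the mass and entropy integrals over t in [1, T]."""
    mass = ent = 0.0
    w = (T - 1.0) / n
    for k in range(n):
        t = 1.0 + (k + 0.5) * w
        mass += w / t**2                              # -> 1 - 1/T
        ent -= w * (t + 2.0 * math.log(t)) / t**2     # -> -infinity as T grows
    return mass, ent

for T in (10.0, 100.0, 1000.0):
    print(T, truncated(T))
# mass approaches 1, while the entropy integral decreases without bound
```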

Let $g(p) = \|f\|_p^p$. For any $y>0$, the function $p \mapsto y^p$ is convex on $(0,\infty)$, and therefore the difference quotient $h \mapsto \frac{y^{p+h} - y^{p}}{h}$ is increasing in $h$ for any $p>0$. Fixing $p>0$ and using the monotone convergence theorem, we obtain \begin{align*} \lim_{h \searrow 0} \frac{g(p+h)-g(p)}{h} &= \lim_{h\searrow 0} \int_{\Omega} \frac{f(\omega)^{p+h} - f(\omega)^{p}}{h}~\mathrm d \mu(\omega) \\ &=\int_{\Omega} \lim_{h\searrow 0}\frac{f(\omega)^{p+h} - f(\omega)^{p}}{h}~\mathrm d \mu(\omega) \\ &= \int_{\Omega} \log(f(\omega)) f(\omega)^p ~\mathrm{d} \mu(\omega). \end{align*} Applying the same reasoning with $h \nearrow 0$, we conclude that $g$ is differentiable with $$ g'(p) = \int_{\Omega} \log(f(\omega)) f(\omega)^p ~\mathrm{d} \mu(\omega)$$ for all $p > 0$ where the right-hand side is finite. By assumption this holds for $p = 1$, and since the map $p \mapsto \int_{\Omega} \log(f(\omega)) f(\omega)^p ~\mathrm{d} \mu(\omega)$ is continuous by virtue of the monotone convergence theorem, the right-hand side above is finite for all $p>0$.

Hence $g$ is differentiable on $(0,\infty)$, and by the chain rule $h(p) = g(p)^{1/p} = \|f\|_p$ is differentiable on $(0,\infty)$ as well, with $$ g'(p) = \frac{d}{dp} \big(h(p)^p\big) = p \cdot h(p)^{p-1} \cdot h'(p) + h(p)^p \cdot \log(h(p)).$$ By assumption we have $h(1) = 1$, and therefore $h'(1) = g'(1)$, which yields $$h'(1) = \int_{\Omega} \log(f(\omega)) f(\omega) ~\mathrm{d}\mu(\omega). $$ The above deliberations hold for any measure $\mu$, so this covers both cases.
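As a concrete illustration (my addition, not part of the answer): take $\Omega=[0,1]$ with Lebesgue measure and $f(x)=2x$, so that $\|f\|_1=1$ and $\int_\Omega f\log f~\mathrm d\mu=\int_0^1 2x\ln(2x)\,\mathrm dx=\ln 2-\tfrac12$. A central difference of $p\mapsto\|f\|_p$ at $p=1$ should reproduce this value:

```python
import math

# Example: Omega = [0,1] with Lebesgue measure, f(x) = 2x, so ||f||_1 = 1.
# Claimed derivative at p = 1:  integral of f log f = ln 2 - 1/2.

def lp_norm(p, n=200_000):
    # midpoint rule for ( integral_0^1 (2x)^p dx )^(1/p)
    s = sum((2 * ((k + 0.5) / n)) ** p for k in range(n)) / n
    return s ** (1.0 / p)

h = 1e-5
numeric = (lp_norm(1 + h) - lp_norm(1 - h)) / (2 * h)  # central difference
exact = math.log(2) - 0.5

print(numeric, exact)  # both approximately 0.19315
```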

**Another answer**

Maybe this helps for question 1.

By definition, for $f>0$, $\|f\|_p=\left(\int_{\Omega} f(\omega)^{p}\,\mathrm d\mu(\omega)\right)^{1/p}$. Introducing $p_f(x)$ as the probability density of the random variable $f$ (assuming it exists), this can be rewritten, via a change of variables, as an integral over the real line:

$\|f\|_p=\left(\int_{0}^{+\infty} p_f(x)\,x^{p}\,\mathrm dx\right)^{1/p}$ [1]

and now one can use standard real analysis techniques to get the derivative of an integral dependent on a parameter.

Taking the log:

$\frac{\partial}{\partial p}\ln \|f\|_p=\frac{1}{\|f\|_p}\frac{\partial}{\partial p}\|f\|_p$ [2]

And then, using [1]:

$\frac{\partial}{\partial p}\ln \|f\|_p=-\frac{1}{p^2}\ln \int_{0}^{+\infty} p_f(x)\,x^{p}\,\mathrm dx+\frac{1}{p}\,\frac{\int_{0}^{+\infty} p_f(x)\,x^{p}\ln(x)\,\mathrm dx}{\int_{0}^{+\infty} p_f(x)\,x^{p}\,\mathrm dx}$ [3]

Combining [3] and [2] and evaluating at $p=1$, where the hypothesis $\|f\|_1=1$ makes the first term vanish and the denominator equal $1$:

$\frac{\partial}{\partial p}\|f\|_p\Big|_{p=1}=\int_{0}^{+\infty} p_f(x)\,x \ln(x)\,\mathrm dx=\int_{\Omega} f(\omega)\ln(f(\omega))\,\mathrm d\mu(\omega)$

This approach of course requires that a density exists, and some passages should be justified more carefully to be fully formal, but I think it leads to the right result.
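As a quick check of this density route (a sketch I am adding, with $f$ chosen for illustration): take $f(x)=2x$ on $[0,1]$ with Lebesgue measure, so the density of $f$ is $p_f(y)=\tfrac12$ on $[0,2]$ and $\|f\|_1=1$. The derivative predicted at $p=1$, $\int_0^2 y\ln(y)\,p_f(y)\,\mathrm dy$, evaluates to $\ln 2-\tfrac12$:

```python
import math

# Density-route check: f(x) = 2x on ([0,1], Lebesgue), so p_f(y) = 1/2 on [0, 2].
# Formula [3] at p = 1 predicts the derivative  integral_0^2 y ln(y) p_f(y) dy.

def density_integral(n=200_000):
    # midpoint rule for integral_0^2 y ln(y) * (1/2) dy
    w = 2.0 / n
    s = 0.0
    for k in range(n):
        y = (k + 0.5) * w
        s += 0.5 * y * math.log(y) * w
    return s

print(density_integral(), math.log(2) - 0.5)  # both approximately 0.19315
```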