Let $\Omega$ be a probability space (or a finite measure space) and let $f : \Omega \to \mathbb{R}$ be a function in $L^p(\Omega)$ with $f>0$. We suppose that $\|f\|_{L^1(\Omega)}=1$.
- What is the derivative $\frac{d}{d p}\big|_{p=1} \|f\|_p$ of the function $p \mapsto \|f\|_p$, defined on $[1,+\infty)$, at the point $p=1$?
My feeling is that the answer might be $-\int_{\Omega} f\log f$.
- Same question with $\ell^p$ in place of $L^p(\Omega)$: take a sequence $(a_k)$ in $\ell^1$ with $a_k >0$ for every $k$ and $\|a\|_{\ell^1}=1$.
I believe that there is a difference between the two cases, since for $q<p$ we have $L^p \subset L^q$ in the first case and $\ell^q \subset \ell^p$ in the second.
We need to assume that $$\int_{\Omega}|\log(f(\omega))|f(\omega)~\mathrm d \mu(\omega) < \infty.$$ To see that this does not follow from the stated assumptions, let $\Omega := [e,\infty)$ with Lebesgue measure and set $f(x) := \frac{1}{x \log(x)^2}$; then $\|f\|_1 = 1$, but $\int_{\Omega} \log(f)\, f~\mathrm d\mu = -\infty$. (Here $\mu$ is an infinite measure; on a finite measure space the negative part is automatically integrable, since $t\log t \ge -1/e$, but the positive part can still diverge.) The arguments below will show that this assumption is necessary for the derivative in question to be finite.
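As a rough numerical illustration of this counterexample (the substitution $u = \log x$ and the trapezoid rule are just convenient choices here, not part of the argument), one can check that this $f$ has unit mass while $\int f \log x~\mathrm dx$, the dominant part of $\int f\,|\log f|~\mathrm dx$, diverges:

```python
import math

def trapezoid(g, a, b, n=400000):
    """Plain trapezoid rule for the integral of g over [a, b]."""
    h = (b - a) / n
    s = 0.5 * (g(a) + g(b))
    for i in range(1, n):
        s += g(a + i * h)
    return s * h

# For f(x) = 1/(x log^2 x) on [e, oo), substitute u = log x (dx = x du):
#   integral of f dx          ->  integral of u^{-2} du over [1, oo)  = 1,
#   integral of f * log x dx  ->  integral of u^{-1} du over [1, oo)  = oo.
# Since |log f| = log x + 2 log log x on this domain, the second integral
# is the dominant part of  integral of f |log f| dx.
mass = trapezoid(lambda u: u ** -2, 1.0, 1e4)                      # close to 1
tails = [trapezoid(lambda u: 1.0 / u, 1.0, U) for U in (1e2, 1e3, 1e4)]
print(mass)    # unit mass, up to truncation at u = 1e4
print(tails)   # grows like log U: no finite limit
```

The truncated tails grow like $\log U$, so the integral has no finite limit, while the mass converges to $1$.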
Let $g(p) = \|f\|_p^p.$ For any $y>0$, the function $p \mapsto y^p$ is convex on $(0,\infty)$, and therefore the difference quotient $h \mapsto \frac{y^{p+h} - y^{p}}{h}$ is increasing in $h$ for any $p>0$. Fixing $p>0$ and using the monotone convergence theorem (the quotients decrease monotonically to the integrand as $h \searrow 0$), we obtain \begin{align*} \lim_{h \searrow 0} \frac{g(p+h)-g(p)}{h} &= \lim_{h\searrow 0} \int_{\Omega} \frac{f(\omega)^{p+h} - f(\omega)^{p}}{h}~\mathrm d \mu(\omega) \\ &=\int_{\Omega} \lim_{h\searrow 0}\frac{f(\omega)^{p+h} - f(\omega)^{p}}{h}~\mathrm d \mu(\omega) \\ &= \int_{\Omega} \log(f(\omega)) f(\omega)^p ~\mathrm{d} \mu(\omega). \end{align*} Applying the same reasoning with $h \nearrow 0$, we obtain that $g$ is differentiable with $$ g'(p) = \int_{\Omega} \log(f(\omega)) f(\omega)^p ~\mathrm{d} \mu(\omega)$$ for all $p > 0$ where the right-hand side is finite. Note that by assumption this holds for $p = 1$, and as the map $p \mapsto \int_{\Omega} \log(f(\omega)) f(\omega)^p ~\mathrm{d} \mu(\omega)$ is continuous by virtue of the monotone convergence theorem, the right-hand side above is finite for all $p>0$.

Hence $g$ is differentiable on $(0,\infty)$, and by the chain rule $h(p) = g(p)^{1/p} = \|f\|_p$ is differentiable on $(0,\infty)$ as well, with $$ g'(p) = \frac{d}{dp} \big(h(p)^p\big) = p \cdot h(p)^{p-1} \cdot h'(p) + h(p)^p \cdot \log(h(p)).$$ By assumption we have $h(1) =1$, so evaluating at $p=1$ gives $h'(1) = g'(1)$, which yields $$h'(1) = \int_{\Omega} \log(f(\omega)) f(\omega) ~\mathrm{d}\mu(\omega). $$ The above deliberations hold for any measure $\mu$, so this covers both cases.
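As a sanity check (not part of the argument; the exponential density and the geometric sequence below are chosen purely for illustration, because both give closed forms), one can compare a central finite difference of $p \mapsto \|f\|_p$ at $p=1$ with $\int f \log f$ in the continuous case and with $\sum_k a_k \log a_k$ in the discrete case:

```python
import math

def lp_norm_exponential(p, n=4000, X=40.0):
    """||f||_p for f(x) = exp(-x) on [0, oo) with Lebesgue measure,
    by the trapezoid rule on [0, X].  Closed form: (1/p)^(1/p)."""
    h = X / n
    s = 0.5 * (1.0 + math.exp(-p * X))
    for i in range(1, n):
        s += math.exp(-p * i * h)
    return (s * h) ** (1.0 / p)

def lp_norm_geometric(p, K=200):
    """||a||_p for the sequence a_k = 2^{-k}, k >= 1, so that ||a||_1 = 1."""
    return sum(2.0 ** (-k * p) for k in range(1, K + 1)) ** (1.0 / p)

eps = 1e-5
d_cont = (lp_norm_exponential(1 + eps) - lp_norm_exponential(1 - eps)) / (2 * eps)
d_disc = (lp_norm_geometric(1 + eps) - lp_norm_geometric(1 - eps)) / (2 * eps)

# Predicted values of  integral f log f  resp.  sum a_k log a_k:
#   f = exp(-x):  integral_0^oo e^{-x} (-x) dx = -1
#   a_k = 2^{-k}: sum_k 2^{-k} (-k log 2)      = -2 log 2
print(d_cont, d_disc)
```

Both finite differences come out negative here because $f \le 1$ resp. $a_k < 1$ pointwise; when $\mu$ is a probability measure, Jensen's inequality instead gives $\int f \log f~\mathrm d\mu \ge 0$, matching the fact that $p \mapsto \|f\|_p$ is then nondecreasing.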