Let $x=(x_i)$ be a probability measure on $\{1,\ldots,n\}$. Suppose $1<p<\infty$. The Rényi entropy of $x$ is $$ H^p(x)=\frac{1}{1-p}\log \sum_{i} x_i^p. $$
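As a quick sanity check on the definition (a minimal sketch; the distribution `x` below is an arbitrary example), one can verify numerically that $H^p(x)$ recovers the classical entropy in the limit $p\to 1$:

```python
import math

def renyi_entropy(x, p):
    """Rényi entropy H^p(x) = (1/(1-p)) * log(sum_i x_i^p), for p != 1."""
    return math.log(sum(xi**p for xi in x)) / (1 - p)

def shannon_entropy(x):
    """Classical entropy H(x) = -sum_i x_i * log(x_i)."""
    return -sum(xi * math.log(xi) for xi in x if xi > 0)

x = [0.5, 0.3, 0.2]  # example probability vector (assumption)
print(renyi_entropy(x, 2))         # collision entropy (p = 2)
print(renyi_entropy(x, 1 + 1e-6))  # approaches the Shannon entropy as p -> 1
print(shannon_entropy(x))
```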
Does there exist an analogous formula expressing $H^p(x)$ as a derivative of a norm of $x$ (an $\ell_{p,q}$ norm, an Orlicz norm, ...)?
Remark: I know that for the classical entropy $H(x)=-\sum_{i} x_i\log x_i$, we have $$ H(x)=-\left. \frac{d}{dp}\right|_{p=1}\Vert x \Vert^p_{\ell_p}, $$ since $\frac{d}{dp}\sum_i x_i^p=\sum_i x_i^p\log x_i$, which at $p=1$ equals $\sum_i x_i\log x_i=-H(x)$.
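The derivative identity is easy to check by a central finite difference (note the sign: at $p=1$, $\frac{d}{dp}\Vert x\Vert_{\ell_p}^p=\sum_i x_i\log x_i=-H(x)$). A minimal numerical sketch, with an arbitrary example distribution:

```python
import math

def norm_p_pow(x, p):
    """||x||_{ell_p}^p = sum_i x_i^p."""
    return sum(xi**p for xi in x)

x = [0.5, 0.3, 0.2]  # example probability vector (assumption)
h = 1e-6
# central finite difference of p -> ||x||_p^p at p = 1
deriv = (norm_p_pow(x, 1 + h) - norm_p_pow(x, 1 - h)) / (2 * h)
shannon = -sum(xi * math.log(xi) for xi in x)
print(-deriv, shannon)  # the two values agree up to O(h^2)
```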