Assume that the continuous random variable $X$ has a distribution (in a closed form expression) with differential entropy $h(X)$.
Q) Is it then true for every continuous distribution that the log-standard deviation is additively separable from the entropy expression, i.e. $$h(X) = \log \sigma + C,$$ where $\sigma$ is the standard deviation of the distribution and $C$ may depend on the distribution's other parameters? If the claim is false for some continuous distributions, then for exactly which family of distributions does it hold?
I emphasize that when $\sigma$ depends on a shape parameter of the distribution, the shape parameter must be fixed, so that varying $\sigma$ only varies the "spread" of the distribution. Let me illustrate with a few examples.
For a Gaussian random variable $X$, we see \begin{align} h(X)=\log(\sigma \sqrt{2 \pi e}) = \log \sigma + C, \end{align} where $C = \log \sqrt{2 \pi e}$.
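As a quick numerical sanity check of the Gaussian case (a sketch assuming SciPy, whose `entropy()` method for continuous distributions returns the differential entropy in nats):

```python
import numpy as np
from scipy.stats import norm

# Check h(X) = log(sigma) + C for a Gaussian, with C = log(sqrt(2*pi*e)).
# SciPy's .entropy() returns differential entropy in nats, matching natural logs.
C = np.log(np.sqrt(2 * np.pi * np.e))
for sigma in (0.5, 1.0, 3.0):
    h = norm(scale=sigma).entropy()
    assert np.isclose(h, np.log(sigma) + C)
```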
For an Erlang random variable $X$ with shape $k$ and rate $\lambda$, we see \begin{align} h(X) &= (1-k)\psi(k) + \log \frac{\Gamma(k)}{\lambda} + k \\ &= \log \frac{\Gamma(k) \sigma}{\sqrt{k}} + (1-k)\psi(k) + k \\ &= \log { \sigma}+ C, \end{align} where $\psi$ is the digamma function, I substituted $\sigma = \sqrt{k}/\lambda$, and $C = \log \frac{\Gamma(k)}{\sqrt{k}} + (1-k)\psi(k) + k$. Notice how I used the fact that $k$, a shape parameter, is fixed, while $\lambda$ may vary when $\sigma$ is varied.
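The same decomposition can be verified numerically (again a sketch assuming SciPy; the Erlang distribution is a gamma distribution with integer shape, and `scale = 1/lambda`):

```python
import numpy as np
from scipy.stats import gamma
from scipy.special import gammaln, digamma

# Check the Erlang decomposition h(X) = log(sigma) + C, where
# sigma = sqrt(k)/lam and C = log(Gamma(k)/sqrt(k)) + (1 - k)*digamma(k) + k.
k, lam = 4, 2.0                      # shape k fixed; rate lam varies with sigma
sigma = np.sqrt(k) / lam
C = gammaln(k) - 0.5 * np.log(k) + (1 - k) * digamma(k) + k
h = gamma(a=k, scale=1.0 / lam).entropy()
assert np.isclose(h, np.log(sigma) + C)
```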
If $\sigma$ only affects the "spread" of $X$, as you require, then you can write $X = \mu + \sigma Y$, where the distribution of $Y$ depends on neither $\mu$ nor $\sigma$; in other words, $X$ belongs to a location-scale family. In that case, since differential entropy is translation invariant and satisfies $h(aY) = h(Y) + \log a$ for $a > 0$, you indeed have $h(X) = h(\sigma Y) = h(Y) + \log \sigma$.
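This location-scale property is easy to check numerically (a sketch assuming SciPy, whose `loc`/`scale` parameters implement exactly the transformation $X = \mu + \sigma Y$; note SciPy's `scale` need not equal the standard deviation, but any constant factor between them is absorbed into $C$):

```python
import numpy as np
from scipy.stats import norm, expon, t

# h(mu + s*Y) = h(Y) + log(s), regardless of the base distribution of Y.
s, mu = 2.5, 7.0
assert np.isclose(norm.entropy(loc=mu, scale=s), norm.entropy() + np.log(s))
assert np.isclose(expon.entropy(loc=mu, scale=s), expon.entropy() + np.log(s))
assert np.isclose(t.entropy(5, loc=mu, scale=s), t.entropy(5) + np.log(s))
```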