This is a follow-up to a question previously asked here on the geometric growth rate of generalised means. For $p>0$ and non-negative numbers $a=(a_1,\dots,a_n)$, let $M_p(a)$ denote the generalised or Hölder mean, defined as \begin{align} M_p(a) = \left(\frac1n\sum_{i=1}^n a_i^p \right)^{1/p}, && a_1,\dots,a_n\ge 0,\ p\in\mathbb{R}^+. \end{align} It is a well-known property of generalised means that for any given $a$, the value of $M_p(a)$ increases monotonically in $p$, and strictly so if the $a_i$'s are not all equal. The following question attempts to quantify this growth when they are not necessarily equal:
Question: For a fixed $a$, consider the sequence $m_k := M_{2^k}(a)$ of generalised means along the powers $p=1,2,4,\dots$. Is there some constant $c_n$ such that $$ c_n \left(\frac{m_{k+2}}{m_{k+1}}\right)^2\le \frac{m_{k+1}}{m_k}$$ for all $k\ge 0$, where $0 < c_n < 1$ crucially does not depend on $a$?
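Before diving in, a quick numerical sketch may help (a throwaway helper; the names `holder_mean` and `ratio_constant` are mine, not from the original question). It computes the largest admissible $c$ for a given vector at a given $k$:

```python
import numpy as np

def holder_mean(a, p):
    """Generalised (Hölder) mean M_p(a) = ((1/n) * sum_i a_i^p)^(1/p)."""
    a = np.asarray(a, dtype=float)
    return np.mean(a ** p) ** (1.0 / p)

def ratio_constant(a, k=0):
    """Largest c with c * (m_{k+2}/m_{k+1})^2 <= m_{k+1}/m_k, where m_j = M_{2^j}(a)."""
    m0, m1, m2 = (holder_mean(a, 2 ** j) for j in (k, k + 1, k + 2))
    return (m1 / m0) / (m2 / m1) ** 2

print(ratio_constant([1.0, 1.0, 1.0]))   # all-equal case: exactly 1
print(ratio_constant([1.0, 0.5, 0.25]))  # unequal entries: strictly below 1
```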
Possible approach

Observations:
- If the bound holds for some universal $c_n$ at $k=0$, the remaining inequalities follow automatically by applying the base case to the powered vector $a_i\mapsto a_i^{2^k}$. Indeed, this substitution turns $m_j$ into $m_{j+k}^{2^k}$, so the base case yields $c_n^{1/2^k}\left(m_{k+2}/m_{k+1}\right)^2\le m_{k+1}/m_k$, which implies the required inequality since $c_n\le c_n^{1/2^k}$ for $0<c_n<1$. (Note that scaling $a$ by a constant changes nothing, as the means are homogeneous.)
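This reduction rests on the identity $M_{2^j}(a^{2^k}) = M_{2^{j+k}}(a)^{2^k}$, where $a^{2^k}$ denotes entrywise powering. A minimal numerical check (helper name mine):

```python
import numpy as np

def holder_mean(a, p):
    # M_p(a) = ((1/n) * sum_i a_i^p)^(1/p)
    return np.mean(np.asarray(a, dtype=float) ** p) ** (1.0 / p)

a = np.array([1.0, 0.7, 0.3, 0.05])
for k in (1, 2, 3):
    for j in (0, 1, 2):
        # Raising every a_i to the 2^k-th power shifts the index of m_j:
        # M_{2^j}(a^(2^k)) = M_{2^(j+k)}(a)^(2^k)
        lhs = holder_mean(a ** 2 ** k, 2 ** j)
        rhs = holder_mean(a, 2 ** (j + k)) ** 2 ** k
        assert np.isclose(lhs, rhs)
```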
Defining $f(x) = \log M_{1/x}(a)$, we can restate the $k=0$ inequality in terms of $f$: \begin{align} \log c_n + 2 (f(1/4)-f(1/2)) \le f(1/2) - f(1) \end{align} We can also state it in terms of slopes: \begin{align} 2\log(c_n) +\rho(1/4,1/2) \le \rho(1/2,1), && \rho(a,b):=\frac{f(a)-f(b)}{b-a} \end{align} Since $f(x)$ is convex (as proven here), the mean value theorem gives $\rho(1/4,1/2)-\rho(1/2,1)\le f'(1)-f'(1/4)$, so a sufficient condition can be obtained via derivatives: $$ f'(1)-f'(1/4) \le O(1), $$ which in turn would follow from a bound on the second derivative, $$ f''(x) = O(1),\qquad x\in[1/4,1]. $$ From the second-derivative derivation, $$ f''(x) = \frac{1}{x^3}\left[ \frac{\sum a_i^{1/x} (\log a_i)^2}{\sum a_i^{1/x}} - \left(\frac{\sum a_i^{1/x} \log a_i}{\sum a_i^{1/x}} \right)^2 \right] $$ This is $1/x^3$ times a weighted variance of the $\log a_i$'s, and using $\operatorname{Var}(Y)=\frac12 E[(Y-Y')^2]$ for i.i.d. copies $Y,Y'$ it can be rearranged into $$ f''(x) = \frac{1}{2x^3}\left[\frac{\sum_{i,j\le n} a_i^{1/x}a_j^{1/x} (\log(a_i/a_j))^2}{\sum_{i,j\le n} a_i^{1/x}a_j^{1/x}}\right] $$ The bracketed term is a weighted average of the $(\log(a_i/a_j))^2$ terms with weights $a_i^{1/x} a_j^{1/x}$, so it is at most the maximum squared log-ratio: $$ f''(x)\le \frac{1}{2x^3}\max_{i,j}\left(\log(a_i/a_j)\right)^2 \le 32\max_{i,j}\left(\log(a_i/a_j)\right)^2,\qquad x\in[1/4,1]. $$ However, this introduces a dependency on $a$, and hence is a weaker result, most likely because bounding an average by the maximum is a crude approximation.
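The two expressions for $f''$ can be cross-checked numerically against a finite-difference approximation (a throwaway sketch with names of my own choosing; note the factor $\tfrac12$ that appears when the weighted variance is written in pairwise form):

```python
import numpy as np

def f(x, a):
    # f(x) = log M_{1/x}(a) = x * log((1/n) * sum_i a_i^(1/x))
    return x * np.log(np.mean(np.asarray(a, dtype=float) ** (1.0 / x)))

def f2_variance(x, a):
    # f''(x) = (1/x^3) * weighted variance of log(a_i), weights w_i = a_i^(1/x)
    a = np.asarray(a, dtype=float)
    w, la = a ** (1.0 / x), np.log(a)
    mu = np.sum(w * la) / np.sum(w)
    return (np.sum(w * la ** 2) / np.sum(w) - mu ** 2) / x ** 3

def f2_pairwise(x, a):
    # Pairwise form: Var(Y) = (1/2) E[(Y - Y')^2] for i.i.d. copies Y, Y',
    # hence the extra factor 1/2 in front of the double sum.
    a = np.asarray(a, dtype=float)
    w, la = a ** (1.0 / x), np.log(a)
    d2 = (la[:, None] - la[None, :]) ** 2
    return np.sum(np.outer(w, w) * d2) / (2 * np.sum(w) ** 2 * x ** 3)

a, x, h = [1.0, 0.6, 0.2], 0.5, 1e-4
numeric = (f(x + h, a) - 2 * f(x, a) + f(x - h, a)) / h ** 2
assert np.isclose(f2_variance(x, a), f2_pairwise(x, a))
assert abs(numeric - f2_variance(x, a)) < 1e-3
```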
Possible value of $c_n$
As is common with inequalities involving means, it is best to check the all-equal and all-but-one-zero cases:
- If $a_1=\dots=a_n$, both ratios equal $1$ and the inequality holds trivially with $c_n=1$.
- If only one value is non-zero, $a_1>0,\ a_2=\dots=a_n=0$, then $M_p(a)=a_1 n^{-1/p}$, so $M_4(a)/M_2(a) = n^{1/4}$ and $M_2(a)/M_1(a)=n^{1/2}$, and the inequality again holds with equality for $c_n=1$.
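Both boundary cases are easy to confirm numerically (helper name mine):

```python
import numpy as np

def holder_mean(a, p):
    # M_p(a) = ((1/n) * sum_i a_i^p)^(1/p)
    return np.mean(np.asarray(a, dtype=float) ** p) ** (1.0 / p)

n = 5
spike = [1.0] + [0.0] * (n - 1)   # the all-but-one-zero case
# Here M_p(spike) = n^(-1/p), so both sides of the inequality equal n^(1/2):
assert np.isclose(holder_mean(spike, 4) / holder_mean(spike, 2), n ** 0.25)
assert np.isclose(holder_mean(spike, 2) / holder_mean(spike, 1), n ** 0.5)
assert np.isclose((holder_mean(spike, 4) / holder_mean(spike, 2)) ** 2,
                  holder_mean(spike, 2) / holder_mean(spike, 1))
```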
At this point, I guessed that $c_n=1$ would hold uniformly for all possible $a_i$'s. Thanks to the comment by @Vincenzo in the original post, this turned out to be false. Based on that counter-example, I'll investigate the ratios in the special case of a geometric sequence for the $a_i$'s.
Consider $a_i=\alpha^{i-1}$ for $i=1,\dots,n$ and some $\alpha\in (0,1]$ to be determined later. Since the ratios are scale-invariant, starting with $a_1=1$ suffices. Note that $\alpha=1$ corresponds to the all-equal case, while $\alpha\to 0^+$ degenerates to the all-but-one-zero case. Thus it is interesting to ask whether the optimal $\alpha$ (the one leading to the smallest $c_n$) lies strictly between the two extremes.
We have, for $\alpha<1$, \begin{align} M_p(a) = \left(\frac1n\sum_{i=0}^{n-1} \alpha^{ip}\right)^{1/p} = n^{-1/p}\left(\frac{1-\alpha^{np}}{1-\alpha^p}\right)^{1/p}. \end{align} Writing $S_p:=\frac{1-\alpha^{np}}{1-\alpha^p}$, so that $M_p(a)=n^{-1/p}S_p^{1/p}$, we get \begin{align} \frac{M_{2p}(a)}{M_p(a)}=n^{1/2p}\left(\frac{S_{2p}}{S_p^2}\right)^{1/2p} = n^{1/2p}\left(\frac{(1+\alpha^{np})(1-\alpha^{p})}{(1+\alpha^{p})(1-\alpha^{np})}\right)^{1/2p}, \end{align} and therefore \begin{align} c_n(\alpha):=\frac{M_2(a)}{M_1(a)}\left(\frac{M_4(a)}{M_2(a)}\right)^{-2}=\left(\frac{(1+\alpha^{n})^2(1+\alpha^2)}{(1+\alpha)^2(1+\alpha^{2n})}\right)^{1/2}. \end{align} If $n$ is sufficiently large and $\alpha<1$, approximating $1+\alpha^{n}$ and $1+\alpha^{2n}$ by $1$, this can be approximated by \begin{align}\tag{*} c_n(\alpha)&\approx \frac{\sqrt{1+\alpha^2}}{1+\alpha}. \end{align} The right-hand side of $(*)$ is strictly decreasing on $(0,1)$: it tends to $1$ as $\alpha\to 0^+$ (recovering the all-but-one-zero case) and to $1/\sqrt{2}\approx 0.7071$ as $\alpha\to 1^-$. Moreover, since $(1+\alpha^n)^2\ge 1+\alpha^{2n}$, the exact value satisfies $c_n(\alpha)\ge \frac{\sqrt{1+\alpha^2}}{1+\alpha}>\frac{1}{\sqrt 2}$ for every $n$ and $\alpha\in(0,1)$. On the other hand, the exact expression equals $1$ at both extremes ($\alpha\to 0^+$ and $\alpha=1$), so for each finite $n$ the minimizing $\alpha$ is interior; as $n\to\infty$ it drifts toward $1$ and \begin{align} \min_{0<\alpha<1} c_n(\alpha) = \frac{1}{\sqrt 2} + o(1), \end{align} with the $o(1)$ term strictly positive and presumably decaying in $n$. So, for large $n$, any universal constant must satisfy $c_n\lesssim 0.7071$.
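Algebra like this is easy to get wrong, so here is a numerical cross-check of the closed form $c_n(\alpha)=\left(\frac{(1+\alpha^{n})^2(1+\alpha^2)}{(1+\alpha)^2(1+\alpha^{2n})}\right)^{1/2}$ for the geometric sequence against a direct computation from the means (function names are mine):

```python
import numpy as np

def holder_mean(a, p):
    # M_p(a) = ((1/n) * sum_i a_i^p)^(1/p)
    return np.mean(np.asarray(a, dtype=float) ** p) ** (1.0 / p)

def c_direct(alpha, n):
    # c for the geometric vector a_i = alpha^(i-1), computed from the means.
    a = alpha ** np.arange(n)
    m1, m2, m4 = (holder_mean(a, p) for p in (1, 2, 4))
    return (m2 / m1) / (m4 / m2) ** 2

def c_closed(alpha, n):
    # Closed form: ((1+a^n)^2 (1+a^2) / ((1+a)^2 (1+a^(2n))))^(1/2)
    num = (1 + alpha ** n) ** 2 * (1 + alpha ** 2)
    den = (1 + alpha) ** 2 * (1 + alpha ** (2 * n))
    return (num / den) ** 0.5

for alpha in (0.3, 2 / 3, 0.95):
    for n in (3, 10, 100):
        assert np.isclose(c_direct(alpha, n), c_closed(alpha, n))

# The large-n approximation sqrt(1 + a^2)/(1 + a) tends to 1/sqrt(2) as a -> 1:
print(c_closed(0.999, 10 ** 5))
```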
Clearly, all of these conclusions were drawn for the very special case of a geometric sequence, which only yields an upper bound on any admissible $c_n$. Still, the fact that the geometric family interpolates smoothly between the two equality cases, with its extremal profile sitting strictly between them, suggests (but does not prove) that decay profiles significantly slower or faster than exponential in $i$, e.g. $a_i = i\alpha + \beta$ or $a_i= \alpha^{i^2}$, are unlikely to force a smaller $c_n$.
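To probe that guess, one can at least compute the constant for a few decay profiles (an exploratory sketch; the profile choices and parameters are my own, and the only assertion made is $0<c\le 1$, which is guaranteed by the Hölder bound $M_2^3\le M_1M_4^2$):

```python
import numpy as np

def holder_mean(a, p):
    # M_p(a) = ((1/n) * sum_i a_i^p)^(1/p)
    return np.mean(np.asarray(a, dtype=float) ** p) ** (1.0 / p)

def ratio_constant(a):
    m1, m2, m4 = (holder_mean(a, p) for p in (1, 2, 4))
    return (m2 / m1) / (m4 / m2) ** 2

n, alpha = 50, 0.9
i = np.arange(n)
profiles = {
    "geometric a_i = alpha^i": alpha ** i,
    "linear a_i = 1 - i/n": 1.0 - i / n,
    "super-geometric a_i = alpha^(i^2)": alpha ** (i ** 2),
}
for name, a in profiles.items():
    c = ratio_constant(a)
    assert 0.0 < c <= 1.0   # always holds, by Hölder: M_2^3 <= M_1 * M_4^2
    print(f"{name}: c = {c:.4f}")
```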
Clearly, for any positive random variable $X$ we have $E(X^2)\leq E(X)^{2/3}E(X^4)^{1/3}$ by Hölder's inequality (applied to $X^2 = X^{2/3}\cdot X^{4/3}$ with conjugate exponents $3/2$ and $3$). Equivalently $M_2^3 \le M_1 M_4^2$, i.e. $m_1/m_0 \le (m_2/m_1)^2$ always, so your attempts with $c_n=1$ are futile.
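This Hölder step is easy to confirm numerically (a throwaway spot-check on random positive vectors):

```python
import numpy as np

# Spot-check E[X^2] <= E[X]^(2/3) * E[X^4]^(1/3), i.e. Hölder applied to
# X^2 = X^(2/3) * X^(4/3) with conjugate exponents 3/2 and 3, where E is
# the empirical mean over the entries of a random positive vector.
rng = np.random.default_rng(0)
for _ in range(1000):
    x = rng.uniform(0.01, 10.0, size=rng.integers(2, 30))
    lhs = np.mean(x ** 2)
    rhs = np.mean(x) ** (2 / 3) * np.mean(x ** 4) ** (1 / 3)
    assert lhs <= rhs * (1 + 1e-12)   # tolerance only for floating-point rounding
```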