Lower bound for the arithmetic mean based on quadratic mean


It is known that the arithmetic mean of a list of non-negative real numbers is less than or equal to the quadratic mean (root mean square) of the same list: $$\frac{x_1+x_2+\cdots+x_n}{n} \le \sqrt{\frac{x_1^2+x_2^2+\cdots+x_n^2}{n}}$$ (More about mean inequalities)

My question: given the number $n$ of elements in the list and the quadratic mean ($QM$) of that list, can we find a lower bound for the arithmetic mean ($AM$) of that list?

More precisely, what is the greatest lower bound for the $AM$ that we can find? For example, it is obvious that $AM \ge 0$. It is also easy to show that $AM \ge \dfrac{QM}{\sqrt{n}}$: since the cross terms $x_ix_j$ are non-negative, $$(x_1+x_2+\cdots+x_n)^2 \ge x_1^2+x_2^2+\cdots+x_n^2 \quad \Rightarrow\\ x_1+x_2+\cdots+x_n \ge \sqrt{x_1^2+x_2^2+\cdots+x_n^2} \quad \Rightarrow\\ \frac{x_1+x_2+\cdots+x_n}{n} \ge \frac{1}{\sqrt{n}} \sqrt{\frac{x_1^2+x_2^2+\cdots+x_n^2}{n}} \quad \Rightarrow\\ AM \ge \frac{QM}{\sqrt{n}}.$$ Is there a better (greater) lower bound for the $AM$ if we know $n$ and $QM$?
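As a quick numerical sanity check (a sketch, not part of the question itself), one can verify the bound $AM \ge QM/\sqrt{n}$ on random non-negative lists:

```python
# Numerical sanity check of AM >= QM / sqrt(n) for random
# non-negative lists (illustrative sketch only).
import math
import random

random.seed(0)

def am(xs):
    """Arithmetic mean."""
    return sum(xs) / len(xs)

def qm(xs):
    """Quadratic mean (root mean square)."""
    return math.sqrt(sum(x * x for x in xs) / len(xs))

for _ in range(10_000):
    n = random.randint(1, 10)
    xs = [random.uniform(0, 100) for _ in range(n)]
    # AM >= QM / sqrt(n), up to floating-point tolerance
    assert am(xs) >= qm(xs) / math.sqrt(n) - 1e-9
```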

Three answers follow.

Accepted answer:

The bound $AM \ge \frac{QM}{\sqrt{n}}$ is tight: equality is achieved, for instance, when one of the $x_i$ equals $1$ and the rest equal $0$.
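One can confirm this equality case numerically (a small sketch, not part of the answer's argument):

```python
# Checking the equality case x = (1, 0, ..., 0):
# here AM = 1/n and QM = 1/sqrt(n), so AM = QM / sqrt(n).
import math

n = 5
xs = [1.0] + [0.0] * (n - 1)
am = sum(xs) / n
qm = math.sqrt(sum(x * x for x in xs) / n)
assert math.isclose(am, qm / math.sqrt(n))  # equality holds
```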

Answer 2:

Suppose we want to find the maximal constant $C = C(n)$ for which the inequality $$\left(\sum_{k=1}^nx_k\right)^2\geq C\sum_{k=1}^nx_k^2$$ holds for all non-negative $x_k$.

Taking $x_2=x_3=\cdots=x_n=0$, the inequality reduces to $x_1^2\geq Cx_1^2$.

Thus $C\leq1$; and since $\left(\sum_{k=1}^nx_k\right)^2\geq\sum_{k=1}^nx_k^2$ holds for non-negative $x_k$ (the cross terms are non-negative), $C=1$ is the best constant.
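A quick numerical illustration (a sketch, not part of the argument) of why $C=1$ is best: the ratio $\left(\sum x_k\right)^2 / \sum x_k^2$ is always at least $1$ for non-negative inputs, and equals $1$ exactly when only one coordinate is non-zero:

```python
# Numerical check that C = 1 is the best constant in
# (sum x)^2 >= C * (sum of x^2) for non-negative x.
import random

random.seed(1)

def ratio(xs):
    """(sum x)^2 / (sum x^2), always >= 1 for non-negative x."""
    s = sum(xs)
    return s * s / sum(x * x for x in xs)

for _ in range(10_000):
    xs = [random.uniform(0, 1) for _ in range(5)]
    assert ratio(xs) >= 1 - 1e-12

# Ratio equals exactly 1 when a single coordinate is non-zero:
assert abs(ratio([3.0, 0.0, 0.0, 0.0, 0.0]) - 1.0) < 1e-12
```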

There is the following inequality:

For any $x_i\geq0$ and $n\geq2$: $$\sum_{i=1}^nx_i\leq\sqrt{\frac{\sum\limits_{1\leq i<j\leq n}x_ix_j}{\binom{n}{2}}}+(n-1)\sqrt{\frac{\sum\limits_{i=1}^nx_i^2}{n}}.$$

It's stronger because $$\frac{\sum\limits_{i=1}^nx_i}{n}\geq\sqrt{\frac{\sum\limits_{1\leq i<j\leq n}x_ix_j}{\binom{n}{2}}}.$$ For specific values of $n$ we can obtain much stronger inequalities.

Answer 3:

$\def\vec{\boldsymbol}\def\R{\mathbb{R}}$Usually, saying that an inequality is tight means that some particular constant in it cannot be improved. E.g. in @AlexRavsky's answer, “$\text{AM} \geqslant \dfrac{\text{QM}}{\sqrt{n}}$ is tight” means that the inequality is true and that the constant $\dfrac{1}{\sqrt{n}}$ cannot be replaced by a larger one, so what they proved is the following proposition:

$$\min_{\substack{\vec{x} \in \R_{\geqslant 0}^n\\\vec{x} ≠ \vec{0}}} \frac{\|\vec{x}\|_1}{\|\vec{x}\|_2} = 1,$$

where $\|\vec{x}\|_a = \left(\sum\limits_{k = 1}^n |x_k|^a \right)^{\frac{1}{a}}$. This, however, does not exclude the possibility that there exists a non-linear function $f$ of QM such that $\text{AM} \geqslant f(\text{QM})$ and $f(t) \geqslant \dfrac{t}{\sqrt{n}}$ for $t \geqslant 0$.

The following reasoning deals with the general scenario, but the result coincides with the linear bound because AM and QM are homogeneous functions of $x_1, \cdots, x_n$ of the same degree.

Proposition: For any $a > 1$ and $t \geqslant 0$,$$ \min_{\substack{\vec{x} \in \R_{\geqslant 0}^n\\\|\vec{x}\|_a = t}} \|\vec{x}\|_1 = t, $$ so the best function $f_a \colon [0, +\infty) \to \R$ satisfying $\|\vec{x}\|_1 \geqslant f_a(\|\vec{x}\|_a)$ for all $\vec{x} \in \R_{\geqslant 0}^n$ is $f_a(t) = t$.

Proof: A lemma is needed.

Lemma: $(x + y)^a \geqslant x^a + y^a$ for all $x, y \geqslant 0$ and $a > 1$.

Proof: Define $g(t) = (t + 1)^a - t^a$ for $t \geqslant 0$. Since $g'(t) = a \left((t + 1)^{a - 1} - t^{a - 1}\right) \geqslant 0$, the function $g$ is non-decreasing, so $g(t) \geqslant g(0) = 1$ for $t \geqslant 0$.

Now for $x, y > 0$,$$ g\left( \frac{x}{y} \right) = \left( \frac{x}{y} + 1 \right)^a - \left( \frac{x}{y} \right)^a \geqslant 1 \Longrightarrow (x + y)^a \geqslant x^a + y^a. $$ And the inequality is obviously true if either $x = 0$ or $y = 0$. $\square$
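The lemma can also be spot-checked numerically (an illustrative sketch, not a substitute for the proof):

```python
# Numerical spot-check of the lemma (x + y)^a >= x^a + y^a
# for a > 1 and x, y >= 0.
import random

random.seed(3)

for _ in range(10_000):
    a = random.uniform(1.0, 5.0)
    x = random.uniform(0, 10)
    y = random.uniform(0, 10)
    assert (x + y) ** a >= x ** a + y ** a - 1e-9
```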

Now return to the proposition. First, the minimum is attained since $\{\vec{x} \in \R_{\geqslant 0}^n \mid \|\vec{x}\|_a = t\}$ is compact in $\R^n$, i.e. closed and bounded.

On the one hand, taking $\vec{x} = (t, 0, \cdots, 0)$ shows that $\min\limits_{\substack{\vec{x} \in \R_{\geqslant 0}^n\\\|\vec{x}\|_a = t}} \|\vec{x}\|_1 \leqslant t$. On the other hand, the lemma implies that for $\vec{x} \in \R_{\geqslant 0}^n$ with $\|\vec{x}\|_a = t$,$$ \|\vec{x}\|_1^a = \left( \sum_{k = 1}^n x_k \right)^a \geqslant \left( \sum_{k = 1}^{n - 1} x_k \right)^a + x_n^a \geqslant \cdots \geqslant \sum_{k = 1}^n x_k^a = \|\vec{x}\|_a^a = t^a, $$ thus $\|\vec{x}\|_1 \geqslant t$ and $\min\limits_{\substack{\vec{x} \in \R_{\geqslant 0}^n\\\|\vec{x}\|_a = t}} \|\vec{x}\|_1 \geqslant t$. Therefore, $\min\limits_{\substack{\vec{x} \in \R_{\geqslant 0}^n\\\|\vec{x}\|_a = t}} \|\vec{x}\|_1 = t$. $\square$
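The proposition can be illustrated numerically as follows (a sketch: random non-negative vectors are rescaled to have $\|\vec{x}\|_a = t$, and the $1$-norm never drops below $t$, with the value $t$ attained at the corner $(t, 0, \cdots, 0)$):

```python
# Numerical illustration of the proposition: among non-negative
# vectors with ||x||_a = t (here a = 2, t = 3), the 1-norm is
# minimized at the value t, attained at (t, 0, ..., 0).
import random

random.seed(4)

def norm(xs, a):
    """a-norm of a vector of non-negative entries."""
    return sum(x ** a for x in xs) ** (1 / a)

a, t = 2.0, 3.0
for _ in range(10_000):
    xs = [random.uniform(0, 1) for _ in range(6)]
    scale = t / norm(xs, a)
    xs = [scale * x for x in xs]   # now ||xs||_a == t
    assert norm(xs, 1) >= t - 1e-9  # 1-norm never below t

# The minimum value t is attained at the corner (t, 0, ..., 0):
corner = [t, 0.0, 0.0]
assert abs(norm(corner, 1) - t) < 1e-12
```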


As an example in which a non-linear function is the best lower bound, consider the inequality$$ x^2 + y^2 + 2 \geqslant f(x + y) \quad \forall (x, y) \in \R^2. $$ An obvious linear choice for $f$ is $f(t) = 2t$ since$$ (x^2 + y^2 + 2) - 2(x + y) = (x - 1)^2 + (y - 1)^2 \geqslant 0, $$ but the best bound is $f(t) = \dfrac{t^2}{2} + 2 \geqslant 2t$ because for any $t \in \R$,$$ (x^2 + y^2 + 2)\bigr|_{x + y = t} = x^2 + (t - x)^2 + 2 = 2\left( x - \frac{t}{2} \right)^2 + \frac{t^2}{2} + 2 \geqslant \frac{t^2}{2} + 2, $$ and the equality is attained when $x = y = \dfrac{t}{2}$.
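This non-linear example is also easy to verify numerically (a sketch, not part of the derivation above):

```python
# Numerical check that f(t) = t^2/2 + 2 bounds x^2 + y^2 + 2
# from below on the line x + y = t, with equality at x = y = t/2.
import random

random.seed(5)

for _ in range(10_000):
    t = random.uniform(-10, 10)
    x = random.uniform(-10, 10)
    y = t - x  # enforce the constraint x + y = t
    assert x * x + y * y + 2 >= t * t / 2 + 2 - 1e-9

# Equality at x = y = t/2:
t = 4.0
x = y = t / 2
assert abs((x * x + y * y + 2) - (t * t / 2 + 2)) < 1e-12
```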