Rate of convergence of $\frac{E^2[X^n]}{ E[X^{n-1}] E[X^{n+1}]}$ as $n \to \infty$


Let $X$ be a non-negative discrete random variable with $0\le X\le B$, and define $$f(n)=\frac{E^2[X^n]}{E[X^{n-1}]E[X^{n+1}]}$$ for $n \ge 1$.

I am interested in the rate of convergence of $f(n)$ as $n \to \infty$. In other words, is there a way to quantify how fast $f(n)$ approaches its limit, as a function of $n$ and some “property” of $X$?

A few observations:

  1. Upper bound: By Cauchy-Schwarz we have that $f(n) \le 1$.
  2. Exact limit: It can be shown that $\lim\limits_{n \to \infty} f(n)=1$. So, we are trying to refine this limit.
  3. Trivial example: If $X$ is supported on $\{0,B\}$, then $f(n)= 1$ for all $n \ge 1$.
  4. Connection to strong Cauchy-Schwarz: Essentially, this is a question about sharper versions of the Cauchy-Schwarz inequality. Writing out the correction term for the normalized variables (each of which has unit second moment, so the cross term is $\sqrt{f(n)}$) gives $$ \sqrt{f(n)}=1-\frac{1}{2} E \left[ \left| \frac{X^\frac{n-1}{2}}{ \sqrt{E[X^{n-1}]}} -\frac{X^\frac{n+1}{2}}{ \sqrt{E[X^{n+1}]}}\right|^2 \right]. $$

I would like to answer this under minimal assumptions but, if needed, we can assume $X$ has a point mass at $B$.
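As a quick numerical illustration of the question (a sketch only; the three-point distribution below is an assumption chosen for the example, not part of the question), one can compute $f(n)$ exactly in rational arithmetic and watch the gap $1 - f(n)$ shrink:

```python
from fractions import Fraction

# Illustrative three-point distribution (values and weights are
# assumptions chosen for this sketch, not part of the question).
values = [Fraction(1), Fraction(2), Fraction(3)]
probs = [Fraction(1, 4), Fraction(1, 4), Fraction(1, 2)]

def moment(k):
    """E[X^k], computed exactly in rational arithmetic."""
    return sum(p * v**k for p, v in zip(probs, values))

def f(n):
    """f(n) = E[X^n]^2 / (E[X^{n-1}] E[X^{n+1}])."""
    return moment(n) ** 2 / (moment(n - 1) * moment(n + 1))

for n in (1, 5, 10, 20):
    print(n, float(f(n)), float(1 - f(n)))
```

For this distribution the gap $1 - f(n)$ decays geometrically, consistent with the finite-support answers below.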


Three answers follow.

Answer 1 (score 14)

Try checking out Problem 7.2 in Steele's The Cauchy-Schwarz Master Class. I'll write the statement of this problem:

Show that if $f: [0, \infty) \rightarrow [0, \infty)$ is a continuous nonincreasing function which is differentiable on $(0, \infty)$, then for any pair of parameters $0 < \alpha, \beta < \infty$, the integral \begin{align*} I = \int_0^\infty x^{\alpha + \beta} f(x) dx \end{align*} satisfies the bound \begin{align*} I^2 \le \left\{1 - \left(\frac{\alpha - \beta}{\alpha + \beta + 1}\right)^2\right\}\int_0^\infty x^{2\alpha} f(x) dx\int_0^\infty x^{2\beta} f(x) dx \end{align*}

Yes, this problem technically deals with continuous distributions and assumes a nonincreasing density $f$. But looking through the proof in the book, I fail to see where $f$ being nonincreasing is used; it seems that $\int_0^\infty x^{2\max(\alpha, \beta)} f(x)\, dx < \infty$ suffices from my reading of Steele's proof. Assuming the result carries over to the discrete case, we can take $\alpha = \frac{1}{2}(n-1)$, $\beta = \frac{1}{2}(n+1)$ to obtain \begin{align*} \frac{E^2[X^n]}{ E[X^{n-1}] E[X^{n+1}]} \le 1 - \frac{1}{(n+1)^2}. \end{align*} Therefore the ratio approaches 1 at a rate that is at most quadratic in $n$ (EDIT: thanks to River Li, "at least" corrected to "at most"). Note, however, that the two-point example in the question attains $f(n) = 1$, which would violate this bound, so the monotonicity assumption apparently cannot be dropped for general discrete distributions.
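As a sanity check of Steele's bound in a case where its hypotheses clearly hold, one can take the nonincreasing density $f(x) = e^{-x}$, whose moments are factorials: with $\alpha = \frac{1}{2}(n-1)$, $\beta = \frac{1}{2}(n+1)$ the ratio works out to $n/(n+1)$, which indeed sits below $1 - 1/(n+1)^2$. (This numerical sketch is mine, not part of Steele's text.)

```python
from math import factorial

# For the nonincreasing density f(x) = exp(-x) on [0, inf),
#   int_0^inf x^k exp(-x) dx = k!.
# With alpha = (n-1)/2 and beta = (n+1)/2:
#   I = n!,  I_{2alpha} = (n-1)!,  I_{2beta} = (n+1)!,
# so I^2 / (I_{2alpha} I_{2beta}) = n / (n+1).
for n in range(1, 8):
    ratio = factorial(n) ** 2 / (factorial(n - 1) * factorial(n + 1))
    bound = 1 - 1 / (n + 1) ** 2
    print(n, ratio, bound)
    assert ratio <= bound  # the bound holds, here with slack
```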

Answer 2 (score 7)

Here is an idea.

Let $m\ge 2$, let $0 \le a_1 < a_2 < \cdots < a_m$, and let $\mathrm{Pr}(X = a_j) = p_j$ for $1 \le j \le m$ with $p_1+p_2+\cdots + p_m = 1$. We have $$f(n) = \frac{(1+A)^2}{(1+B)(1+C)}$$ where \begin{align} A = \frac{\sum_{j=1}^{m-1} p_j a_j^n}{p_m a_m^n}, \ B = \frac{\sum_{j=1}^{m-1} p_j a_j^{n-1}}{p_m a_m^{n-1}}, \ C = \frac{\sum_{j=1}^{m-1} p_j a_j^{n+1}}{p_m a_m^{n+1}}. \end{align} To proceed, we need the following result; its proof is given at the end.

Fact 1: Let $x, y, z \ge 0$ and $yz \ge x^2$. Then $$\frac{(1+x)^2}{(1+y)(1+z)} \ge 1 - (y + z - 2x).$$

Let us proceed. By Cauchy-Bunyakovsky-Schwarz inequality, we have $BC \ge A^2$. By Fact 1, we have $$1 - (B + C - 2A) \le f(n) \le 1$$ that is $$1 - \frac{\sum_{j=1}^{m-1} p_j a_j^{n-1}(a_m - a_j)^2 }{p_m a_m^{n+1}} \le f(n) \le 1. \tag{1}$$ Remark: Actually, $1 - (B + C - 2A)$ is the Taylor expansion of $\frac{(1+A)^2}{(1+B)(1+C)}$ around $A=0, B=0, C=0$.

We have \begin{align} 1 - \frac{\sum_{j=1}^{m-1} p_j a_j^{n-1}(a_m - a_j)^2 }{p_m a_m^{n+1}} &\ge 1 - \frac{\sum_{j=1}^{m-1} p_j a_{m-1}^{n-1}(a_m - 0)^2 }{p_m a_m^{n+1}}\\ &= 1 - \frac{1-p_m}{p_m} \left(\frac{a_{m-1}}{a_m}\right)^{n-1}. \end{align} Thus, we have $$1 - \frac{1-p_m}{p_m} \left(\frac{a_{m-1}}{a_m}\right)^{n-1} \le f(n) \le 1. \tag{2}$$
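A quick numerical sketch of the two-sided bound $(2)$ (the particular values $a_j$ and weights $p_j$ below are assumptions chosen for illustration):

```python
# Check the bound (2):
#   1 - (1 - p_m)/p_m * (a_{m-1}/a_m)^{n-1}  <=  f(n)  <=  1
# for an illustrative three-point distribution.
a = [1.0, 2.0, 4.0]   # a_1 < a_2 < a_3 = a_m
p = [0.2, 0.3, 0.5]   # p_1 + p_2 + p_3 = 1

def moment(k):
    return sum(pj * aj**k for pj, aj in zip(p, a))

for n in range(1, 31):
    fn = moment(n) ** 2 / (moment(n - 1) * moment(n + 1))
    lower = 1 - (1 - p[-1]) / p[-1] * (a[-2] / a[-1]) ** (n - 1)
    assert lower <= fn <= 1 + 1e-12
print("bound (2) holds for n = 1..30")
```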


Proof of Fact 1: After clearing the denominators, it suffices to prove that $$(y+z + yz - x)^2 - yz (1+y)(1+z) \ge 0.$$ From $yz \ge x^2$, we have $y + z \ge 2\sqrt{yz} \ge 2x$ and thus $0 \le y + z + yz - \sqrt{yz} \le y + z + yz - x$. It therefore suffices to prove that $$(y+z + yz - \sqrt{yz})^2 - yz (1+y)(1+z) \ge 0,$$ which simplifies to $$(y+z + yz)(\sqrt{y} - \sqrt{z})^2 \ge 0.$$ The last inequality is evident, so we are done.
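A quick random spot-check of Fact 1 (purely illustrative; the algebraic proof above is what actually establishes it):

```python
import random

# Spot-check of Fact 1: if x, y, z >= 0 and y*z >= x^2, then
#   (1+x)^2 / ((1+y)(1+z)) >= 1 - (y + z - 2x).
random.seed(0)
for _ in range(10_000):
    y = random.uniform(0.0, 5.0)
    z = random.uniform(0.0, 5.0)
    x = random.uniform(0.0, (y * z) ** 0.5)   # enforces y*z >= x^2
    lhs = (1 + x) ** 2 / ((1 + y) * (1 + z))
    rhs = 1 - (y + z - 2 * x)
    assert lhs >= rhs - 1e-12
print("Fact 1 verified on 10000 random samples")
```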

Answer 3 (score 1)

$\DeclareMathOperator{\supp}{supp}\def\peq{\mathrel{\phantom{=}}{}}$If $X$ assumes only finitely many values, at least two of which are distinct, write$$ \supp(X) = \{x_1 > \cdots > x_N \geqslant 0\}, \quad p_k := P(X = x_k) > 0 \quad (1 \leqslant k \leqslant N). $$ Then there is the following estimate:

\begin{gather*} \frac{E(X^{n + 1}) E(X^{n - 1})}{(E(X^n))^2} - 1 \sim \frac{p_2}{p_1} \left( 1 - \frac{x_2}{x_1} \right)^2 \left( \frac{x_2}{x_1} \right)^{n - 1} \quad (n → ∞). \tag{1} \end{gather*}

Proof: Since $E(X^m) = \sum\limits_{k = 1}^N p_k x_k^m$ for any $m > 0$, then\begin{gather*} E(X^n) = \sum_{k = 1}^N p_k x_k^n = p_1 x_1^n \left( 1 + \sum_{k = 2}^N \frac{p_k}{p_1} \left( \frac{x_k}{x_1} \right)^n \right) \sim p_1 x_1^n \quad (n → ∞). \tag{2} \end{gather*} Note that\begin{align*} &\peq E(X^{n + 1}) E(X^{n - 1}) - (E(X^n))^2\\ &= \left( \sum_{k = 1}^N p_k x_k^{n + 1} \right) \left( \sum_{k = 1}^N p_k x_k^{n - 1} \right) - \left( \sum_{k = 1}^N p_k x_k^n \right)^2\\ &= \biggl( p_1^2 x_1^{2n} + p_1 p_2 (x_1^{n + 1} x_2^{n - 1} + x_1^{n - 1} x_2^{n + 1}) + \sum_{\substack{1 \leqslant k, j \leqslant N\\(k, j) \not\in \{(1, 1), (1, 2), (2, 1)\}}} p_k p_j x_k^{n + 1} x_j^{n - 1} \biggr)\\ &\peq - \biggl( p_1^2 x_1^{2n} + 2p_1 p_2 x_1^n x_2^n + \sum_{\substack{1 \leqslant k, j \leqslant N\\(k, j) \not\in \{(1, 1), (1, 2), (2, 1)\}}} p_k p_j x_k^n x_j^n \biggr)\\ &= p_1 p_2 (x_1 - x_2)^2 x_1^{n - 1} x_2^{n - 1} + \sum_{\substack{1 \leqslant k, j \leqslant N\\(k, j) \not\in \{(1, 1), (1, 2), (2, 1)\}}} p_k p_j (x_k^{n + 1} x_j^{n - 1} - x_k^n x_j^n). \end{align*} If $x_2 = 0$, then $N = 2$ and$$ \sum_{\substack{1 \leqslant k, j \leqslant N\\(k, j) \not\in \{(1, 1), (1, 2), (2, 1)\}}} p_k p_j (x_k^{n + 1} x_j^{n - 1} - x_k^n x_j^n) = p_2^2 (x_2^{n + 1} x_2^{n - 1} - x_2^{2n}) = 0. $$ Otherwise for $1 \leqslant k, j \leqslant N$ with $(k, j) \not\in \{(1, 1), (1, 2), (2, 1)\}$,\begin{gather*} \lim_{n → ∞} \frac{x_k^{n + 1} x_j^{n - 1}}{x_1^{n - 1} x_2^{n - 1}} = x_k^2 \lim_{n → ∞} \left( \frac{x_k x_j}{x_1 x_2} \right)^{n - 1} = 0,\\ \lim_{n → ∞} \frac{x_k^n x_j^n}{x_1^{n - 1} x_2^{n - 1}} = x_k x_j \lim_{n → ∞} \left( \frac{x_k x_j}{x_1 x_2} \right)^{n - 1} = 0. \end{gather*} Therefore$$ E(X^{n + 1}) E(X^{n - 1}) - (E(X^n))^2 \sim p_1 p_2 (x_1 - x_2)^2 x_1^{n - 1} x_2^{n - 1} \quad (n → ∞), $$ and combining with (2) yields (1).
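A numerical sketch of the equivalence $(1)$ (the three-point distribution below is an assumption chosen for illustration):

```python
# Check that the ratio of the two sides of (1) tends to 1.
x = [5.0, 3.0, 1.0]   # x_1 > x_2 > x_3 >= 0
p = [0.5, 0.3, 0.2]

def moment(m):
    return sum(pk * xk**m for pk, xk in zip(p, x))

def lhs(n):
    """E(X^{n+1}) E(X^{n-1}) / E(X^n)^2 - 1."""
    return moment(n + 1) * moment(n - 1) / moment(n) ** 2 - 1

def rhs(n):
    """(p_2/p_1) (1 - x_2/x_1)^2 (x_2/x_1)^{n-1}."""
    return (p[1] / p[0]) * (1 - x[1] / x[0]) ** 2 * (x[1] / x[0]) ** (n - 1)

for n in (5, 15, 40):
    print(n, lhs(n) / rhs(n))   # ratio should approach 1 as n grows
```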