Does one of these conditions on a sequence imply the other one?


Let ${(r_n)}_{n \geq 0}$ be a sequence of integers $\geq 2$, and set $q_n=\prod_{i=0}^{n-1} r_i$ (so that $q_0=1$). I want to know whether either of these two conditions implies the other (I think the answer is no, but I cannot find counter-examples). The first condition is $$(\Delta)\colon\quad \sum \frac{r_n \log r_n}{q_{n+1}} < \infty.$$ The second condition is $$(\Theta)\colon\quad \epsilon \sum_{i=0}^{k(\epsilon)}r_i\log r_i \to 0 \quad \text{as } \epsilon\to 0^+,$$ where $k(\epsilon) = \max\{k \mid q_{k+1}^{-1} \geq \epsilon\}$.

For example:

  • Any bounded sequence $(r_n)$ satisfies both conditions. This is obvious for $(\Delta)$, and not difficult for $(\Theta)$.

  • The case $r_n=n+2$ satisfies both conditions. This is not difficult for $(\Delta)$, and it can be shown for $(\Theta)$ by using the inequalities $k! \geq k^{\frac{k}{2}}$ and $\sum_{k=1}^n k \log k \leq {(n \log n)}^2$.
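As a quick numerical sanity check (not a proof), both quantities can be computed for $r_n = n+2$: the partial sums in $(\Delta)$ stabilize quickly, and the quantity in $(\Theta)$, evaluated along $\epsilon = 1/q_{m+1}$ (for which $k(\epsilon) = m$), tends to $0$.

```python
import math

# Numerical sanity check (not a proof) for r_n = n + 2.
N = 40
r = [n + 2 for n in range(N)]

# q_{n+1} = prod_{i=0}^{n} r_i, so q = [q_0, q_1, ..., q_N]
q = [1]
for x in r:
    q.append(q[-1] * x)

# (Delta): the partial sums of r_n log r_n / q_{n+1} stabilize quickly
delta = sum(r[n] * math.log(r[n]) / q[n + 1] for n in range(N))

# (Theta): for eps = 1/q_{m+1} we have k(eps) = m, so the quantity in
# (Theta) along this sequence of eps is (sum_{i=0}^m r_i log r_i) / q_{m+1}
s, theta = 0.0, []
for m in range(N):
    s += r[m] * math.log(r[m])
    theta.append(s / q[m + 1])
```

Here `delta` converges to roughly $1.56$, and the `theta` values decay rapidly, consistent with both conditions holding for this sequence.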

Incidentally, if neither of these conditions implies the other, I would be interested in a simplified formulation of the conjunction $(\Delta) \wedge (\Theta)$.

Accepted answer:

$\Delta \implies \Theta$

Let $\varepsilon > 0$. Since the series in $(\Delta)$ converges, by the Cauchy criterion we can choose $N_1$ such that $$\sum_{i=n+1}^m \frac{r_i \log r_i}{q_{i+1}} < \varepsilon$$ whenever $N_1 \leqslant n \leqslant m$. Since $q_{m+1} \to \infty$, there is $N_2$ such that $$\frac{ \displaystyle \sum_{i=0}^{N_1} r_i \log r_i }{ q_{m+1} } < \varepsilon$$ whenever $m \geqslant N_2$. Let $N = \max \{ N_1, N_2 \}$.

For $h < \frac{1}{q_{N+1}}$ let $m = \max \left\{ k : h \leqslant \frac{1}{q_{k+1}} \right\}$, so that $m \geqslant N$ and, using $q_{i+1} \leqslant q_{m+1}$ for $i \leqslant m$ in the second inequality, $$h \cdot \sum_{i=0}^m r_i \log r_i \leqslant \frac{\displaystyle \sum_{i=0}^m r_i \log r_i}{q_{m+1}} = \frac{\displaystyle \sum_{i=0}^{N_1} r_i \log r_i}{q_{m+1}} + \frac{\displaystyle \sum_{i=N_1+1}^{m} r_i \log r_i}{q_{m+1}} \leqslant \\[1ex] \leqslant \frac{\displaystyle \sum_{i=0}^{N_1} r_i \log r_i}{q_{m+1}} + \sum_{i=N_1+1}^m \frac{r_i \log r_i}{q_{i+1}} < 2 \varepsilon$$

and we're done.
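The two-term bound in the display can also be checked numerically. The sketch below uses an arbitrary random sample sequence and an arbitrary split point `N1` (both hypothetical choices, purely for illustration):

```python
import math
import random

random.seed(0)
r = [random.randint(2, 10) for _ in range(30)]  # arbitrary sample sequence

q = [1]
for x in r:
    q.append(q[-1] * x)  # q_{n+1} = q_n * r_n

w = [x * math.log(x) for x in r]  # w_i = r_i log r_i

N1 = 5  # arbitrary split point, standing in for N_1
gaps = []
for m in range(N1 + 1, len(r)):
    h = 1.0 / q[m + 1]                     # the largest h with k(h) = m
    lhs = h * sum(w[: m + 1])
    head = sum(w[: N1 + 1]) / q[m + 1]
    tail = sum(w[i] / q[i + 1] for i in range(N1 + 1, m + 1))
    gaps.append(head + tail - lhs)         # should always be >= 0
```

Every entry of `gaps` is nonnegative (up to floating-point error), matching the inequality $h \sum_{i=0}^m r_i \log r_i \leqslant \frac{\sum_{i=0}^{N_1} r_i \log r_i}{q_{m+1}} + \sum_{i=N_1+1}^m \frac{r_i \log r_i}{q_{i+1}}$ used in the proof.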

I'll try to think whether the other implication holds.


@Adayah's answer provides a proof of the following lemma:

Lemma: Let ${(u_n)}_{n \geq 0}$ and ${(v_n)}_{n \geq 0}$ be two sequences of positive numbers such that $v_n \searrow 0$. If $\sum u_n v_n < \infty$ then $\epsilon \sum_{i=0}^{n(\epsilon)} u_i \to 0$ when $\epsilon \to 0^+$, where $n(\epsilon)=\min\{n \mid v_{n+1} < \epsilon\}$.

Certainly @Adayah's proof of this lemma is the most straightforward one. As a remark too long for a comment, I just want to explain how to derive this lemma from a general result about the integrability (finite expectation) of random variables. It follows from the two facts below:

1) For any positive random variable $X$, $$E(X) = \int_0^1 F^{-1}(u)\,\mathrm{d}u,$$ where $F$ is the cumulative distribution function of $X$ and $F^{-1}$ is its left-continuous inverse (the ordinary inverse when it exists). Consequently, $$E(X) < \infty \implies \epsilon F^{-1}(1- \epsilon) \to_{\epsilon\to 0^+} 0.$$ Consequently again, for any positive, right-continuous, increasing function $H$, $$E(H(X)) < \infty \implies \epsilon H(F^{-1}(1- \epsilon)) \to_{\epsilon\to 0^+} 0$$ (to prove this last consequence, consider the right-continuous inverse of $H$ to check that $H\circ F^{-1}$ is the left-continuous inverse of the distribution function of $H(X)$).

2) When $X$ is a random variable with values in $\{0, 1, 2, \dots\}$, exchanging the order of summation gives $$\sum_{n \geq 0} u_n\Pr(X \geq n) = E\Big(\sum_{k=0}^X u_k\Big), \qquad \text{equivalently} \qquad \sum_{n \geq 0} u_n\Pr(X > n) = E\Big(\sum_{k=0}^{X-1} u_k\Big).$$ (Continuous version: $E(U(X))=\int_0^\infty u(x) \Pr(X>x)\,\mathrm{d}x$ with $U(x)=\int_0^x u(t)\,\mathrm{d}t$.)
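The summation exchange behind fact 2 can be verified numerically, in both index conventions ($\geq$ and $>$), on a small hand-picked distribution; the numbers below are arbitrary, for illustration only.

```python
# Check sum_n u_n * P(X >= n) == E[ sum_{k=0}^X u_k ] on a toy distribution.
u = [1.0, 0.5, 2.0, 0.25, 3.0]            # arbitrary positive weights
p = [0.1, 0.3, 0.2, 0.25, 0.15]           # P(X = n) for n = 0..4 (sums to 1)

lhs = sum(u[n] * sum(p[n:]) for n in range(5))          # sum u_n P(X >= n)
rhs = sum(p[x] * sum(u[: x + 1]) for x in range(5))     # E[ sum_{k=0}^X u_k ]

# Strict-inequality variant: sum u_n P(X > n) == E[ sum_{k=0}^{X-1} u_k ].
lhs2 = sum(u[n] * sum(p[n + 1:]) for n in range(5))
rhs2 = sum(p[x] * sum(u[:x]) for x in range(5))
```

Both pairs agree up to floating-point error, as expected from exchanging the order of summation.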

Then we can prove the lemma by first assuming, without loss of generality, that $v_0\leq 1$, and taking $X$ such that $\Pr(X>n)=v_n$. Fact 2 then gives $\sum u_n v_n = E(H(X))$ with $H(x) = \sum_{k=0}^{x-1} u_k$. It then suffices to check that $n(\epsilon)$ essentially coincides with $F^{-1}(1-\epsilon)$ and to apply fact 1.
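Finally, the conclusion of the lemma itself can be illustrated numerically. The choices $u_n = n$ and $v_n = (n+1)^{-4}$ below are arbitrary, picked only so that $\sum u_n v_n < \infty$:

```python
import math

# Illustration of the lemma (not a proof): with sum u_n v_n < infinity,
# eps * sum_{i=0}^{n(eps)} u_i should tend to 0 as eps -> 0+.
N = 2000
u = [float(n) for n in range(N)]           # u_n = n (arbitrary choice)
v = [1.0 / (n + 1) ** 4 for n in range(N)] # v_n decreasing to 0

def n_eps(eps):
    # n(eps) = min { n : v_{n+1} < eps }
    for n in range(N - 1):
        if v[n + 1] < eps:
            return n
    raise ValueError("increase N")

vals = []
for k in range(2, 12):
    eps = 10.0 ** (-k)
    vals.append(eps * sum(u[: n_eps(eps) + 1]))
```

Here $n(\epsilon) \approx \epsilon^{-1/4}$, so $\epsilon \sum_{i \leq n(\epsilon)} u_i \approx \epsilon^{1/2}/2$, and indeed the entries of `vals` decrease toward $0$.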