Sum of a list of numbers compared to the sum of the absolute values of the numbers, in the infinite limit


I have a list of $d$ real numbers $v=[b_1,b_2,\ldots,b_d]$, and I wish to consider a sequence which depends on this list. For $m=1$ I have the following two terms: $$X_1=\frac{1}{d}\left|\sum_ib_i\right|, \quad Y_1=\frac{1}{d}\sum_i|{b_i}|.$$ Clearly $Y_1\geq X_1$. For $m=2$, I consider $$X_2=\frac{1}{d^2\times 2}\left|\sum_{i,j}(b_i+b_j)\right|=X_1, \quad Y_2=\frac{1}{d^2\times2}\sum_{i,j}|b_i+b_j|.$$ And so on: for general $m$, \begin{gather*} X_m=\frac{1}{d^m\times m}\left|\sum_{a_1,\cdots,a_m}(b_{a_1}+b_{a_2}+\cdots+b_{a_m})\right|=X_1,\\ Y_m=\frac{1}{d^m\times m}\sum_{a_1,\cdots,a_m}|b_{a_1}+b_{a_2}+\cdots+b_{a_m}|. \end{gather*} (The equality $X_m=X_1$ holds because each $b_i$ appears in exactly $m\,d^{m-1}$ of the summands, so the inner sum equals $m\,d^{m-1}\sum_i b_i$.) I see numerically that as $m\rightarrow\infty$, $$Y_m\to X_m.$$ Does anyone know when this is true? Or can anyone point me to any resources that discuss similar ideas? In other words, is the following true: \begin{gather*} \lim_{m\to\infty} \frac{1}{md^m}\sum_{1\leqslant a_1,\cdots,a_m\leqslant d}|b_{a_1}+\cdots+b_{a_m}|\\ =\lim_{m\to\infty} \frac{1}{md^m}\left|\sum_{1\leqslant a_1,\cdots,a_m\leqslant d}(b_{a_1}+\cdots+b_{a_m})\right|=\frac{1}{d}\left|\sum_{i=1}^{d}b_i\right|\ ? \end{gather*}
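For concreteness, here is a minimal brute-force check of the observation (the list `b` below is just a hypothetical example): it computes $Y_m$ exactly by enumerating all $d^m$ tuples and compares it with $X_1$.

```python
import itertools

def Y(b, m):
    """Exact Y_m: sum of |b_{a_1} + ... + b_{a_m}| over all d^m tuples,
    with the 1/(m * d^m) normalisation from the question."""
    d = len(b)
    total = sum(abs(sum(t)) for t in itertools.product(b, repeat=m))
    return total / (m * d**m)

b = [2.0, -1.0, 3.0, -0.5]      # hypothetical example list
X1 = abs(sum(b)) / len(b)       # X_m = X_1 for every m
for m in (1, 2, 4, 8):
    print(m, Y(b, m), X1)
```

The printed values of $Y_m$ shrink toward $X_1$ as $m$ grows; the enumeration cost $d^m$ limits this check to small $m$.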

Beyond the numerics, I have some intuition for why this may be true. As $m\to\infty$, each summand contains a large number of $b_{a_i}$ terms. So summands with only a few negative $b_{a_i}$ terms will be asymptotically the same as summands with no negative $b_{a_i}$ terms (for which the two sums coincide). Likewise, when there are roughly equal numbers of positive and negative $b_{a_i}$ terms, the two sums will also be asymptotically the same. Obviously this is not at all rigorous, and this is where I need help.

Best answer:

$\def\N{\mathbb{N}}\def\abs#1{\left|#1\right|}\def\paren#1{\left(#1\right)}$Suppose $Z_1, Z_2, \ldots$ are i.i.d. random variables on the same probability space with $P(Z_m = b_k) = \dfrac{1}{d}$ for every $m \in \N_+$ and $1 \leqslant k \leqslant d$. Then for any $m$,\begin{gather*} X_m = X_1 = \abs{ \sum_{k = 1}^d b_k \cdot \frac{1}{d} } = |E(Z_1)|,\\ Y_m = \sum_{1 \leqslant k_1, \cdots, k_m \leqslant d} \abs{ \frac{1}{m} \sum_{j = 1}^m b_{k_j} } \cdot \frac{1}{d^m} = E\paren{ \abs{ \frac{1}{m} \sum_{j = 1}^m Z_j } }. \end{gather*} Note that $\abs{ \dfrac{1}{m} \sum\limits_{j = 1}^m Z_j } \leqslant \max\limits_{1 \leqslant k \leqslant d} |b_k|$ for any $m$, and the (strong) law of large numbers shows that$$ \frac{1}{m} \sum_{j = 1}^m Z_j \xrightarrow{\mathrm{a.s.}} E(Z_1) \quad (m \to \infty), $$ thus the dominated convergence theorem implies that$$ Y_m = E\paren{ \abs{ \frac{1}{m} \sum_{j = 1}^m Z_j } } \to E(|E(Z_1)|) = |E(Z_1)| = X_1 \quad (m \to \infty). $$ So the claimed limit holds for every list $b_1, \ldots, b_d$.
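The identity $Y_m = E\bigl|\frac{1}{m}\sum_{j=1}^m Z_j\bigr|$ above also gives a cheap way to check the convergence numerically for $m$ far beyond brute-force range, via Monte Carlo sampling of the $Z_j$ (the list `b`, the trial count, and the seed below are hypothetical choices):

```python
import random

def mc_Ym(b, m, trials=20000, seed=0):
    """Monte Carlo estimate of Y_m = E|(Z_1 + ... + Z_m) / m|,
    where each Z_j is drawn uniformly from the entries of b."""
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(trials):
        acc += abs(sum(rng.choice(b) for _ in range(m)) / m)
    return acc / trials

b = [2.0, -1.0, 3.0, -0.5]       # hypothetical example list
target = abs(sum(b)) / len(b)    # |E(Z_1)| = X_1
for m in (1, 10, 100, 1000):
    print(m, mc_Ym(b, m), target)
```

The estimates approach `target` as $m$ grows, consistent with the law-of-large-numbers argument; the cost is linear in $m$ rather than $d^m$.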