I have been studying linear processes in time series of the following form:
$X_t= \sum_{j=0}^{\infty}c_j\epsilon_{t-j}$
In this case, the following implication (about the covariance function) holds:
$ \sum_{j=1}^{\infty} j|c_j| < \infty \Rightarrow \sum_{j=0}^{\infty} j|\gamma_j| < \infty $
However, I have some trouble understanding the proof, which runs as follows:
$ \sum_{j=0}^{\infty} j^p|\gamma_j| \leq \sigma^2 \sum_{i=0}^{\infty} |c_i|\sum_{j=0}^{\infty} j^p|c_{i+j}| \leq \sigma^2 \sum_{i=0}^{\infty} |c_i|\sum_{j=0}^{\infty} (i+j)^p |c_{i+j}| \leq \sigma^2 \sum_{i=0}^{\infty} |c_i|\sum_{j=0}^{\infty} j^p |c_{j}| < \infty$
I do not understand the following part (the last inequality):
$ \sigma^2 \sum_{i=0}^{\infty} |c_i|\sum_{j=0}^{\infty} (i+j)^p |c_{i+j}| \leq \sigma^2 \sum_{i=0}^{\infty} |c_i|\sum_{j=0}^{\infty} j^p |c_{j}| < \infty$
Can anyone please help me out and explain why it holds?
For a fixed $i$, the substitution $k=i+j$ gives $$\sum_{j=0}^{\infty} (i+j)^p |c_{i+j}|=\sum_{k=i}^\infty k^p\left\lvert c_k\right\rvert\leqslant \sum_{k=0}^\infty k^p\left\lvert c_k\right\rvert.$$
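If it helps to see the tail-sum bound numerically, here is a quick sanity check (a sketch, assuming the illustrative summable sequence $c_k = 2^{-k}$, $p=1$, and an arbitrary truncation level $N$ — none of these come from the problem itself):

```python
# Check that sum_j (i+j)^p |c_{i+j}| <= sum_k k^p |c_k| for several fixed i,
# using the example coefficients c_k = 0.5**k (assumed; any summable sequence works).
p = 1
N = 200  # truncation level, chosen so the geometric tail is negligible
c = [0.5 ** k for k in range(N)]

full = sum(k ** p * abs(c[k]) for k in range(N))  # sum_{k=0}^{N-1} k^p |c_k|
for i in range(5):
    # shifted sum: sum_{j=0}^{N-1-i} (i+j)^p |c_{i+j}|, i.e. the tail sum_{k=i} k^p |c_k|
    shifted = sum((i + j) ** p * abs(c[i + j]) for j in range(N - i))
    assert shifted <= full + 1e-12, (i, shifted, full)

print("tail-sum bound holds for i = 0..4")
```

The check is exactly the substitution above: dropping the first $i$ nonnegative terms of $\sum_k k^p|c_k|$ can only shrink the sum.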