I currently have the following setting (here $\Vert Z \Vert_q = (\mathbb{E}(\vert Z \vert^q))^{1/q}$ denotes the $L^q$ norm):
Let $q > 1$ and consider a sequence of random variables $(Y_{n,m})_{n,m \in \mathbb{Z}}$, which satisfies both $$\tag{a}Y_{0,m}\overset{\mathcal{D}}{=}Y_{l,m+l}\text{ for all } l,m \in \mathbb{N}$$ and $$\tag{b}\sum_{j=0}^\infty \Vert Y_{0,j} \Vert_q <\infty.$$
In a first step, I'd like to infer that the sum $Z_0 = \sum_{j=0}^\infty Y_{0,j}$ converges almost surely. I suspect this does not hold in general without further assumptions. My thought was that (b) may imply $L^q$ convergence of the sum, and that we could then (hopefully) infer almost sure convergence from that.
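To make the $L^q$-convergence step precise, here is a sketch using only (b) and the completeness of $L^q$ (this is my reasoning, not something established above): the triangle inequality makes the partial sums a Cauchy sequence in $L^q$.

```latex
% Partial sums S_n = \sum_{j=0}^{n} Y_{0,j} are Cauchy in L^q:
% for m < n, the triangle inequality in L^q gives
\left\Vert S_n - S_m \right\Vert_q
  = \Big\Vert \sum_{j=m+1}^{n} Y_{0,j} \Big\Vert_q
  \leq \sum_{j=m+1}^{n} \Vert Y_{0,j} \Vert_q
  \xrightarrow[\; m,n \to \infty \;]{} 0
% by (b), since the tail of a convergent series vanishes.
% Completeness of L^q then yields a limit Z_0 in L^q,
% and some subsequence of (S_n) converges to Z_0 almost surely.
```

Note that $L^q$ convergence alone only gives an almost surely convergent *subsequence*; almost sure convergence of the full series needs a separate argument (e.g. the expectation argument in the answer below is not assumed here).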
To provide some context: In my case I'm dealing with projections $P_l(\cdot)$ and a sequence of random variables $X_i$. For $Y_{l,j} = P_l(X_j)$, I know that these projections satisfy (a). And now I'm wondering if under assumption (b) the series $Z_0 = \sum_{j=0}^\infty P_0(X_j) = \sum_{j=0}^\infty Y_{0,j}$ describes a well defined random variable (for what I would need the almost sure convergence)
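As an aside, here is my guess at a standard setting where (a) holds (the causal representation, the filtration, and the projection formula below are assumptions for illustration, not stated in the question): if $(X_i)$ is stationary with $X_j = g(\varepsilon_j, \varepsilon_{j-1}, \dots)$ for i.i.d. innovations $(\varepsilon_i)$, and $P_l$ is the martingale-difference projection, then (a) follows from shift invariance.

```latex
% Assumed setting: \mathcal{F}_l = \sigma(\varepsilon_i : i \leq l) and
P_l(X_j) = \mathbb{E}[X_j \mid \mathcal{F}_l] - \mathbb{E}[X_j \mid \mathcal{F}_{l-1}].
% Shifting every innovation index by l leaves the joint law of the
% innovations invariant, hence
P_0(X_m) \overset{\mathcal{D}}{=} P_l(X_{m+l}),
% which is condition (a) with Y_{l,j} = P_l(X_j).
```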
Is this already enough to infer that for any $k \in \mathbb{N}_0$ the series $Z_k = \sum_{j=k}^{\infty} Y_{k,j}$ converges almost surely? And would the distributional equality then also hold for the infinite series, so that $Z_0 \overset{\mathcal{D}}{=} Z_k$ (or, more interesting for me, $\Vert Z_0\Vert_q = \Vert Z_k\Vert_q$)? Or do we need additional assumptions?
Yes, $Z_0$ is well defined and finite almost surely, and the same holds for $Z_k = \sum_{j=0}^{\infty} Y_{k,j}$ for each $k\in \mathbb{N} = \{1, 2, 3, \ldots\}$.
Claim: Fix $q\geq 1$. Suppose that $Y_{0,m}$ has the same distribution as $Y_{k,k+m}$ for each $m,k \in \mathbb{N}$, and that $\sum_{m=1}^{\infty} ||Y_{0,m}||_q<\infty$. Then for each $k \in \mathbb{N}$ the sum $\sum_{j=0}^{\infty} |Y_{k,j}|$ is finite almost surely, and so $\sum_{j=0}^{\infty} Y_{k,j}$ is well defined and finite almost surely.
Proof: Fix $k \in \mathbb{N}$. We have $$\sum_{j=0}^{\infty} |Y_{k,j}|=\sum_{j=0}^k |Y_{k,j}| + \sum_{m=1}^{\infty} |Y_{k,k+m}|. $$ The first term on the right-hand side is a sum of $k+1$ random variables and so is finite almost surely. It therefore suffices to show that $$ \sum_{m=1}^{\infty} |Y_{k,k+m}| <\infty \quad \mbox{almost surely.} $$ We can take the expectation of $\sum_{m=1}^{\infty} |Y_{k,k+m}|$ using Tonelli's theorem for nonnegative random variables to get \begin{align} E\left[ \sum_{m=1}^{\infty} |Y_{k,k+m}|\right] &=\sum_{m=1}^{\infty} E[|Y_{k,k+m}|] \\ &\overset{(a)}{=}\sum_{m=1}^{\infty} E[|Y_{0,m}|]\\ &\overset{(b)}{\leq} \sum_{m=1}^{\infty} ||Y_{0,m}||_{q} \\ &< \infty \end{align} where (a) holds because $Y_{0,m}$ has the same distribution as $Y_{k,k+m}$ for each $m,k \in \mathbb{N}$, and (b) holds by Jensen's inequality $E[|Y|^q]\geq E[|Y|]^q$ (since $q\geq 1$). Since the expectation is finite, we conclude $$ P\left[\sum_{m=1}^{\infty} |Y_{k,k+m}| = \infty\right] = 0. \quad \Box$$
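A consequence worth noting, as a sketch under the same hypotheses plus the extra assumption that the finitely many terms $Y_{k,j}$, $0 \leq j \leq k$, lie in $L^q$ (the proof above does not need this): $Z_k$ is then itself in $L^q$, because equality of distributions gives equality of the individual $L^q$ norms.

```latex
% Triangle inequality in L^q, then (a) applied term by term:
||Z_k||_q
  \leq \sum_{j=0}^{k} ||Y_{k,j}||_q + \sum_{m=1}^{\infty} ||Y_{k,k+m}||_q
  = \sum_{j=0}^{k} ||Y_{k,j}||_q + \sum_{m=1}^{\infty} ||Y_{0,m}||_q
  < \infty.
% Caution: (a) equates the distributions of individual terms only.
% It does NOT by itself give Z_0 equal in distribution to Z_k, nor
% ||Z_0||_q = ||Z_k||_q: that would require equality of the JOINT
% distributions of the sequences (Y_{0,m})_m and (Y_{k,k+m})_m.
```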