Let $p_j\ge0,\ j=1,2,3,\dots,$ and suppose $\sum_j p_j=1.$ Is there a simple proof that $$\sum_{j=1}^\infty jp_j\tag{1}$$ converges? My question arises from the answer to this question. Consider a Markov chain with state space $\{1,2,3,\dots\}.$ If the chain is in state $1,$ it transitions to state $j$ with probability $p_j.$ If it is in state $j>1,$ it always transitions to state $j-1.$ The chain is irreducible and aperiodic, so it has a unique stationary distribution. The sum $(1)$ arises in computing the stationary probabilities, so it must converge.
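To make the setup concrete, here is a quick numerical sketch (my own illustration, not from the linked question) for the light-tailed choice $p_j=2^{-j}$, where $(1)$ sums to $2$ and the stationary probabilities come out as $\pi_k=2^{-k}$. The only ways into state $k$ are from $k+1$ (with probability $1$) and from $1$ (with probability $p_k$), so stationarity means $\pi_k=\pi_{k+1}+\pi_1 p_k$:

```python
# Numerical check of the balance equations for the chain described above,
# with the illustrative light-tailed choice p_j = 2^{-j}.
# Transitions into state k come from k+1 (always) and from 1 (with prob p_k),
# so stationarity requires  pi_k = pi_{k+1} + pi_1 * p_k.

M = 50  # truncation level; the tail mass beyond M is ~2^{-50}, negligible

p = [0.0] + [2.0 ** -j for j in range(1, M + 1)]   # p[j] = 2^{-j}
mean = sum(j * p[j] for j in range(1, M + 1))      # the sum (1); equals 2 here

# Candidate stationary distribution: pi_k = P(N >= k) / E[N] = 2^{-k}
tail = [0.0] + [2.0 ** -(k - 1) for k in range(1, M + 1)]  # P(N >= k)
pi = [t / mean for t in tail]

# Verify the balance equations for k < M
for k in range(1, M):
    assert abs(pi[k] - (pi[k + 1] + pi[1] * p[k])) < 1e-12

print(mean)         # ~2.0, the value of (1) for this choice of p
print(sum(pi[1:]))  # ~1.0, so pi is a genuine probability distribution
```

The normalizing constant is exactly the sum $(1)$: $\pi_k = \Pr(N\ge k)/\sum_j jp_j$, which is why that sum has to be finite for the stationary distribution to exist.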
I've been trying unsuccessfully to find a more direct proof. None of the standard tests (root test, ratio test, Gauss's test) apply, and I haven't any other ideas. (It's equivalent to the statement that if $N$ is a random variable taking positive integer values, then $E(N)$ exists, but I don't see how that helps. In fact, my intuition is that this statement is false.)
EDIT
It has been amply shown that the statement is false. I would now like to know where the error lies in the argument from the linked question.
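For the record, one standard counterexample can be checked numerically (my own sketch): take $p_j=\dfrac{6}{\pi^2 j^2}$, so that $\sum_j p_j=1$ but the partial sums of $jp_j$ are $\frac{6}{\pi^2}H_N$, which grow like $\frac{6}{\pi^2}\ln N$:

```python
# Counterexample sketch: p_j = 6 / (pi^2 j^2).
# The p_j sum to 1, but the partial sums of j*p_j grow like (6/pi^2) * log N.
import math

c = 6.0 / math.pi ** 2

prob, mean = 0.0, 0.0
checkpoints = {10**3: None, 10**6: None}
for j in range(1, 10**6 + 1):
    pj = c / j**2
    prob += pj           # partial sum of the p_j, tending to 1
    mean += j * pj       # partial sum of (1), i.e. c * H_j (harmonic numbers)
    if j in checkpoints:
        checkpoints[j] = (prob, mean)

print(checkpoints[10**3])   # probability mass ~0.9994, partial mean ~4.5
print(checkpoints[10**6])   # probability mass ~0.999999, partial mean ~8.7
```

The probability mass converges quickly while the partial means keep climbing without bound, so $E(N)$ does not exist here.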
Not all irreducible, aperiodic Markov chains have a stationary distribution; that is guaranteed only for finite state spaces. For an infinite state space, you also need the chain to be positive recurrent, meaning the expected return time to each state is finite. Here, starting from $1$, the chain jumps to $j$ and then takes $j-1$ steps back down, so it returns to $1$ in exactly $j$ steps with probability $p_j$, and the expected return time is $\sum_j jp_j$. Your argument is therefore circular: for the chain to have a stationary distribution you need $\sum_j jp_j<\infty$, and to prove that, you use the existence of a stationary distribution.
When the tail of $(p_1,p_2,\dots)$ is too fat, the chain never settles down; instead its distribution becomes more and more diffuse as time goes on.
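This "never settles" behaviour can be seen concretely (my own sketch): the probability $u_t$ of being back in state $1$ at time $t$, starting from $1$, satisfies the renewal recursion $u_t=\sum_{j=1}^t p_j u_{t-j}$, since the first jump of length $j$ eats exactly $j$ steps. For a light tail, $u_t\to 1/\sum_j jp_j$; for the fat-tailed $p_j\propto 1/j^2$ (infinite mean) it drifts to $0$:

```python
# Renewal recursion u_t = sum_{j=1}^t p_j * u_{t-j}, u_0 = 1, where u_t is
# the probability the chain is back in state 1 at time t (started at 1).
# Light tail: u_t -> 1 / sum(j p_j).  Fat tail (infinite mean): u_t -> 0.
import math

def return_probs(p_of, T):
    """u_0..u_T for jump distribution p_of(j); only j <= T can matter."""
    p = [0.0] + [p_of(j) for j in range(1, T + 1)]
    u = [1.0] + [0.0] * T
    for t in range(1, T + 1):
        u[t] = sum(p[j] * u[t - j] for j in range(1, t + 1))
    return u

u_geo = return_probs(lambda j: 2.0 ** -j, 1000)               # mean 2
u_fat = return_probs(lambda j: 6.0 / (math.pi * j) ** 2, 1000)  # infinite mean

print(u_geo[1000])  # ~0.5 = 1/2: the chain settles into its stationary law
print(u_fat[1000])  # well below 0.5 and still sinking: no stationary limit
```

With the fat tail, the occupation probability of state $1$ keeps decaying (roughly like $1/\ln t$), and the missing mass spreads out over ever-higher states.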