I'm working through a problem regarding expected values in Markov chains, and at some point it says:
Recall from probability that if $X$ is a positive integer valued random variable, then $\mathbb{E}[X] = 1 + \sum^\infty_{k=1}\mathbb{P}(X > k)$.
I know that by definition $\mathbb{E}[X] = \sum_a a\mathbb{P}(X = a)$, but I can't see how the above equality follows from this, nor am I sure if this is how to approach the problem.

The trick is to write each term of the expectation as a sum of $a$ copies:
$$aP(X = a) = \underbrace{P(X=a) + \dots + P(X = a)}_{a \text{ times}}$$
Now, since $X$ takes only integer values $\ge 1$, you have
$$1 = P(X > 0) = P(X = 1) + \color{red}{ P(X = 2)} + \color{green}{P(X=3)} + \dots$$ $$P(X > 1) = \color{red}{P(X = 2)} + \color{green}{P(X = 3)} + \dots $$ $$P( X > 2) = \color{green}{P(X = 3)} + P(X = 4) + \dots $$
and so on.
Summing the left-hand sides you get $1 + \sum_{k=1}^\infty P(X > k)$. Summing the right-hand sides, each $P(X = a)$ appears exactly $a$ times (once in each of the first $a$ equations, i.e. in $P(X > k)$ for $k = 0, 1, \dots, a-1$), so you find $\sum_{a = 1}^\infty aP(X = a)$. Rearranging the terms this way is justified because everything is nonnegative.
Hence $$E[X] = \sum_{a = 1}^\infty aP(X = a) = 1 + \sum_{k=1}^\infty P(X > k)$$
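As a sanity check, the identity is easy to verify numerically. Here is a small sketch using a geometric distribution on $\{1, 2, 3, \dots\}$ with success probability $p$ (my choice of distribution, not from the problem), for which $P(X = a) = (1-p)^{a-1}p$, $P(X > k) = (1-p)^k$, and $E[X] = 1/p$:

```python
from math import isclose

# Check E[X] = 1 + sum_{k>=1} P(X > k) for a geometric distribution
# on {1, 2, 3, ...} with success probability p.
p = 0.3
N = 10_000  # truncation point; the geometric tail is negligible long before this

# E[X] from the definition: sum over a of a * P(X = a), with P(X = a) = (1-p)^(a-1) * p
expectation = sum(a * (1 - p) ** (a - 1) * p for a in range(1, N))

# The tail-sum formula: 1 + sum over k >= 1 of P(X > k), with P(X > k) = (1-p)^k
tail_sum = 1 + sum((1 - p) ** k for k in range(1, N))

print(expectation, tail_sum)  # both should be approximately 1/p
assert isclose(expectation, tail_sum)
assert isclose(expectation, 1 / p)
```

Both truncated sums agree with each other and with the closed form $1/p$, which is exactly the identity above.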