Let $(\Omega,\mathcal{A},P)$ be a probability space and $X$ a real-valued random variable on $\Omega$. I want to show that $$E(|X|)<\infty\Leftrightarrow \sum_{n\in\mathbb{N}}P(|X|>n)<\infty.$$
Let's try the first direction. By definition, $$E(|X|)=\int |X|\;dP=\sup_{g\in\mathbb{E}_+\;:\;g\le|X|}\sum_{i=1}^n\alpha_iP(A_i)<\infty,$$ where $$\mathbb{E}_+:=\left\{g=\sum_{i=1}^n\alpha_i1_{A_i} : \alpha_1,\ldots,\alpha_n>0,\ A_1,\ldots,A_n\in\mathcal{A}\text{ pairwise disjoint}\right\}.$$
While the statement sounds intuitively correct to me, I'm unable to figure out how to proceed from here.
Why are you trying to prove this from first principles? Are you not allowed to assume the definition of the integral? The proof is straightforward from the fact that for a non-negative random variable $Y$, $$E[Y]=\int_0^\infty (1-F(y))\,dy=\int_0^\infty P(Y>y)\,dy,$$ where $F$ is the cdf of $Y$. You can prove this fact by writing $P(Y>y)=\int_y^\infty d\mu-P(Y=y)$, where $\mu$ is the push-forward of $P$ onto $\mathbb{R}$, and then (carefully!) switching the order of integration. The result you want then follows by bounding the integral by a sum, since $P(Y>y)$ is monotonically decreasing in $y$.
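To spell out that last bounding step, here is a sketch of the sandwich argument (with $Y=|X|$), comparing the integral to its Riemann-type sums over unit intervals:

```latex
% Since x \mapsto P(Y > x) is nonincreasing, on each interval [n, n+1]
% we have P(Y > n+1) \le P(Y > x) \le P(Y > n). Integrating over [n, n+1]
% and summing over n \ge 0 gives
\[
  \sum_{n=1}^{\infty} P(Y > n)
  \;\le\; \int_0^{\infty} P(Y > x)\,dx
  \;\le\; \sum_{n=0}^{\infty} P(Y > n)
  \;\le\; 1 + \sum_{n=1}^{\infty} P(Y > n).
\]
% Combined with E[Y] = \int_0^\infty P(Y > x)\,dx and Y = |X|, this yields
\[
  \sum_{n=1}^{\infty} P(|X| > n)
  \;\le\; E[|X|]
  \;\le\; 1 + \sum_{n=1}^{\infty} P(|X| > n),
\]
% so E[|X|] < \infty if and only if \sum_n P(|X| > n) < \infty.
```

Note that the sandwich gives slightly more than the equivalence: it quantifies how close the sum is to the expectation, with a gap of at most $1$.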