Let's say I have a set $\Omega$ with a probability $P$ on its subsets (the fact that it could be undefined on some subsets is not important here; let's not think about that). I have a random vector $(X,Y): \Omega \to \mathbb{R}^2$ and a function $\phi: \mathbb{R}^2 \to \mathbb{R}$. Then I define the probability induced by the random vector as $P_{X,Y}(A)=P((X,Y)^{-1}(A))$, with $A \subseteq \mathbb{R}^2$. A well-known result (the change-of-variables formula) tells me that $$\int_{\Omega} \phi((X,Y)(\omega))\,dP=\int_{\mathbb{R}^2} \phi(x,y)\, dP_{X,Y}.$$
Consequently, taking $\phi(x,y) =x+y$, I have that $$E[X+Y]= \int_{\Omega} (X+Y)(\omega)\, dP=\sum_{i,j} (x_i+y_j)\, p_{X,Y}(x_i,y_j)$$ if $(X,Y)$ takes countably many values $(x_i,y_j)$, where $p_{X,Y}(x_i,y_j)$ is the probability that $(X,Y)(\omega)=(x_i,y_j)$.
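To make the change-of-variables formula concrete, here is a minimal numeric sketch on a toy finite sample space (the space, the vector $(X,Y)$, and the uniform $P$ are all hypothetical choices, not from the question): integrating $\phi((X,Y)(\omega))$ over $\Omega$ against $P$ gives the same number as pushing $P$ forward to $P_{X,Y}$ and summing $\phi$ against it.

```python
from fractions import Fraction

# Toy finite sample space with uniform probability (illustrative only).
Omega = [0, 1, 2, 3]
P = {w: Fraction(1, 4) for w in Omega}

# A hypothetical random vector (X, Y) on Omega.
XY = {0: (1, 1), 1: (1, 2), 2: (2, 1), 3: (2, 2)}

def phi(x, y):
    return x + y

# Left-hand side: integral of phi((X,Y)(omega)) over Omega against P.
lhs = sum(phi(*XY[w]) * P[w] for w in Omega)

# Right-hand side: push P forward to P_{X,Y} on R^2, then sum phi against it.
P_XY = {}
for w in Omega:
    P_XY[XY[w]] = P_XY.get(XY[w], Fraction(0)) + P[w]
rhs = sum(phi(x, y) * p for (x, y), p in P_XY.items())

assert lhs == rhs  # the two integrals agree
print(lhs)  # → 3
```

With exact `Fraction` arithmetic the equality is exact, not approximate; both sides evaluate the same finite sum, just grouped differently.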
What I don't get is why $E[X+Y]=E[X]+E[Y]$ when $X$ and $Y$ are integrable. For instance, if $X$ and $Y$ both take all positive integer values, I don't see why $$E[X+Y]=\sum_{n=1}^{\infty} n\, P(X+Y=n)=\sum_{i=1}^{\infty} i\, P(X=i) + \sum_{j=1}^{\infty} j\, P(Y=j)=E[X]+E[Y].$$
The integral interpretation doesn't help me figure this out either. I also wonder whether there is a criterion telling us when integrating $\phi$ is additive with respect to the random variables; for instance, the expected value is obtained by taking $\phi(x,y)=x+y$, and there we have additivity.
Group the joint pmf by the value of the sum and rearrange; since all terms are nonnegative (and $X$, $Y$ are integrable), the double series converges absolutely and the rearrangement is justified:
$$\sum_{n}n\,\mathsf{P}\left(X+Y=n\right)=\sum_{n}\sum_{i+j=n}\left(i+j\right)\mathsf{P}\left(X=i\wedge Y=j\right)$$
$$=\sum_{i}i\sum_{j}\mathsf{P}\left(X=i\wedge Y=j\right)+\sum_{j}j\sum_{i}\mathsf{P}\left(X=i\wedge Y=j\right)$$
$$=\sum_{i}i\,\mathsf{P}\left(X=i\right)+\sum_{j}j\,\mathsf{P}\left(Y=j\right)$$
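The rearrangement above can be checked numerically. Here is a sketch with a hypothetical joint pmf on positive integers (truncated to a small support, and with no independence assumed): grouping $p(i,j)$ by $n=i+j$ gives the same total as summing the two marginals.

```python
from fractions import Fraction

# A hypothetical joint pmf p(i, j) = P(X = i, Y = j) on positive integers,
# truncated to finite support; X and Y are NOT assumed independent.
p = {
    (1, 1): Fraction(1, 4),
    (1, 2): Fraction(1, 8),
    (2, 1): Fraction(1, 8),
    (3, 2): Fraction(1, 2),
}
assert sum(p.values()) == 1  # a valid pmf

# sum_n n * P(X + Y = n): group the joint pmf by the value of the sum.
P_sum = {}
for (i, j), q in p.items():
    P_sum[i + j] = P_sum.get(i + j, Fraction(0)) + q
E_sum = sum(n * q for n, q in P_sum.items())

# Marginal expectations E[X] and E[Y] from the same joint pmf.
E_X = sum(i * q for (i, j), q in p.items())
E_Y = sum(j * q for (i, j), q in p.items())

assert E_sum == E_X + E_Y  # linearity of expectation
print(E_sum)  # → 15/4
```

For genuinely infinite supports the same bookkeeping works term by term; integrability of $X$ and $Y$ is what licenses reordering the infinite double sum.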