Let $(X,Y)$ be a random vector.
How does one show, using the axioms of probability, that $P(x,y) > 0$ implies $P(y) > 0$, and that $\sum_y P(x,y) = P(x)$, where $P(x,y)$ abbreviates $P(X = x, Y = y)$?
(In the continuous case the $\sum$ is replaced by an integral over $y$.)
Intuitively both are true: if $P(y) = 0$ then the event $(X=x, Y=y)$ cannot occur, and summing over all $y$ for a fixed $X = x$ should recover the probability of $X=x$, since that probability is split among the $y$'s. But how does one derive these intuitive ideas from the axioms?
For the first claim, note that $\{X=x, Y=y\} \subseteq \{Y=y\}$, so by monotonicity (which follows from additivity and non-negativity) $P(X=x, Y=y) \le P(Y=y)$; hence $P(x,y) > 0$ forces $P(y) > 0$.

For the second, by the Law of total probability you have that $$P(X=x)=\sum_{y}P(X=x\mid Y=y)P(Y=y),$$ where the sum runs over those $y$ with $P(Y=y)>0$. Now using the definition of conditional probability, i.e. that $$P(X=x\mid Y=y)=\frac{P(X=x,Y=y)}{P(Y=y)},$$ you can conclude that $$P(X=x)=\sum_{y}P(X=x,Y=y)$$ (the omitted terms with $P(Y=y)=0$ contribute nothing, since by the first claim $P(X=x,Y=y)=0$ for them), and, abusing notation, $$P(x)=\sum_{y}P(x,y).$$
The intuition behind this is the same as in the Law of total probability.
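Not part of the proof, but as a quick numerical sanity check, both facts can be verified on an arbitrary discrete joint distribution table (the $3 \times 4$ shape and the random seed below are arbitrary choices for illustration):

```python
import numpy as np

# Hypothetical 3x4 joint distribution table P(X=x, Y=y):
# rows index x, columns index y; entries are nonnegative and sum to 1.
rng = np.random.default_rng(0)
joint = rng.random((3, 4))
joint /= joint.sum()

# Marginals obtained by summing the joint over the other variable.
marginal_x = joint.sum(axis=1)  # P(X=x) = sum_y P(x, y)
marginal_y = joint.sum(axis=0)  # P(Y=y) = sum_x P(x, y)

# The marginal of X is itself a probability distribution.
assert np.isclose(marginal_x.sum(), 1.0)

# Monotonicity: each entry P(x, y) is bounded by its column total P(y),
# so P(x, y) > 0 indeed forces P(y) > 0.
assert np.all(joint <= marginal_y)
```

The second assertion is exactly the containment $\{X=x, Y=y\} \subseteq \{Y=y\}$ expressed entrywise on the table.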