Suppose $H$ is a random variable with pdf $f_H(h)$.
Let $X$ and $Y$ be random variables with joint pdf
$$f_{X,Y}(x,y) = f_H(x) f_H(y)$$
Prove $$P(X \ge Y) = 1/2$$
Is it possible to answer the above basic probability question with a symmetry argument?
In the following advanced probability questions, a symmetry argument is not only possible but necessary, because we don't know whether the random variables have pdfs:
Prove symmetry of probabilities given random variables are iid and have continuous cdf
Prove independence of events given random variables are iid and have continuous cdf
Confusion with real numbers and random variables; Integration and Independence in Williams
Without a symmetry argument:
$$P(X \ge Y) = \int_{\mathbb R} \int_{-\infty}^{x} f_H(x)f_H(y) \,dy \,dx$$
Let $$u = \int_{-\infty}^{x} f_H(y) \,dy, \qquad du = f_H(x)\,dx$$
to get
$$= \int_0^1 u \,du = 1/2$$
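As a sanity check of this computation (not a proof), here is a quick Monte Carlo estimate; taking $H$ to be standard normal is just an illustrative choice, since any pdf should give the same answer:

```python
import random

random.seed(0)

# Draw iid pairs (X, Y) with X, Y ~ H, here H = standard normal
# (an arbitrary illustrative choice), and count how often X >= Y.
n = 200_000
hits = sum(1 for _ in range(n)
           if random.gauss(0, 1) >= random.gauss(0, 1))
estimate = hits / n
print(estimate)  # should be close to 0.5
```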
(Side questions: Do we even need $H$? The fact that the joint pdf of $X$ and $Y$ factors into a function of $x$ times a function of $y$ means that the joint pdf factors into the pdfs of $X$ and $Y$, assuming those exist... do they?)
I'm guessing that's allowed because, even if the random variables somehow don't have pdfs (which they always do in basic probability anyway), $X$ and $Y$ together have a joint pdf, which is all we need.
But is there a way to do this in a symmetric way?
It seems that $X$ and $Y$ are independent because the joint pdf splits up into independent (not in the probability sense) functions of $x$ and $y$:
$$f_X(x) = kf_H(x), f_Y(y) = \frac{1}{k}f_H(y)$$
Am I allowed to say the following?
$$f_X(x) = kf_H(x), f_Y(y) = \frac{1}{k}f_H(y)$$
$$\color{red}{\stackrel{?}{\to}} kf_H(x) = \frac{1}{k}f_H(x)$$
$$\to k = 1$$
$$\to f_X(x) = f_Y(x)$$
I guess that would make sense if $X$ and $Y$ had the same range or something?
Assuming that is allowed, it appears $X$ and $Y$ have the same distribution.
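(One way to get $k = 1$ without the questionable pointwise step above is normalization: each marginal pdf must integrate to $1$, so
$$1 = \int_{\mathbb R} f_X(x)\,dx = k \int_{\mathbb R} f_H(x)\,dx = k,$$
and hence $k = 1$ and $f_X = f_Y = f_H$.)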
Is there a way to argue that
$$P(X = \max\{X,Y\}) = P(Y = \max\{X,Y\})$$
using basic probability and without integration?
We have the joint probability density function: $f_{X,Y}(x,y) := f_H(x)f_H(y)$ where $f_H$ is itself a probability density function of a continuous random variable.
Then we find that both marginals are $f_H$. $$\begin{align}f_X(x) =&~ \int_\Bbb R f_H(x)f_H(t)\operatorname d t \\[1ex]=&~ f_H(x) \int_\Bbb R f_H(t)\operatorname d t \\[1ex] =&~ f_H(x)\\[2ex]f_Y(y) = &~ f_H(y) & \textsf{similarly} \\[2ex]\therefore~ f_{X,Y}(x,y) =& ~ f_X(x)\cdot f_Y(y) \end{align}$$
Hence $X$ and $Y$ are independent and identically distributed, each with a continuous distribution.
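From there, a symmetry (exchangeability) argument can finish the proof, sketched as follows: since $(X,Y)$ and $(Y,X)$ have the same joint density,
$$P(X \ge Y) = P(Y \ge X),$$
and since the events $\{X \ge Y\}$ and $\{Y \ge X\}$ cover everything and overlap only on $\{X = Y\}$, which has probability $0$ for continuous distributions,
$$P(X \ge Y) + P(Y \ge X) = 1 + P(X = Y) = 1 \quad\Rightarrow\quad P(X \ge Y) = \tfrac12.$$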