Suppose I have $X \sim \text{Uniform}(0,1)$ and $Y \sim \text{Uniform}(0,1)$. As is well known, $X+Y$ has a triangular distribution. What about $X+X$? Surely it is uniformly distributed on the interval $[0,2]$. The former is a function of two random variables, while the latter seems to be a function of one, masquerading as a function of two.
But does $X=Y$? That is, are they the same mathematical object? If they are the same, then $X+Y = X+X$. Conversely, if $X \neq Y$, where exactly do they differ? They are defined on the same probability space and are the same function; every component of their formal mathematical definitions is identical, so they must be the same.
Perhaps one might argue that by calling one $X$ and the other $Y$ we are differentiating them. But this seems to me to confuse a variable name with the object it refers to. If we were to write $X=\{1,2,3\}$ and $Y=\{1,2,3\}$ we would happily assert that $X=Y$. What makes this different when the objects are probability spaces and functions defined on them?
This problem arose practically when implementing a purely functional probabilistic programming language in which I can write:
let x = uniform(0,1)
let y = uniform(0,1)
x + y
where x and y are not samples drawn from distributions but refer to the distributions themselves. These objects must be equivalent; their names x and y do not differentiate them. If I wanted to differentiate them, it seems I would need some additional argument to give each one an identity:
let x = uniform(0,1,'x')
let y = uniform(0,1,'y')
x + y
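One way this identity-tagging idea can be realized (a minimal sketch, not the actual implementation; the `RV` class and its methods are hypothetical) is to give each call to `uniform` a fresh identity and interpret every expression as a function of a shared "world" of samples keyed by that identity. Then a variable added to itself reuses its own sample, while two distinct variables draw independently:

```python
import random

class RV:
    """A random variable: a function from a 'world' (a dict of
    samples keyed by identity) to a realized value."""
    def __init__(self, sample_fn):
        self.sample_fn = sample_fn  # world -> value

    def __add__(self, other):
        # Both operands are evaluated against the SAME world,
        # so shared identities yield shared samples.
        return RV(lambda world: self.sample_fn(world) + other.sample_fn(world))

    def draw(self):
        # Each draw starts from a fresh, empty world.
        return self.sample_fn({})

def uniform(a, b):
    ident = object()  # fresh identity for this particular call to uniform
    def sample(world):
        if ident not in world:
            world[ident] = random.uniform(a, b)
        return world[ident]
    return RV(sample)

x = uniform(0, 1)
y = uniform(0, 1)

# (x + x).draw() reuses one sample: distributed as 2X, uniform on [0, 2].
# (x + y).draw() uses two independent samples: triangular on [0, 2].
```

Here the identity is supplied implicitly by the side effect of calling `uniform`, which is exactly the impurity the question is circling: in a purely functional setting the two calls would denote the same value, and the identity has to come from somewhere else.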
Is it the case that when we write $X \sim \text{Uniform}(0,1)$ and $Y \sim \text{Uniform}(0,1)$ we are secretly assigning identities as in the code above? Or is there some other explanation, or is it even the case that $X+X = X+Y$?
For $X+Y$ to have a triangular distribution, $X$ and $Y$ must be independent, and $X$ is not independent of itself. The notation $X \sim \text{Uniform}(0,1)$ only specifies the distribution of $X$; two random variables can be equal in distribution without being equal as functions on the sample space, so $X \sim \text{Uniform}(0,1)$ and $Y \sim \text{Uniform}(0,1)$ does not imply $X = Y$.
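A quick simulation (a sketch using only the Python standard library) makes the difference concrete: $X+X = 2X$ is uniform on $[0,2]$ with variance $4 \cdot \tfrac{1}{12} = \tfrac{1}{3}$, while the sum of two independent uniforms is triangular with variance $\tfrac{1}{12} + \tfrac{1}{12} = \tfrac{1}{6}$.

```python
import random
from statistics import pvariance

random.seed(0)
n = 100_000

xs = [random.uniform(0, 1) for _ in range(n)]
ys = [random.uniform(0, 1) for _ in range(n)]

same = [x + x for x in xs]                # X + X = 2X: uniform on [0, 2]
indep = [x + y for x, y in zip(xs, ys)]   # X + Y, independent: triangular on [0, 2]

# Var(2X) = 4 * Var(X) = 1/3; Var(X + Y) = Var(X) + Var(Y) = 1/6
print(pvariance(same), pvariance(indep))
```

The two empirical variances come out near $1/3$ and $1/6$ respectively, confirming that the two sums have different distributions even though $X$ and $Y$ are identically distributed.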