Say we have X ~ N(10, 100). It seems to hold that X + X ~ N(20, 200); however, if we multiply X by a constant, we have to multiply the variance by the square of that constant. Take for example 2: then we have 2X ~ N(20, 100 * 2^2) and thus 2X ~ N(20, 400).
Don't X + X and 2X denote exactly the same thing? I feel like there's a difference between the two that I don't understand.
Thanks
Here is a simpler example. Suppose $A$ and $B$ are each independent uniform random variables between $0$ and $1$. We could have $A = 0.25$ and $B = 0.8$, for example, as an outcome. So their sum is $$A+B = 1.05.$$ But now $2A = 0.5$. The variable $2A$ is uniform on $[0,2]$, because the outcome of $A$ is uniform on $[0,1]$ and you are just scaling up the result by $2$. But $A+B$ is not uniform on $[0,2]$, although its support is on this interval. Intuitively, this is because in order for $A+B$ to be "close to" $2$, both $A$ and $B$ have to be close to $1$. But there are many more ways for $A+B$ to be "close to" $1$, because either $A$ can be large and $B$ can be small, or vice versa.
To see this explicitly, we can compare this to a discrete distribution: rolling two fair dice numbered from $1$ to $6$. How many ways are there to get a sum of $12$? There is only one way: $(6,6)$. But how many ways are there to get a sum of $7$? There are six ways: $$(1,6), (2,5), (3,4), (4,3), (5,2), (6,1).$$
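As a quick sanity check, we can count these outcomes directly (a minimal Python sketch; the variable names are my own):

```python
from itertools import product
from collections import Counter

# Count how many ordered pairs of dice rolls produce each possible sum
sums = Counter(a + b for a, b in product(range(1, 7), repeat=2))

print(sums[12])  # 1 -- only (6, 6)
print(sums[7])   # 6 -- the six ordered pairs listed above
```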
So now that we understand that the distribution of $A+B$ is not the same as $2A$, it is not too difficult to see that their variances will also be different. Concretely, for independent $A$ and $B$ we have $$\operatorname{Var}(A+B) = \operatorname{Var}(A) + \operatorname{Var}(B) = 2\operatorname{Var}(A),$$ whereas $$\operatorname{Var}(2A) = 4\operatorname{Var}(A).$$
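A small simulation makes the difference in variances visible (a sketch using only the Python standard library; the sample size and seed are arbitrary choices of mine). Since $\operatorname{Var}(A) = 1/12$ for a uniform variable on $[0,1]$, we expect $\operatorname{Var}(A+B) \approx 1/6$ but $\operatorname{Var}(2A) \approx 1/3$:

```python
import random
from statistics import variance

random.seed(42)
N = 100_000

# Two independent samples of uniform random variables on [0, 1]
A = [random.random() for _ in range(N)]
B = [random.random() for _ in range(N)]

var_sum = variance(a + b for a, b in zip(A, B))  # Var(A + B) = 2 * (1/12) = 1/6
var_scaled = variance(2 * a for a in A)          # Var(2A) = 4 * (1/12) = 1/3

print(var_sum, var_scaled)
```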
To bring our conversation back to the normal distribution, we can see that you have a misapprehension here. $X+X$ is not an appropriate way to describe a random variable that represents the sum of two independent but identically distributed (IID) normal random variables. In other words, if by $X+X$ you mean to say, "draw two realizations from a normal distribution with mean $\mu$ and variance $\sigma^2$," then this is not the correct notation. Instead, you should write $$X_1 + X_2,$$ where $$X_i \sim \operatorname{Normal}(\mu,\sigma^2), \quad i = 1, 2, \ldots.$$ Then the sum of these IID normal random variables is also normal: $$X_1 + X_2 \sim \operatorname{Normal}(2\mu, 2\sigma^2).$$ But the random variable $2X_1$ does not represent drawing two normal random variables. It means drawing one random variable and multiplying it by $2$. And as we explained with our examples above, this is a different distribution than the sum of independent normal random variables; while it is normal, it has a different variance.
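Returning to the original numbers, with $\mu = 10$ and $\sigma^2 = 100$ a short simulation confirms both variances (again a standard-library sketch; the sample size and seed are my own choices):

```python
import random
from statistics import variance

random.seed(0)
N = 200_000
mu, sigma = 10, 10  # sigma is the standard deviation, so sigma^2 = 100

# Two independent draws from Normal(10, 100)
X1 = [random.gauss(mu, sigma) for _ in range(N)]
X2 = [random.gauss(mu, sigma) for _ in range(N)]

var_iid = variance(x1 + x2 for x1, x2 in zip(X1, X2))  # expect about 2 * 100 = 200
var_2x = variance(2 * x1 for x1 in X1)                 # expect about 4 * 100 = 400

print(var_iid, var_2x)
```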