Let's say we have a sequence of $n$ IID random variables, $I_i$. Let's define a new random variable which is their sum:
$$S = \sum_i I_i$$
To calculate the variance of $S$, we can use the fact that variances of independent variables add:
$$V(S) = \sum_i V(I_i) = nV(I_1)$$
Alternatively, since the $I_i$ are identically distributed, we might argue that $S$ has the same distribution as $nI_1$,
so we should have $V(S) = V(nI_1) = n^2V(I_1)$.
This is, of course, in direct contradiction to the result above. What am I missing?
$S$ does not have the same distribution as $nI_1.$ In fact, you have proven this yourself by showing they have different variances.

The key distinction is that summing $n$ independent copies is not the same as scaling a single copy by $n$: in the sum, the individual fluctuations partially cancel, while scaling one variable amplifies its fluctuation by the full factor of $n$.

As an example, if $I_1$ can take the values $0$ and $1$ (i.e. it is a Bernoulli variable), then $2I_1$ can take only the values $0$ and $2.$ However, $I_1+I_2$ can take the value $0$ (if both are zero), $1$ (if one is zero and the other is one), or $2$ (if both are one). The intermediate value $1$ is possible for the sum but impossible for $2I_1$, so the two cannot have the same distribution.
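A quick simulation makes the gap concrete (a sketch; the choices of $n$, $p$, and the number of trials are arbitrary):

```python
import random
import statistics

random.seed(0)
n = 10             # number of IID variables (arbitrary choice)
p = 0.5            # Bernoulli success probability (arbitrary choice)
trials = 100_000   # number of Monte Carlo samples

def bernoulli() -> int:
    """One draw of a Bernoulli(p) variable."""
    return 1 if random.random() < p else 0

# S = I_1 + ... + I_n: its variance should be near n*p*(1-p) = 2.5
s_samples = [sum(bernoulli() for _ in range(n)) for _ in range(trials)]

# n * I_1: its variance should be near n^2*p*(1-p) = 25
scaled_samples = [n * bernoulli() for _ in range(trials)]

print(statistics.pvariance(s_samples))       # ≈ 2.5
print(statistics.pvariance(scaled_samples))  # ≈ 25.0
```

The empirical variances differ by roughly a factor of $n$, matching $nV(I_1)$ for the sum versus $n^2V(I_1)$ for the scaled variable.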