I was learning that $V(X+Y)$ doesn't equal $V(X)+V(Y)$ when the two random variables aren't independent, and wanted a more mathematical understanding, so I expanded $V(X+Y) = E[(X+Y)^2] - E[X+Y]^2$ and boiled it down to the two terms in the title. I wanted to use two sets of random numbers to see how dependence affects the result, but how would I know that the samples I've chosen are dependent? I would have to know more context about where the two samples come from.
So ultimately, how is $E[XY] - E[X]E[Y]$ affected when $X$ and $Y$ aren't independent?
In general, $Cov[X,Y]=E[(X-E[X])(Y-E[Y])]=E[XY]-E[X]E[Y]$ is the covariance, and the variance of a sum is $Var[X+Y]=Var[X]+Var[Y]+2Cov[X,Y]$.
When $X,Y$ are independent, $E[XY]=E[X]E[Y]$, so $Cov[X,Y]=0$ and hence $Var[X+Y]=Var[X]+Var[Y]$.
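You can see this empirically without knowing anything about where the samples come from, by constructing the dependence yourself. A minimal Python sketch (the Gaussian draws and the $Y = X + \text{noise}$ construction are illustrative choices, not from the question):

```python
import random

random.seed(0)
n = 100_000

def mean(v):
    return sum(v) / len(v)

def cov(a, b):
    # Sample estimate of E[XY] - E[X]E[Y], in its centered form.
    ma, mb = mean(a), mean(b)
    return mean([(x - ma) * (y - mb) for x, y in zip(a, b)])

# Independent case: X and Y are drawn separately.
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [random.gauss(0, 1) for _ in range(n)]
cov_ind = cov(xs, ys)  # close to 0

# Dependent case: Y is built from X, so knowing X tells you about Y.
ys_dep = [x + random.gauss(0, 1) for x in xs]
cov_dep = cov(xs, ys_dep)  # close to Cov[X, X + noise] = Var[X] = 1

print(cov_ind, cov_dep)
```

The point is that dependence here is something you build into the sampling, not something you have to detect after the fact.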
When $Cov[X,Y]\neq 0$, $X$ and $Y$ are not independent.
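Note that the converse fails: $Cov[X,Y]=0$ does not imply independence. A quick Python illustration (the uniform distribution is just one convenient symmetric choice):

```python
import random

random.seed(1)
n = 100_000

# X symmetric about 0 and Y = X^2: Y is completely determined by X,
# yet Cov[X, Y] = E[X^3] - E[X] E[X^2] = 0 - 0 = 0 by symmetry.
xs = [random.uniform(-1, 1) for _ in range(n)]
ys = [x * x for x in xs]

mx = sum(xs) / n
my = sum(ys) / n
cov_xy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
print(cov_xy)  # near 0 even though Y is a function of X
```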
If $Cov[X,Y]=E[(X-E[X])(Y-E[Y])]<0$, then intuitively $(X-E[X])(Y-E[Y])$ is negative more often than it is positive, so $X-E[X]$ and $Y-E[Y]$ more often have opposite signs than the same sign. In other words, $X$ is more often on the opposite side of its mean from where $Y$ is relative to its own mean. So $X$ and $Y$ tend to move in opposite directions, and their movements cancel each other out in the sum $X+Y$, which reduces the total variance.
If $Cov[X,Y]=E[(X-E[X])(Y-E[Y])]>0$, then $X$ and $Y$ more often move in the same direction, so their movements reinforce each other more often than not, which increases the variance.
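Both effects show up directly in simulation. A sketch with arbitrarily chosen constructions ($Y = X + \text{noise}$ for positive covariance, $Y = -X + \text{noise}$ for negative):

```python
import random

random.seed(2)
n = 100_000

def var(v):
    m = sum(v) / len(v)
    return sum((x - m) ** 2 for x in v) / len(v)

xs = [random.gauss(0, 1) for _ in range(n)]
noise = [random.gauss(0, 1) for _ in range(n)]

# Positive covariance: Y tends to move with X, so their swings add up.
y_pos = [x + e for x, e in zip(xs, noise)]
# Negative covariance: Y tends to move against X, so their swings cancel.
y_neg = [-x + e for x, e in zip(xs, noise)]

# In both cases Var[X] + Var[Y] = 1 + 2 = 3, but the variance of the sum is
# 3 + 2 Cov[X, Y], which differs in the two cases:
v_pos = var([x + y for x, y in zip(xs, y_pos)])  # about 5 = 3 + 2(1)
v_neg = var([x + y for x, y in zip(xs, y_neg)])  # about 1 = 3 + 2(-1)
print(v_pos, v_neg)
```

The two printed variances bracket the independent-case value of $3$ from above and below, exactly as the $2Cov[X,Y]$ term predicts.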