Let $X,Y$ be two dependent random variables.
Let $\epsilon_1,\epsilon_2$ be two independent draws from a normal distribution.
Are $X+\epsilon_1$, $Y+\epsilon_2$ independent? Or at least uncorrelated? Or maybe it depends?
Ex: Suppose that twice between 10 and 11am we ask someone the time of day, and that the second time we ask is 1 minute after the first time we ask.
Let $X$ be the RV for their first answer, and $Y$ the RV for their second answer, and $x,y$ denote the respective realizations
(and suppose only minutes matter, not seconds, so there are only 60 possible values for $x$).
Then $X,Y$ are dependent because $y=x+1$ always.
Suppose, however, that they report the time with some error, drawn from a normal distribution. That is, we observe $Z_1=x+\epsilon_1$, $Z_2=y+\epsilon_2$, where $\epsilon_i$ is the error the ith time we ask.
Are $Z_1,Z_2$ independent? or uncorrelated?
My brief thoughts:
When I try to think about this, I reason as follows: if the error $\epsilon_i$ is small, then $Z_2 \approx Z_1 + 1$, which suggests $Z_1,Z_2$ are correlated; but if the error is large, they would look uncorrelated. (Even then, though, I think their distributions are dependent, because $X$ is effectively "centering" $Z_2$.)
However... I'm pretty sure that if we observed $Z_3=5+\epsilon_1$, $Z_4=5+\epsilon_2$, these would be independent... which confuses me: why is adding a constant different from adding a realization of a random variable?
To tackle your last question first:
Adding a constant to two independent error terms means they are still independent.
Adding the same random variable to two independent error terms means the results are dependent.
Adding two different but correlated random variables to two independent error terms yields correlated (hence dependent) results.
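A quick simulation illustrates all three cases. This is just a sketch: I use standard normal errors, and for the third case the minute-of-hour example from your question ($Y = X + 1$); the sample size and seed are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
eps1 = rng.normal(size=n)  # independent N(0, 1) errors
eps2 = rng.normal(size=n)

# Case 1: the same constant added to both errors -> still independent
corr_const = np.corrcoef(5 + eps1, 5 + eps2)[0, 1]

# Case 2: the same random variable added to both errors -> dependent
w = rng.normal(size=n)
corr_same_rv = np.corrcoef(w + eps1, w + eps2)[0, 1]

# Case 3: two perfectly correlated RVs, Y = X + 1 (minutes past 10am)
x = rng.integers(0, 60, size=n).astype(float)
y = x + 1
corr_xy = np.corrcoef(x + eps1, y + eps2)[0, 1]

print(corr_const, corr_same_rv, corr_xy)
```

The first estimate hovers near 0, the second near $0.5$ (here $\operatorname{Var}(W)=\operatorname{Var}(\epsilon_i)=1$), and the third near 1, since $\operatorname{Var}(X)$ is large relative to the noise.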
Now for a calculation: start with
$$\operatorname{Cov}(Z_1, Z_2) = E[Z_1 Z_2] - E[Z_1]E[Z_2]$$
expand it all out, use that $y = x + 1$ always (so $\operatorname{Cor}(X,Y) = 1$), and you will get a nonzero answer.
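For completeness, here is the expansion, using bilinearity of covariance and assuming (as in the setup) that $\epsilon_1,\epsilon_2$ are independent of each other and of $X,Y$:

$$\operatorname{Cov}(Z_1, Z_2) = \operatorname{Cov}(X+\epsilon_1,\, Y+\epsilon_2) = \operatorname{Cov}(X,Y) + \operatorname{Cov}(X,\epsilon_2) + \operatorname{Cov}(\epsilon_1,Y) + \operatorname{Cov}(\epsilon_1,\epsilon_2) = \operatorname{Cov}(X,Y).$$

Since $Y = X + 1$, we get $\operatorname{Cov}(X,Y) = \operatorname{Var}(X) > 0$, so $Z_1$ and $Z_2$ are correlated, and therefore dependent. In your $Z_3, Z_4$ example the role of $X$ is played by the constant $5$, whose variance is $0$, which is exactly why adding a constant behaves differently from adding a realization of a random variable.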