Suppose $X$ and $Y$ are independent random variables. The mgf for $X$ is $M_{X}(t) = (1-t)^{-1}, \ t<1$. The mgf for $Y$ is $M_{Y}(t) = (1-2t)^{-1}, \ t< 0.5$. Two other random variables $U$ and $V$ are defined by $U = \frac{1}{2}(X+Y)$ and $V = \frac{1}{2}(X-Y)$. Calculate $\text{Cov}(U,V)$.
By definition, $\text{Cov}(U,V) = E(UV)-E(U)E(V)$, which is the same as $\text{Cov}(\frac{1}{2}(X+Y), \frac{1}{2}(X-Y))$. Expanding by bilinearity, we get
$\text{Cov}(\frac{1}{2}(X+Y), \frac{1}{2}(X-Y)) = \frac{1}{4} \text{Var}(X) -\frac{1}{4} \text{Cov}(X,Y)+\frac{1}{4} \text{Cov}(Y,X)-\frac{1}{4} \text{Var}(Y)$, which equals $\frac{1}{4}(\text{Var}(X)-\text{Var}(Y))$ since $\text{Cov}(X,Y) = \text{Cov}(Y,X)$ and the cross terms cancel.
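(As a sanity check, not part of the derivation: the given mgfs identify $X$ and $Y$ as exponential with means $1$ and $2$, hence variances $1$ and $4$, so the formula predicts $\frac{1}{4}(1-4) = -\frac{3}{4}$. A quick NumPy simulation agrees.)

```python
import numpy as np

# Monte Carlo sanity check of Cov(U, V) = (Var(X) - Var(Y)) / 4.
# X, Y are independent exponentials with means 1 and 2, matching
# the given mgfs (1 - t)^{-1} and (1 - 2t)^{-1}.
rng = np.random.default_rng(0)
n = 2_000_000
x = rng.exponential(scale=1.0, size=n)
y = rng.exponential(scale=2.0, size=n)
u = (x + y) / 2
v = (x - y) / 2
print(np.cov(u, v)[0, 1])  # close to -0.75
```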
Is there an easier way to obtain this without going through all this work? Could one use the fact $U+V = X$ and $U-V = Y$?
The method you have is pretty short already. I'd have pulled out the $\frac{1}{4}$ at the start, but otherwise the straightforward way is a lot faster than trying to think up something trickier. Sometimes straightforward is best.
The moment generating functions indicate that $X$ is exponential with mean $1$, so its variance is also $1$, and $Y$ is exponential with mean $2$, so its variance is $4$. Both are special cases of the gamma distribution, so I was hoping the theorem that a sum of independent gammas is gamma would help, but that theorem requires a common scale parameter, and here the shape parameters match while the scales differ.
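(Those means and variances can be read straight off the mgfs, since the $n$-th derivative of $M$ at $t=0$ is the $n$-th moment. A short SymPy check, just to confirm the identification:)

```python
import sympy as sp

# Recover each mean and variance from its mgf:
# E[.^n] = M^(n)(0), so Var = M''(0) - M'(0)^2.
t = sp.symbols('t')
mgfs = {"X": 1/(1 - t), "Y": 1/(1 - 2*t)}
stats = {}
for name, M in mgfs.items():
    m1 = sp.diff(M, t).subs(t, 0)     # mean
    m2 = sp.diff(M, t, 2).subs(t, 0)  # second moment
    stats[name] = (m1, m2 - m1**2)    # (mean, variance)
print(stats)  # {'X': (1, 1), 'Y': (2, 4)}
```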
We can use $M_{aX}(t) = M_{X}(at)$ to find
$M_{\frac{1}{2}X}(t) = \frac{1}{1-\frac{1}{2}t}$, $M_{\frac{1}{2}Y}(t) = \frac{1}{1-t}$, $M_{-\frac{1}{2}Y}(t) = \frac{1}{1+t}$,
giving $M_{U}(t) = \frac{1}{(1-\frac{1}{2}t)(1-t)}$ and $M_{V}(t) = \frac{1}{(1-\frac{1}{2}t)(1+t)}$,
but $U$ and $V$ are not independent, so their marginal mgfs say nothing about $E(UV)$, and that approach doesn't seem to be fruitful either.
Really, I think the answer to your question is no: there isn't a faster, easier, or more elegant way to do it than what you did or what was given in the comment.
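(One small consolation: every moment needed is a derivative of a given mgf at $0$, so the computation is easy to verify mechanically. Since $UV = \frac{1}{4}(X+Y)(X-Y) = \frac{1}{4}(X^2 - Y^2)$ identically, $E(UV) = \frac{1}{4}(E(X^2) - E(Y^2))$ by linearity, and a SymPy sketch using only the stated mgfs confirms $\text{Cov}(U,V) = -\frac{3}{4}$:)

```python
import sympy as sp

# Cov(U, V) from mgf moments alone: E[.^n] = M^(n)(0).
t = sp.symbols('t')
MX = 1/(1 - t)    # mgf of X
MY = 1/(1 - 2*t)  # mgf of Y
EX, EX2 = sp.diff(MX, t).subs(t, 0), sp.diff(MX, t, 2).subs(t, 0)
EY, EY2 = sp.diff(MY, t).subs(t, 0), sp.diff(MY, t, 2).subs(t, 0)
EU = (EX + EY) / 2
EV = (EX - EY) / 2
EUV = (EX2 - EY2) / 4  # UV = (X^2 - Y^2)/4 identically
print(EUV - EU * EV)   # -3/4
```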