I have a random variable $X$ with mean $E[X]$ and a random variable $Y$ with mean $E[Y]$. Both $X$ and $Y$ have the same variance, but not necessarily the same mean, and they are not necessarily independent. I have a third random variable $Z$ that is a mixture random variable taking the value of $X$ with probability $\frac12$ and of $Y$ with probability $\frac12$. I want to calculate the variance of $Z$.
I attempted it as follows, but I'm not sure I am allowed to split the expectation this way:
$$\textrm{Var}(Z) = E[Z^2] - E[Z]^2 = (\frac14E[X^2] + \frac12 E[XY] + \frac14E[Y^2]) - (\frac12E[X] + \frac12E[Y])^2 = \frac14 \textrm{Var}(X) + \frac14 \textrm{Var}(Y) + \frac12 \textrm{Cov}(X,Y)$$
If this is indeed correct, am I also correct in saying that $Z$ can only have the same variance as $X$ and $Y$ if $X = Y$ or if $X$ and $Y$ have the same mean and are independent?
$\mathbb E[Z^2]$ is not equal to $\mathbb E\left[\left(\frac{X+Y}{2}\right)^2\right]$, as your second equality assumes.
If I understand correctly, the choice of which variable $Z$ takes is independent of $X$ and $Y$. In this case we can rewrite $Z$ as $$ Z=\beta X+(1-\beta)Y $$ where $\beta$ is a Bernoulli random variable independent of $X,Y$ with $\mathbb P(\beta=0)=\mathbb P(\beta=1)=\frac12$.
Then $$Z^2=\beta^2 X^2+(1-\beta)^2Y^2+2\beta(1-\beta)XY=\beta X^2+(1-\beta)Y^2$$ since $\beta^2=\beta$, $(1-\beta)^2=(1-\beta)$ and $\beta(1-\beta)=0$.
And $$ \mathbb E [Z^2] = \mathbb E\left[\beta X^2+(1-\beta)Y^2\right] = \frac12\mathbb E[X^2]+\frac12\mathbb E[Y^2] $$ $$ = \frac12\left(\textrm{Var}(X)+\mathbb E[X]^2\right)+\frac12\left(\textrm{Var}(Y)+\mathbb E[Y]^2\right). $$ Substitute this into $\textrm{Var}(Z)$: $$ \textrm{Var}(Z) = \frac12\left(\textrm{Var}(X)+\mathbb E[X]^2\right)+\frac12\left(\textrm{Var}(Y)+\mathbb E[Y]^2\right) - \left(\frac12\mathbb E[X] + \frac12\mathbb E[Y]\right)^2 $$ $$ =\frac12\textrm{Var}(X)+\frac12\textrm{Var}(Y) + \frac14\left(\mathbb E[X]-\mathbb E[Y]\right)^2. $$ So the covariance of $X$ and $Y$ cannot appear, since the product $XY$ never arises: the events leading to $X$ and to $Y$ are disjoint, so the two variables never meet.
From the last expression we see that if the variances of $X$ and $Y$ are equal, then $Z$ has that same variance if and only if their expectations coincide as well.
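The derivation above is easy to check numerically. Below is a small Monte Carlo sketch (with made-up parameters: $X,Y$ correlated normals with variance $1$ and means $2$ and $5$) comparing the empirical variance of the mixture $Z=\beta X+(1-\beta)Y$ against the formula $\frac12\textrm{Var}(X)+\frac12\textrm{Var}(Y)+\frac14(\mathbb E[X]-\mathbb E[Y])^2$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Hypothetical example: X ~ N(2, 1); Y built so that Var(Y) = 1,
# E[Y] = 5, and Corr(X, Y) = 0.5 (so X, Y are dependent).
x = rng.normal(2.0, 1.0, n)
y = 0.5 * x + rng.normal(4.0, np.sqrt(0.75), n)

# Fair Bernoulli indicator, independent of X and Y, selects the mixture component.
beta = rng.integers(0, 2, n)
z = beta * x + (1 - beta) * y

# Formula from the answer: note the covariance of X and Y plays no role.
predicted = 0.5 * x.var() + 0.5 * y.var() + 0.25 * (x.mean() - y.mean()) ** 2

print(z.var(), predicted)  # the two estimates should agree closely
```

With these parameters the formula gives $\frac12\cdot1+\frac12\cdot1+\frac14(2-5)^2=3.25$, strictly larger than the common variance $1$ of $X$ and $Y$, exactly because the means differ.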