Let $X_1, X_2, \ldots$ be i.i.d. Bernoulli random variables with parameter $1/4$, let $Y_1, Y_2, \ldots$ be another sequence of i.i.d. Bernoulli random variables with parameter $3/4$, and let $N$ be a geometric random variable with parameter $1/2$. Assume the $X_i$'s, $Y_j$'s and $N$ are all independent. Compute $\operatorname{Cov}\left(\sum_{i=1}^{N} X_i, \sum_{i=1}^{N} Y_i\right)$.
Since the $X_i$'s and $Y_i$'s are independent random variables, I concluded that the covariance of the two sums is $0$. However, I strongly suspect that the random upper limit plays a part here. Since both sums have $N$ as a common upper limit, can they still be considered independent?
This is one of those situations where it is clearer to write the limits of summation: let $$S = \sum_{i=1}^N X_i, \quad T = \sum_{i=1}^N Y_i.$$ Then $S$ and $T$ certainly have positive covariance, because they both depend on $N$ in the same way. Had the upper limits been fixed, say at $5$ for the $X_i$ and $11$ for the $Y_i$, the covariance would be zero. But as the question is framed, knowledge of $S$, for instance, carries information about $N$ when $N$ is unknown, and hence informs the value of $T$; e.g., observing $S = 5$ implies $N \ge 5$.
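As a quick sanity check on the fixed-limits case, here is a short Python simulation (my own illustration, not part of the original answer): with fixed upper limits the two sums are built from independent sequences, so their sample covariance should be near zero.

```python
import numpy as np

# Fixed upper limits: S = X_1 + ... + X_5, T = Y_1 + ... + Y_11.
# A sum of m independent Bernoulli(p)'s is Binomial(m, p).
rng = np.random.default_rng(0)
m = 10**6
S = rng.binomial(5, 0.25, size=m)    # 5 X_i's with p_x = 1/4
T = rng.binomial(11, 0.75, size=m)   # 11 Y_i's with p_y = 3/4
print(np.cov(S, T)[0, 1])            # close to 0
```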
To compute the covariance, it is useful to employ the law of total expectation. We first compute $$\operatorname{E}[S] = \operatorname{E}[\operatorname{E}[S \mid N]] = \operatorname{E}[N p_x] = \frac{p_x}{\theta},$$ where I have taken $p_x$ to be the Bernoulli parameter for $X_i$, and I am using the parametrization $$\Pr[N = n] = (1-\theta)^{n-1} \theta, \quad n \in \{1, 2, \ldots\},$$ so $\theta$ is the geometric distribution parameter. Similarly, $$\operatorname{E}[T] = \frac{p_y}{\theta}.$$ We now turn our attention to the computation of $$\operatorname{E}[ST] = \operatorname{E}[\operatorname{E}[ST \mid N]].$$ Conditional on $N$, the sums $S$ and $T$ are independent (they are built from independent sequences), so the inner expectation factors: $$\operatorname{E}[ST \mid N] = \operatorname{E}[S \mid N]\operatorname{E}[T \mid N] = Np_x \cdot Np_y = N^2 p_x p_y.$$ Taking the outer expectation with respect to $N$, and using $\operatorname{E}[N^2] = \operatorname{Var}[N] + \operatorname{E}[N]^2 = \frac{1-\theta}{\theta^2} + \frac{1}{\theta^2} = \frac{2-\theta}{\theta^2}$, we obtain $$\operatorname{E}[ST] = p_x p_y \operatorname{E}[N^2] = \frac{2-\theta}{\theta^2} p_x p_y.$$ Consequently, $$\operatorname{Cov}[S,T] = \operatorname{E}[ST] - \operatorname{E}[S]\operatorname{E}[T] = \frac{2-\theta}{\theta^2} p_x p_y - \frac{p_x p_y}{\theta^2} = \frac{1-\theta}{\theta^2} p_x p_y.$$ It is important to understand that this is the unconditional covariance of $S$ and $T$; the conditional covariance $\operatorname{Cov}[S, T \mid N]$ is $0$.
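The closed-form pieces above are easy to check numerically; the following Python snippet (my own check, not part of the original answer) truncates the geometric series for $\theta = 1/2$, $p_x = 1/4$, $p_y = 3/4$:

```python
# Numerical check of E[N], E[N^2], and Cov[S,T] for theta = 1/2, p_x = 1/4, p_y = 3/4.
theta, px, py = 0.5, 0.25, 0.75

# Truncate P(N = n) = (1 - theta)**(n - 1) * theta at a large n; the tail is negligible.
EN  = sum(n * (1 - theta)**(n - 1) * theta for n in range(1, 200))     # 1/theta = 2
EN2 = sum(n**2 * (1 - theta)**(n - 1) * theta for n in range(1, 200))  # (2 - theta)/theta^2 = 6

cov = px * py * EN2 - (px * EN) * (py * EN)   # E[ST] - E[S]E[T]
print(cov)                                    # (1 - theta)/theta^2 * px * py = 0.375
```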
Had we used the alternative parametrization for $N$, $$\Pr[N = n] = (1-\theta)^n \theta, \quad n \in \{0, 1, 2, \ldots\},$$ then $\operatorname{E}[N] = \frac{1-\theta}{\theta}$ and $\operatorname{E}[N^2] = \frac{(1-\theta)(2-\theta)}{\theta^2}$, and the same calculation gives $$\operatorname{Cov}[S,T] = \frac{(1-\theta)(2-\theta)}{\theta^2} p_x p_y - \frac{(1-\theta)^2}{\theta^2} p_x p_y = \frac{1-\theta}{\theta^2} p_x p_y,$$ exactly as before. This is no accident: in either case $\operatorname{Cov}[S,T] = p_x p_y \operatorname{Var}[N]$, and shifting $N$ by one changes its mean but not its variance.
I simulated the covariance in Mathematica for the case $p_x = 1/4$, $p_y = 3/4$, $\theta = 1/2$ as stated in the question, with the number of simulations equal to $10^6$.
This gave a result of $0.375653$, which is close to the theoretical value $3/8$. The same can be done with the alternate parametrization by removing the $+1$ in the code (Mathematica's GeometricDistribution counts failures before the first success, with support $\{0, 1, 2, \ldots\}$, so the $+1$ shifts the support to $\{1, 2, \ldots\}$); as the formula above predicts, the simulated covariance is again approximately $3/8$.
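Since the original Mathematica code is not reproduced above, here is a rough Python equivalent of the simulation (note that numpy's geometric sampler already uses the support $\{1, 2, \ldots\}$, so no $+1$ shift is needed):

```python
import numpy as np

rng = np.random.default_rng(42)
theta, px, py = 0.5, 0.25, 0.75
trials = 10**6

# N ~ Geometric(theta) on {1, 2, ...} (numpy counts trials up to the first success).
N = rng.geometric(theta, size=trials)

# Conditional on N = n, S ~ Binomial(n, p_x) and T ~ Binomial(n, p_y), independently.
S = rng.binomial(N, px)
T = rng.binomial(N, py)

print(np.cov(S, T)[0, 1])   # close to 3/8 = 0.375
```

Replacing `N` with `N - 1` simulates the alternative parametrization on $\{0, 1, 2, \ldots\}$, and for these parameters the sample covariance again comes out near $3/8$.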