For the random variable Y constructed as follows:
$$Y = \sum_{i=1}^{T} X_i$$ where $T \sim \text{Poisson}(\lambda)$ with $\lambda > 0$, and $\{X_i\}_{i=1}^{T}$ is an independent and identically distributed sample of size $T$ from a Poisson distribution with mean $\theta$.
I have calculated the method of moments estimator for $\theta$ when $\lambda$ is known: $\hat\theta = \bar{Y}/\lambda$.
I now need to derive a method of moments estimator for $(\theta, \lambda)$ based on the sample mean and variance, assuming $\lambda$ is unknown.
I understand that I need to use the law of total variance, but I'm not sure how to proceed.
First we use that the sum of $t$ i.i.d. Poisson($\theta$) random variables has distribution Poisson($t\theta$). With this, we get that $Y \mid T \sim \text{Poisson}(T\theta)$ and therefore $\mathbb{E}[Y \mid T] = T\theta$. In particular, $\mathbb{E}[Y] = \mathbb{E}\big[\mathbb{E}[Y \mid T]\big] = \theta\,\mathbb{E}[T] = \lambda\theta$, which gives the first moment equation.
Now, let us compute the variance. As you mentioned, we can use the law of total variance $$ Var[Y] = Var[\mathbb{E}[Y \mid T]] + \mathbb{E}\left[Var[Y \mid T]\right]. $$ Notice that $Var[\mathbb{E}[Y \mid T]] = Var[\theta T] = \theta^2 Var[T] = \theta^2 \lambda$.
Now, for the second part, $$ \mathbb{E}\left[Var[Y \mid T]\right] = \mathbb{E}\left[\mathbb{E}[Y^2 \mid T]\right] - \mathbb{E}\left[\mathbb{E}[Y \mid T]^2\right]. $$ It should be clear how to compute the second term by using the distribution of $T$ and $\mathbb{E}[Y \mid T] = T\theta$. For the first term, notice that \begin{align*} \mathbb{E}[Y^2 \mid T=t] &= \sum_{i,j=1}^t \mathbb{E}[X_i X_j] \\&= \sum_{1\le i\neq j\le t} \mathbb{E}[X_i]\,\mathbb{E}[X_j] + \sum_{i=1}^t \mathbb{E}[X_i^2] \\&= \sum_{1\le i\neq j\le t} \mathbb{E}[X_i]^2 + \sum_{i=1}^t \mathbb{E}[X_i^2] \\&= t(t-1)\theta^2 + t(\theta^2+\theta), \end{align*} where we used that the $X_i$'s are i.i.d. in the second and third identities. Therefore, we have $\mathbb{E}[Y^2 \mid T] = T(T-1)\theta^2 + T(\theta^2+\theta)$. Plugging everything back in (using $\mathbb{E}[T(T-1)] = \lambda^2$ and $\mathbb{E}[T^2] = \lambda + \lambda^2$) gives $\mathbb{E}\left[Var[Y \mid T]\right] = \lambda\theta$, and hence $Var[Y] = \theta^2\lambda + \lambda\theta = \lambda\theta(1+\theta)$. Equating the sample mean and variance to $\lambda\theta$ and $\lambda\theta(1+\theta)$ should give you the result. Let me know if you need further help.
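As a quick numerical sanity check (my own addition, not part of the derivation above, with arbitrary illustrative values $\lambda = 3$, $\theta = 2$), we can simulate $Y$ and compare the empirical mean and variance against $\mathbb{E}[Y] = \lambda\theta$ and $Var[Y] = \lambda\theta(1+\theta)$, then solve those two moment equations for the method of moments estimates:

```python
import numpy as np

# Simulate Y = sum_{i=1}^T X_i with T ~ Poisson(lam) and X_i ~ Poisson(theta).
# Since Y | T ~ Poisson(T * theta), we can sample Y directly from that
# conditional distribution instead of summing the X_i explicitly.
rng = np.random.default_rng(0)
lam, theta = 3.0, 2.0   # illustrative parameter values (assumption)
n = 200_000

T = rng.poisson(lam, size=n)
Y = rng.poisson(T * theta)          # vectorized: rate 0 when T = 0 gives Y = 0

m = Y.mean()                        # should be close to lam * theta = 6
v = Y.var()                         # should be close to lam * theta * (1 + theta) = 18

# Method of moments: solve m = lam*theta, v = lam*theta*(1 + theta).
theta_hat = v / m - 1               # should recover theta = 2
lam_hat = m**2 / (v - m)            # should recover lam = 3

print(m, v, theta_hat, lam_hat)
```

Dividing the two moment equations gives $Var[Y]/\mathbb{E}[Y] = 1 + \theta$, which is why $\hat\theta = S^2/\bar{Y} - 1$ and then $\hat\lambda = \bar{Y}/\hat\theta$.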