Given an $ARMA(p,q)$ model: $X_t = \phi_1 X_{t-1} + ... + \phi_pX_{t-p} + Z_t + \theta_1Z_{t-1}+...+\theta_qZ_{t-q}$ where $Z_t$ is white noise.
Derive the following:
$Var(X_t) = \gamma(0) = \frac{1+\theta_1^2+2\phi_1\theta_1}{1-\phi_1^2}\sigma^2$.
Unsure where to begin!
What you are asked to derive is the variance for a specific order of $\operatorname{ARMA}$ process, namely $\operatorname{ARMA}(1,1)$ (only $\phi_1$ and $\theta_1$ appear in the formula), and not for the general $(p,q)$ setting!
For the given formula the relevant case is $p = q = 1$, but the setup works in general, and we need a few assumptions to prove it. First, with the backshift polynomials we have the representation \begin{equation} \phi(B)X_t = \theta(B) Z_t, \quad t \in \mathbb Z, \end{equation} where we assume that $\phi(z)$ and $\theta(z)$ have no common roots for $z\in \mathbb C$. Further, we need to assume your $\operatorname{ARMA}(p,q)$ process is causal, which means it has the linear representation \begin{equation} X_t = \sum_{j = 0}^\infty \psi_j Z_{t-j}, \qquad \sum_{j = 0}^\infty \lvert \psi_j\rvert < \infty, \end{equation} where the coefficients are determined by $$ \psi(z) = \sum_{j = 0}^\infty \psi_j z^j = \frac{\theta(z)}{\phi(z)}, \quad \lvert z\rvert \leq 1. $$ It can then be shown that $$ \operatorname{Var}(X_t) = \gamma(0) = \sigma^2 \sum_{j = 0}^\infty\psi_j^2, $$ which you hopefully have the proof of, because that is not a trivial result!
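To make this concrete for the formula in your question, written out for the $\operatorname{ARMA}(1,1)$ case the two polynomials are
$$ \phi(z) = 1 - \phi_1 z, \qquad \theta(z) = 1 + \theta_1 z, $$
and causality amounts to $\lvert \phi_1 \rvert < 1$, i.e. the single root of $\phi(z)$ lies outside the unit circle.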
The idea is then to find the coefficients $\psi_0,\psi_1,\psi_2,\ldots$ by comparing coefficients on both sides of $$ \phi(z)\psi(z) = \theta(z). $$ In your case the $\psi_j$ turn out to form a geometric sequence, so $\sum_{j=0}^\infty \psi_j^2$ is a geometric series that you can sum in closed form.
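If you want to sanity-check the end result numerically, here is a small pure-Python sketch for the $\operatorname{ARMA}(1,1)$ case. The parameter values are arbitrary illustrations, and the recursion for the $\psi_j$ is the one the coefficient comparison yields ($\psi_0 = 1$, $\psi_1 = \phi_1 + \theta_1$, $\psi_j = \phi_1\psi_{j-1}$ for $j \ge 2$):

```python
# Sanity check of gamma(0) = sigma^2 * sum(psi_j^2) against the closed
# form for ARMA(1,1). phi, theta, sigma are illustrative values only;
# |phi| < 1 so the process is causal.
phi, theta, sigma = 0.5, 0.3, 1.0

# psi-weights from matching coefficients in phi(z) * psi(z) = theta(z):
# psi_0 = 1, psi_1 = phi + theta, psi_j = phi * psi_{j-1} for j >= 2.
psi = [1.0, phi + theta]
for _ in range(200):  # truncate the absolutely summable series
    psi.append(phi * psi[-1])

gamma0_series = sigma**2 * sum(p**2 for p in psi)
gamma0_closed = sigma**2 * (1 + theta**2 + 2 * phi * theta) / (1 - phi**2)

print(gamma0_series, gamma0_closed)  # the two should agree
```

Truncating after a few hundred terms is harmless here because $\psi_j^2$ decays like $\phi_1^{2j}$.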