Expectation and variance of GARCH($p$, $q$)


I was going through the properties of GARCH($p$, $q$) and found the following problem in Brockwell (2016):

Suppose that $\left\{Z_t\right\}$ is a causal stationary GARCH$(p, q)$ process $Z_t=\sqrt{h_t} e_t$, where $\left\{e_t\right\} \sim \operatorname{IID}(0,1)$, $\sum_{i=1}^p \alpha_i+\sum_{j=1}^q \beta_j<1$, and $$ h_t=\alpha_0+\alpha_1 Z_{t-1}^2+\cdots+\alpha_p Z_{t-p}^2+\beta_1 h_{t-1}+\cdots+\beta_q h_{t-q} . $$ a. Show that $E\left(Z_t^2 \mid Z_{t-1}^2, Z_{t-2}^2, \ldots\right)=h_t$.

b. Show that the squared process $\left\{Z_t^2\right\}$ is an $\operatorname{ARMA}(m, q)$ process satisfying the equations $$ \begin{aligned} Z_t^2= & \alpha_0+\left(\alpha_1+\beta_1\right) Z_{t-1}^2+\cdots+\left(\alpha_m+\beta_m\right) Z_{t-m}^2 \\ & +U_t-\beta_1 U_{t-1}-\cdots-\beta_q U_{t-q}, \end{aligned} $$ where $m=\max \{p, q\}, \alpha_j=0$ for $j>p, \beta_j=0$ for $j>q$, and $U_t=Z_t^2-h_t$ is white noise if $E Z_t^4<\infty$.

c. For $p \geq 1$, show that the conditional variance process $\left\{h_t\right\}$ is an $\operatorname{ARMA}(m, p-1)$ process satisfying the equations $$ \begin{gathered} h_t=\alpha_0+\left(\alpha_1+\beta_1\right) h_{t-1}+\cdots+\left(\alpha_m+\beta_m\right) h_{t-m} \\ +V_t+\alpha_1^* V_{t-1}+\cdots+\alpha_{p-1}^* V_{t-(p-1)}, \end{gathered} $$ where $V_t=\alpha_1 U_{t-1}$ and $\alpha_j^*=\alpha_{j+1} / \alpha_1$ for $j=1, \ldots, p-1$.
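For intuition, the defining recursion for $h_t$ can be simulated directly. The following NumPy sketch is my own illustration (the function name `simulate_garch` and all parameter choices are not from the text): it generates a GARCH($p$, $q$) path under the stationarity condition $\sum_{i=1}^p \alpha_i+\sum_{j=1}^q \beta_j<1$, using Gaussian innovations for $e_t$.

```python
import numpy as np

def simulate_garch(alpha0, alpha, beta, n, burn=500, rng=None):
    """Simulate n observations of a GARCH(p, q) path from the h_t recursion.

    alpha -- (alpha_1, ..., alpha_p), coefficients on lagged Z^2
    beta  -- (beta_1, ..., beta_q), coefficients on lagged h
    Stationarity requires sum(alpha) + sum(beta) < 1.
    """
    rng = np.random.default_rng(rng)
    p, q = len(alpha), len(beta)
    m = max(p, q)
    total = n + burn + m
    # Initialize both h and Z at the unconditional variance level.
    var = alpha0 / (1.0 - sum(alpha) - sum(beta))
    h = np.full(total, var)
    z = np.full(total, np.sqrt(var))
    e = rng.standard_normal(total)  # {e_t} ~ IID(0, 1)
    for t in range(m, total):
        h[t] = (alpha0
                + sum(a * z[t - i - 1] ** 2 for i, a in enumerate(alpha))
                + sum(b * h[t - j - 1] for j, b in enumerate(beta)))
        z[t] = np.sqrt(h[t]) * e[t]
    # Discard the burn-in so the start values are forgotten.
    return z[burn + m:], h[burn + m:]
```

With, say, `simulate_garch(0.1, [0.2], [0.5], 100_000)`, the sample mean of $Z_t^2$ should be close to the unconditional variance $\alpha_0 /\left(1 - \sum \alpha_i - \sum \beta_j\right)$.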

Let's reconsider the first part of the question.

Given that $Z_t = \sqrt{h_t} e_t$, we can write $Z_t^2 = h_t e_t^2$.

To show that $E\left(Z_t^2 \mid Z_{t-1}^2, Z_{t-2}^2, \ldots\right)=h_t$, we need to evaluate the expectation of $Z_t^2$ conditioning on the past values of the process $Z_t^2$.

The conditional expectation $E\left(Z_t^2 \mid Z_{t-1}^2, Z_{t-2}^2, \ldots\right) = E\left(h_t e_t^2 \mid Z_{t-1}^2, Z_{t-2}^2, \ldots\right)$.

Since the process is causal, $e_t$ (and hence $e_t^2$) is independent of $Z_{t-1}^2, Z_{t-2}^2, \ldots$, and since $\left\{e_t\right\} \sim \operatorname{IID}(0,1)$ we have $E\left(e_t^2\right) = \operatorname{Var}\left(e_t\right) = 1$.

Moreover, $h_t$ is a measurable function of $Z_{t-1}^2, Z_{t-2}^2, \ldots$ (iterate the defining recursion to express $h_t$ in terms of past squared values), so its conditional expectation given $Z_{t-1}^2, Z_{t-2}^2, \ldots$ is $h_t$ itself.

So, we obtain $E\left(Z_t^2 \mid Z_{t-1}^2, Z_{t-2}^2, \ldots\right) = h_t E\left(e_t^2\right) = h_t \cdot 1 = h_t$.
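As a quick numerical sanity check of this identity (my own sketch, not part of the exercise), one can simulate a GARCH(1, 1) path and verify two consequences of $E\left(Z_t^2 \mid Z_{t-1}^2, Z_{t-2}^2, \ldots\right)=h_t$: the ratio $Z_t^2/h_t = e_t^2$ should average to $E\left(e_t^2\right)=1$, and an ordinary least-squares fit of $Z_t^2$ against $h_t$ should have slope close to 1. The parameter values below are arbitrary choices satisfying $\alpha_1+\beta_1<1$.

```python
import numpy as np

rng = np.random.default_rng(1)
n, a0, a1, b1 = 100_000, 0.05, 0.1, 0.8   # arbitrary GARCH(1,1) parameters

h = np.empty(n)
z = np.empty(n)
h[0] = a0 / (1 - a1 - b1)                 # start at the unconditional variance
z[0] = np.sqrt(h[0]) * rng.standard_normal()
for t in range(1, n):
    h[t] = a0 + a1 * z[t - 1] ** 2 + b1 * h[t - 1]
    z[t] = np.sqrt(h[t]) * rng.standard_normal()

ratio = (z ** 2 / h).mean()          # should be close to E(e_t^2) = 1
slope = np.polyfit(h, z ** 2, 1)[0]  # OLS slope of Z_t^2 on h_t, close to 1
```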

Does this complete the proof of part (a)? For the remaining parts, I am still unclear about which properties I should use.

Best answer:

Your approach to part (a) seems correct to me.

I think I found a proof for part (b):

Letting $U_t = Z_t^2 - h_t$, we first observe that $$ U_t = h_t e_t^2 - h_t = h_t (e_t^2 - 1) \implies e_t^2 = 1+ \frac{U_t}{h_t} .$$ We thus have $$ Z_t^2 = h_t e_t^2 = h_t \left(1+ \frac{U_t}{h_t} \right) = h_t + U_t .$$ Now, substituting $h_{t-i} = Z_{t-i}^2 - U_{t-i}$ for $i = 1, \dotsc, q$ in the expression for $h_t$, we obtain $$ Z_t^2 = \alpha_0 + \sum_{i = 1}^p \alpha_i Z_{t-i}^2 + \sum_{i = 1}^q \beta_i (Z_{t-i}^2 - U_{t-i}) + U_t \\ =\alpha_0 + \sum_{i = 1}^{\max\{p,q\}} (\alpha_i + \beta_i) Z_{t-i}^2 - \sum_{i = 1}^q \beta_i U_{t-i} + U_t,$$ where in the last expression we set $\alpha_j = 0$ for $j > p$ and $\beta_j = 0$ for $j > q$. Since $U_t = h_t(e_t^2 - 1)$ has mean zero, you can verify that the $U_t$ are uncorrelated (but not independent), and that $U_t$ is independent of $Z_s^2$ for every $s < t$: causality makes $e_t$ independent of the past, while $h_t$ is determined by it. Provided $E Z_t^4 < \infty$, $\{U_t\}$ is therefore white noise, and $Z_t^2$ is an $\text{ARMA}(m,q)$ process with $m = \max \{p,q\}$.
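The algebra above can also be checked numerically. In the GARCH(1, 1) case the claimed representation reduces to $Z_t^2 = \alpha_0 + (\alpha_1+\beta_1) Z_{t-1}^2 + U_t - \beta_1 U_{t-1}$, which should hold exactly (up to floating-point rounding) along any simulated path, while the sample autocorrelation of $U_t$ should be near zero. This is my own sketch with arbitrary parameter values:

```python
import numpy as np

rng = np.random.default_rng(2)
n, a0, a1, b1 = 50_000, 0.05, 0.1, 0.8    # arbitrary GARCH(1,1) parameters

h = np.empty(n)
z = np.empty(n)
h[0] = a0 / (1 - a1 - b1)                 # start at the unconditional variance
z[0] = np.sqrt(h[0]) * rng.standard_normal()
for t in range(1, n):
    h[t] = a0 + a1 * z[t - 1] ** 2 + b1 * h[t - 1]
    z[t] = np.sqrt(h[t]) * rng.standard_normal()

u = z ** 2 - h                             # U_t = Z_t^2 - h_t
# ARMA(1,1) identity: Z_t^2 = a0 + (a1 + b1) Z_{t-1}^2 + U_t - b1 U_{t-1}
lhs = z[1:] ** 2
rhs = a0 + (a1 + b1) * z[:-1] ** 2 + u[1:] - b1 * u[:-1]
identity_err = np.max(np.abs(lhs - rhs))   # zero up to rounding error
acf1 = np.corrcoef(u[1:], u[:-1])[0, 1]    # lag-1 autocorrelation of U_t, ~ 0
```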

I have to think a little bit harder for part (c). I might come back to it tomorrow if I have some time.