Random coefficient GARCH(1,1): Existence of solution


I am studying the random coefficient GARCH(1,1) model, following the paper "Random coefficient GARCH(1,1) model with i.i.d. coefficients" by A. Klivecka.

We have the GARCH(1,1) model given by

\begin{equation} r_t=\epsilon_t \sigma_t, \;\;\;\; \sigma_t^2=a_t r_{t-1}^2 + b_t \sigma_{t-1}^2 + c_t \;\; \end{equation} with i.i.d. coefficients $(a_t,b_t,c_t)$ and $\epsilon_t$, $t \in \mathbb{Z}$.

We use the notation \begin{equation*} \xi_t := \epsilon_t^2, \quad x_t := r_t^2, \quad V_t := \sigma_t^2, \quad A_t := a_t \xi_{t-1} + b_t \end{equation*} and rewrite the model above as \begin{equation} x_t = \xi_t V_t, \quad V_t = A_t V_{t-1} + c_t. \end{equation}

By a solution we mean any sequence $\{x_t, V_t;\ t \in \mathbb{Z}\}$ of random variables such that these equations hold a.s. for all $t \in \mathbb{Z}$. Iterating the recursion for $V_t$ yields the formal solution \begin{equation} x_t = \xi_t c_t + \xi_t A_t c_{t-1} + \xi_t A_t A_{t-1} c_{t-2} + \dots = \sum_{i=0}^{\infty} \xi_t c_{t-i} \prod_{j=1}^{i} A_{t-j+1}. \end{equation}

I got stuck at Proposition 1, which states that this iterated solution is the unique solution:
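Not part of the paper, but as a sanity check I simulated the truncated formal series with hypothetical coefficient distributions (uniform $a_t, b_t, c_t$ and standard normal $\epsilon_t$, chosen so that $\mathbb{E}[\log A_0] < 0$ should plausibly hold):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical i.i.d. coefficient distributions (not from the paper),
# chosen so that E[log A_0] < 0 should hold:
n_terms = 200
a = rng.uniform(0.05, 0.15, n_terms)       # a_{t-j+1}, j = 1..n_terms
b = rng.uniform(0.10, 0.30, n_terms)       # b_{t-j+1}
c = rng.uniform(0.50, 1.50, n_terms + 1)   # c_t, c_{t-1}, ..., c_{t-n_terms}
xi_lag = rng.standard_normal(n_terms) ** 2 # xi_{t-j} entering A_{t-j+1}
xi_t = rng.standard_normal() ** 2          # xi_t in front of the series

A = a * xi_lag + b                         # A_{t-j+1} = a xi_{t-j} + b

# prods[i] = prod_{j=1}^{i} A_{t-j+1}, with the empty product equal to 1
prods = np.concatenate(([1.0], np.cumprod(A)))
partial = xi_t * np.cumsum(c * prods)      # partial sums of the series for x_t
```

With these toy distributions the running products decay roughly geometrically, so the partial sums stabilize after a few dozen terms, which matches the claimed absolute convergence.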

Proposition 1

Let one of the following conditions be satisfied: \begin{equation} -\infty \leq \mathbb{E}[\log A_0] < 0 \quad \text{and} \quad \mathbb{E}[\max(\log c_0, 0)] < \infty \end{equation} or \begin{equation} \mathbb{P}(A_0 = 0) > 0. \end{equation} Then the series above, $x_t = \sum_{i=0}^{\infty} \xi_t c_{t-i} \prod_{j=1}^{i} A_{t-j+1}$, converges absolutely a.s. and represents the unique (bounded in probability, strictly stationary and nonanticipative) solution.

The uniqueness part and the properties in brackets are clear to me, so I won't give the definitions of the terms used there. My problem is the part of the proof where it is shown that the series converges a.s.

The proof is as follows: by the first condition and the independence of $\{A_t\}$ and $\{c_t\}$, the strong law of large numbers gives \begin{equation*} \limsup\limits_{k\rightarrow \infty} \frac{1}{k}\left( \log c_{t-k-1} + \sum_{i=1}^{k} \log A_{t-i} \right) < 0 \quad \text{a.s.} \end{equation*} Hence \begin{equation*} \limsup\limits_{k\rightarrow \infty} \log (A_{t-1} \cdots A_{t-k} c_{t-k-1})^\frac{1}{k} < 0, \ \text{i.e.} \ \limsup\limits_{k\rightarrow \infty} (A_{t-1} \cdots A_{t-k} c_{t-k-1})^\frac{1}{k} < 1 \quad \text{a.s.}, \end{equation*} and therefore the series converges absolutely a.s. by Cauchy's root criterion.
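Here is a quick numerical illustration of the SLLN step I tried, again with made-up toy distributions (not the paper's): the Cesàro average of $\log A_{t-i}$ approaches $\mathbb{E}[\log A_0] < 0$, and exponentiating it gives exactly the $k$-th root that the root test looks at.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy distributions (illustrative only): A_t = a_t * xi_{t-1} + b_t
k = 100_000
a = rng.uniform(0.05, 0.15, k)
b = rng.uniform(0.10, 0.30, k)
xi = rng.standard_normal(k) ** 2
A = a * xi + b

# SLLN: (1/k) * sum_{i=1}^k log A_{t-i} -> E[log A_0] < 0 a.s.;
# the single extra term (1/k) * log c_{t-k-1} tends to 0 because
# E[max(log c_0, 0)] < infinity.
avg_log_A = np.log(A).mean()

# The k-th root of the product A_{t-1}...A_{t-k} is exp of this Cesaro
# average, so it tends to exp(E[log A_0]) < 1 -- the root-test bound.
kth_root = np.exp(avg_log_A)
```

So the logarithm appears simply because the $k$-th root of a product is the exponential of the average of the logarithms of its factors.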

I don't get what is happening here. First, I don't know how to apply the strong law of large numbers, since it is only stated as a limit property. I also don't see how the logarithm enters, or how Cauchy's root criterion is applied. Now to the second condition:

The second condition, together with $\{A_t\}$ being i.i.d., implies that $\sum_{i=0}^{\infty} \xi_t c_{t-i} \prod_{j=1}^{i} A_{t-j+1}$ is a.s. finite and, therefore, also converges absolutely a.s.

How do we know that the sum is finite? I know that $\mathbb{P}(A_0=0) > 0$, so $\mathbb{P}(A_0=0) \in (0,1]$ and $\mathbb{P}(A_0 \neq 0) \in [0,1)$. Since the $A_t$ are i.i.d., the same holds with $A_t$ in place of $A_0$. But since no probability $\mathbb{P}$ appears in the product and the sum, I don't know how to use this fact, if it has to be used at all.
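What I did manage to check numerically (with a purely illustrative distribution where $\mathbb{P}(A_t = 0) = 0.3$): once a single factor $A_{t-j+1}$ is zero, every running product from that index on is zero, so only finitely many terms of the series survive.

```python
import numpy as np

rng = np.random.default_rng(2)

# Purely illustrative distribution with P(A_t = 0) = 0.3:
n = 200
A = rng.uniform(0.5, 1.5, n) * (rng.random(n) > 0.3)
c = rng.uniform(0.5, 1.5, n + 1)
xi_t = rng.standard_normal() ** 2

# prods[i] = prod_{j=1}^{i} A_{t-j+1}; once one factor is zero the product
# stays zero for every larger i. Since the A_t are i.i.d. with
# P(A_0 = 0) > 0, a zero factor occurs a.s. (Borel-Cantelli), so the
# series is a.s. a finite sum.
prods = np.concatenate(([1.0], np.cumprod(A)))
first_zero = int(np.argmax(prods == 0.0))  # index of the first vanishing product
x_t = xi_t * np.sum(c * prods)             # the whole series collapses to a finite sum
```

If this Borel–Cantelli reading is the intended argument, that would explain the a.s. finiteness, but I'd appreciate confirmation.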

Some help would be really really nice!