Vector Autoregression: Assertion on the radius of convergence of a power series with square matrices as coefficients


I first want to give some context for the setup of my question (though I suspect one can answer it without knowing anything about time series analysis).

In a proof that derives a solution of a vector-autoregressive process (Theorem 2.1 of Johansen's book *Likelihood-Based Inference in Cointegrated Vector Autoregressive Models*), I have come upon the following line of reasoning. We have a $p$-dimensional stochastic process $X_t$ following the autoregressive relation:

$$X_t = \sum_{i=1}^k \Pi_i X_{t-i} + \epsilon_t,$$

where the $\Pi_i$ denote square ($p \times p$) matrices. The author then defines the matrix polynomial $$A(z):= I- \sum_{i=1}^k \Pi_i z^i.$$

The author goes on to define matrices $C_n$ via the recurrence $$C_n:= \sum_{j=1}^{\min(k,n)} C_{n-j} \Pi_j \qquad (n \ge 1),$$ where the excerpt leaves the initial value $C_0$ implicit.
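For concreteness, here is a small numerical sketch of the recurrence. All matrices below are made-up examples, and I take $C_0 = I$, an initialization the excerpt leaves implicit; with this normalization the product $C(z)A(z)$ comes out as $+I$ rather than $-I$, so the sign depends on the convention for $C_0$.

```python
import numpy as np

# Made-up example coefficients (p = 2, k = 2); any choice works for the check.
Pi = [np.array([[0.5, 0.1],
                [0.0, 0.3]]),
      np.array([[0.2, 0.0],
                [0.1, 0.1]])]
k, p = len(Pi), Pi[0].shape[0]
I = np.eye(p)

# Recurrence C_n = sum_{j=1}^{min(k,n)} C_{n-j} Pi_j for n >= 1,
# with the assumed initialization C_0 = I.
N = 100
C = [I]
for n in range(1, N + 1):
    C.append(sum(C[n - j] @ Pi[j - 1] for j in range(1, min(k, n) + 1)))

# Coefficient-wise, this recurrence makes C(z) A(z) = I (up to the sign
# convention for C_0); check the truncated series numerically at a small |z|.
z = 0.2
A_z = I - sum(Pi[i - 1] * (z ** i) for i in range(1, k + 1))
C_z = sum(C[n] * (z ** n) for n in range(N + 1))
print(np.allclose(C_z @ A_z, I, atol=1e-8))
```

The matrix products in the recurrence multiply $\Pi_j$ on the right, matching the order in the quoted formula (the matrices do not commute in general).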

Now the key claim - which I struggle to understand - is an assertion on the radius of convergence of the power series $C(z):=\sum_i C_i z^i$.

He derives the following relation (the details of how to arrive at it do not matter): $$-I = \sum_{n=0}^{\infty}\sum_{j=0}^{\min(k,n)}z^{n-j}C_{n-j}z^j\Pi_j=C(z)A(z).$$ He then claims: if we put $\delta:=\min_i|z_i|$, where the $z_i$ denote the (finitely many) roots of the polynomial $\det(A(z))$, and if we note that $A(0)=I$, then we have shown that the series $C(z)$ converges for $|z|<\delta$. I don't see why, and here is my trouble with it:

I think the above equation only shows that the Cauchy product of $C(z)$ and $A(z)$ converges. However, convergence of a Cauchy product does not, in general, imply convergence of the individual factor $C(z)$. Hence, I would appreciate any advice on how to arrive at the stated conclusion that $C(z)$ converges for $|z|<\delta$.
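To make the claim itself at least plausible, here is a numerical sanity check (it illustrates the claim, it does not prove it), using made-up matrices $\Pi_1, \Pi_2$ and the assumed initialization $C_0 = I$. It uses the standard fact that the roots of $\det(A(z))$ are the reciprocals of the nonzero eigenvalues of the VAR companion matrix $F$, so $\delta = 1/\rho(F)$; if the claim is right, $\|C_n\|^{1/n}$ should approach $1/\delta$.

```python
import numpy as np

# Made-up example coefficients (p = 2, k = 2).
Pi1 = np.array([[0.5, 0.1],
                [0.0, 0.3]])
Pi2 = np.array([[0.2, 0.0],
                [0.1, 0.1]])
p = 2

# Roots of det(A(z)) are the reciprocals of the nonzero eigenvalues of the
# companion matrix F, so delta = min_i |z_i| = 1 / rho(F).
F = np.block([[Pi1, Pi2],
              [np.eye(p), np.zeros((p, p))]])
rho = max(abs(np.linalg.eigvals(F)))
delta = 1.0 / rho

# Growth rate of the C_n from the recurrence (assumed initialization C_0 = I).
N = 300
C = [np.eye(p), Pi1]                        # C_0 = I, C_1 = C_0 @ Pi1
for n in range(2, N + 1):
    C.append(C[n - 1] @ Pi1 + C[n - 2] @ Pi2)
growth = np.linalg.norm(C[N]) ** (1.0 / N)  # estimates limsup ||C_n||^{1/n}

# If the claim is right, growth ~ rho, i.e. radius of convergence ~ delta.
print(delta, 1.0 / growth)
```

For this particular choice the companion matrix has spectral radius below one, so $\delta > 1$ and the two printed numbers come out close.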