Gaussian vectors and covariance matrix.


The following is part of a question I was given in a stochastic processes course. It goes like this: I am given a sequence of i.i.d. Gaussian random variables $\{V_i\}_{i=1}^N$, a variable $X_0 \sim N(0,\sigma^2)$ that is statistically independent of $\{V_i\}_{i=1}^N$, a deterministic parameter $\beta\neq 0$, and the recursion $X_{i+1}=\beta X_i+V_i$, which defines a sequence of random variables. In the last part we are told that $\{X_i\}_{i=0}^N$ are identically distributed with $\mu=0$ and variance $\sigma^2$, and we need to determine the $(i,j)$ component of the second-moment matrix (defined by $E[\underline{X}\cdot \underline{X}^T]$). My guess: since $\mu=0$, this is actually the covariance matrix of the vector $\underline{X}$, and since all the components are identically distributed, the entry at index $(i,j)$ of the matrix is $\text{Cov}[X_i,X_j]=\text{Var}[X_i]=\sigma^2$. Am I right, or am I totally missing something? Thanks for the help, Mark.



Best Answer

The results follow from the identities below, which use the structure of the process $(V_i)$ (stationarity requires $|\beta|<1$, so that the series converges). Note that $$X_i=\sum_{n\geqslant1}\beta^{n-1}V_{i-n},$$ hence $$ \mathrm{var}(X)=\sum_{n\geqslant1}\beta^{2(n-1)}\mathrm{var}(V)=\frac{\mathrm{var}(V)}{1-\beta^2},$$ and that, for every $j\geqslant0$, $$X_i=\beta^jX_{i-j}+\sum_{n=1}^j\beta^{n-1}V_{i-n},$$ hence $$ \mathrm{cov}(X_i,X_{i-j})=\beta^j\mathrm{var}(X). $$ Since $\mathrm{var}(X)=\sigma^2$ here, the $(i,j)$ entry of $E[\underline{X}\,\underline{X}^T]$ is $\beta^{|i-j|}\sigma^2$, which equals $\sigma^2$ only on the diagonal.
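As a numerical sanity check (not part of the original answer), the sketch below simulates the recursion with NumPy and compares the empirical covariance matrix with $\beta^{|i-j|}\sigma^2$. It assumes $|\beta|<1$ and $\mathrm{var}(V)=(1-\beta^2)\sigma^2$, the choice that makes the $X_i$ identically distributed with variance $\sigma^2$; the particular parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
beta, sigma2 = 0.7, 1.0          # arbitrary illustration values, |beta| < 1
N, trials = 5, 200_000

# Stationarity requires var(V) = (1 - beta^2) * sigma^2.
var_V = (1 - beta**2) * sigma2

X = np.empty((trials, N + 1))
X[:, 0] = rng.normal(0.0, np.sqrt(sigma2), trials)   # X_0 ~ N(0, sigma^2)
for i in range(N):
    V = rng.normal(0.0, np.sqrt(var_V), trials)      # V_i iid Gaussian
    X[:, i + 1] = beta * X[:, i] + V                 # X_{i+1} = beta X_i + V_i

emp = np.cov(X, rowvar=False)                        # empirical covariance matrix
i, j = np.indices((N + 1, N + 1))
theory = sigma2 * beta ** np.abs(i - j)              # beta^{|i-j|} * sigma^2

print(np.max(np.abs(emp - theory)))                  # small for large `trials`
```

The printed discrepancy shrinks as `trials` grows; in particular the off-diagonal entries match $\beta^{|i-j|}\sigma^2$ rather than $\sigma^2$.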