Relation between autocorrelation and mean of a stochastic process


It is said that if the autocorrelation approaches zero as $\tau$ tends to infinity, then the mean of the stochastic process is also zero.

I am having trouble understanding the above concept. Say we have a discrete process as defined below:

$y_t = a_0 + a_1y_{t-1} + \epsilon_t$ for $|a_1| < 1$.

In this case, assuming stationarity (and $a_0 \neq 0$), the mean $a_0/(1-a_1)$ is nonzero, whereas the autocorrelation $a_1^{\tau}$ tends to zero as $\tau \to \infty$.
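This example can be checked numerically. Below is a minimal sketch (the parameter values and helper name are my own choices, not from the post): it simulates the AR(1) recursion, then compares the sample mean against $a_0/(1-a_1)$ and the sample autocorrelation at lag $\tau$ against $a_1^{\tau}$.

```python
import numpy as np

# Illustrative sketch: simulate y_t = a0 + a1*y_{t-1} + eps_t and check
# that the sample mean is nonzero while the sample autocorrelation at
# lag tau decays like a1**tau.
rng = np.random.default_rng(0)
a0, a1, n = 2.0, 0.5, 200_000

y = np.empty(n)
y[0] = a0 / (1 - a1)                 # start at the stationary mean
for t in range(1, n):
    y[t] = a0 + a1 * y[t - 1] + rng.standard_normal()

def sample_autocorr(x, tau):
    """Sample correlation coefficient between x_t and x_{t+tau}."""
    return np.corrcoef(x[:-tau], x[tau:])[0, 1]

print(y.mean())                      # close to a0/(1-a1) = 4.0
for tau in (1, 5, 10):
    print(tau, sample_autocorr(y, tau))  # roughly a1**tau, shrinking with tau
```

So the mean stays near $4$ while the lag-$\tau$ autocorrelation shrinks geometrically, exactly the tension the question describes.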

So how can the statement above be true? I am really confused here; kindly help.

Consider a stochastic process $X_t,~t\in T$, with mean $\mu(t)$ and autocorrelation $R(s,t)$. Let $\tilde{\mu}:T\rightarrow \mathbb{R}$ be an arbitrary function, and define a new stochastic process $Y$ by \begin{equation} Y_t = X_t - \mu(t) + \tilde{\mu}(t), \quad \forall t\in T. \end{equation} By linearity of expectation, the mean of $Y$ is $\tilde{\mu}(t)$. However, variances and covariances are unchanged by adding constants, so the correlation between $(X_s,X_t)$ equals the correlation between $(Y_s,Y_t)$ for every pair $(s,t)$. Thus the autocorrelation of $Y$ equals the autocorrelation of $X$.
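The construction above can be demonstrated numerically. The sketch below (all names and parameter values are illustrative, not from the answer) shifts a zero-mean process by an arbitrary deterministic function $\tilde{\mu}(t)$ and checks that the mean changes while the matrix of correlation coefficients does not.

```python
import numpy as np

# Sketch of the answer's construction: adding a deterministic function
# mu_tilde(t) changes the mean of the process but leaves every
# correlation coefficient, hence the autocorrelation, unchanged.
rng = np.random.default_rng(1)
n_paths, n_times = 100_000, 5

# X: a zero-mean process (here, cumulative sums of i.i.d. innovations),
# one row per sample path, one column per time index.
X = rng.standard_normal((n_paths, n_times)).cumsum(axis=1)

mu_tilde = np.array([3.0, -1.0, 0.0, 7.0, 2.0])  # arbitrary new mean function
Y = X + mu_tilde          # X already has mean 0, so Y_t has mean mu_tilde(t)

print(Y.mean(axis=0))     # approximately mu_tilde

corr_X = np.corrcoef(X, rowvar=False)   # correlations across time indices
corr_Y = np.corrcoef(Y, rowvar=False)
print(np.max(np.abs(corr_X - corr_Y)))  # zero up to floating-point noise
```

Sample correlation subtracts the sample mean of each column, so the two correlation matrices agree exactly up to rounding: the autocorrelation carries no information about the mean.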

Hence the mean being $0$ cannot be inferred from properties of the autocorrelation alone. (The only exception is the degenerate case where the assumed property of the autocorrelation is impossible to satisfy, in which case the assumption vacuously implies anything.)