I'm reading *Introductory Time Series with R*, in a section where the correlogram is discussed.
I'm confused by one of the statements:
If $\rho_k = 0$, the sampling distribution of $r_k$ is approximately normal, with a mean of $-1/n$ and a variance of $1/n$.
Earlier in the book the following are defined:
$$\rho_k = \frac{\gamma_k}{\sigma^2}$$
$$\gamma_k = E[(x_t - \mu)(x_{t+k} - \mu)]$$
My confusion is what $\rho_k = 0$ means. Should I read this as:
- all the values of $\rho$ for each '$t$ and $t+k$' pair are zero? Or,
- the sum of all values of $\rho$ over each '$t$ and $t+k$' pair is zero? Or,
- something else?
I think the first one is correct, i.e. $\rho_1 = 0, \rho_2 = 0, \ldots, \rho_n = 0$.
Simple answers preferred over complex ones.
Under the null hypothesis there is no autocorrelation at lag $k$ in the series, i.e., $\rho_k = 0$. The key point is that $\rho_k$ is a single number for each lag $k$, not one value per $t$: the definition assumes a (second-order) stationary series, so $\gamma_k = E[(x_t - \mu)(x_{t+k} - \mu)]$ does not depend on $t$. Your first reading is therefore essentially right, in the sense that the correlation between $x_t$ and $x_{t+k}$ is the same for every $t$, and $\rho_k = 0$ just says that this one common correlation is zero. (Note that your list $\rho_1 = 0, \rho_2 = 0, \ldots$ indexes *lags*, not time points; the correlogram tests each lag's $\rho_k$ separately.)
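You can check the book's claim about the sampling distribution of $r_k$ by simulation. The sketch below (in Python rather than R; the helper `acf_lag` is my own name, and the estimator $r_k = c_k / c_0$ follows the usual textbook definition with $1/n$ normalization) draws many white-noise series, where $\rho_k = 0$ by construction, and computes $r_1$ for each. The empirical mean and variance should come out close to $-1/n$ and $1/n$:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 100       # length of each simulated series
k = 1         # lag at which to estimate the autocorrelation
reps = 20000  # number of independent white-noise series

def acf_lag(x, k):
    """Sample autocorrelation r_k = c_k / c_0, with c_k the sample
    autocovariance at lag k (both normalized by n)."""
    xbar = x.mean()
    c0 = np.sum((x - xbar) ** 2) / len(x)
    ck = np.sum((x[:-k] - xbar) * (x[k:] - xbar)) / len(x)
    return ck / c0

# Under white noise rho_k = 0 for all k >= 1, so r_k's sampling
# distribution should be approximately N(-1/n, 1/n).
r = np.array([acf_lag(rng.standard_normal(n), k) for _ in range(reps)])

print(r.mean())  # close to -1/n = -0.01
print(r.var())   # close to  1/n =  0.01
```

The small negative mean $-1/n$ arises because the sample mean $\bar{x}$ is estimated from the same data, which induces a slight negative bias in $r_k$ even when the true $\rho_k$ is exactly zero.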