I am having trouble understanding how to identify time series processes from their autocovariance functions (ACVF).
For example, I have an ACVF $$\gamma(h) = \begin{cases} 4-|h|, & |h|\leq4,\\ 0, &|h|>4.\end{cases} $$ Apparently this corresponds to a moving-average process of order 3, i.e. MA(3). Any pointers on how to solve this type of problem?
I am not too sure about autocovariance functions of processes that might not be zero-mean processes, but if a wide-sense stationary (a.k.a. weakly stationary) stochastic process $\{X_t\}$ has autocorrelation function $E[X_tX_{t+h}]$ given by $$E[X_tX_{t+h}] = \gamma(h) = \begin{cases} 4-|h|, & |h|\leq 4,\\ 0, &|h|>4,\end{cases} $$ then we can immediately say that the process has zero mean: writing $E[X_tX_{t+h}] = \operatorname{Cov}(X_t, X_{t+h}) + \mu^2$ with $\mu = E[X_t]$, and noting that the covariance of such a finite-memory process vanishes at large lags, we get $\mu^2 = \lim_{|h|\to\infty} E[X_tX_{t+h}] = 0$, so the autocorrelation function coincides with the autocovariance function. Furthermore, $\gamma(h) = g\star \tilde{g}\big\vert_h$, where $\star$ denotes convolution and $\tilde{g}$ is the time-reverse of $g$, that is, $\tilde{g}(t) = g(-t)$, with $g(t) = \begin{cases}1, & 0 \leq t \leq 4,\\ 0, & \text{otherwise},\end{cases}$ for continuous-time processes, and $g(t) = \begin{cases}1, & t =0,1,2,3,\\ 0, & \text{otherwise},\end{cases}$ for discrete-time processes (a.k.a. time series). Hence we can write
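In the discrete-time case, the convolution identity is easy to check numerically. Here is a minimal sketch in NumPy (the variable names are my own choices):

```python
import numpy as np

# Discrete-time filter g: unit taps at t = 0, 1, 2, 3
g = np.array([1.0, 1.0, 1.0, 1.0])

# gamma(h) = (g * g~)(h): convolve g with its time reverse.
# np.convolve returns the full convolution, i.e. the values at
# lags h = -3, ..., 3; gamma(h) = 0 for |h| >= 4 falls outside this support.
gamma = np.convolve(g, g[::-1])
print(gamma)  # [1. 2. 3. 4. 3. 2. 1.], i.e. gamma(h) = 4 - |h|
```

The peak value $4$ at lag $0$ and the linear taper to $0$ at lag $\pm 4$ match the given ACVF exactly.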
$$X_t = \int_{t-4}^t W_{\tau} \,\mathrm d\tau ~~ \text{or} ~~ X_t = \sum_{\tau=0}^3 W_{t-\tau}$$
according as the process is a continuous-time process or a discrete-time process, where $\{W_t\}$ is a white noise process with autocorrelation function $\delta(h)$ (the Dirac delta in continuous time, the Kronecker delta in discrete time). Thus, for a discrete-time process, the $W_t$'s are a sequence of uncorrelated zero-mean unit-variance random variables, and $X_t$ is the sum (not the average) of the current and the past three values of the noise driving the system. If we truly wanted the average of these values, as in a genuine moving average, we would scale $X_t$ by a factor of $\frac 14$.
Don't believe a word of all this?
With $X_t = W_t + W_{t-1}+ W_{t-2} + W_{t-3}$, and noting that all cross terms $E[W_iW_j]$ with $i \neq j$ vanish because the noise is white, we have that \begin{align} \gamma(0) &= E[X_tX_t]\\ &= E[(W_t + W_{t-1}+ W_{t-2} + W_{t-3})(W_t + W_{t-1}+ W_{t-2} + W_{t-3})]\\ &= E[W_t^2 + W_{t-1}^2+ W_{t-2}^2 + W_{t-3}^2]\\ &= 4,\\ \gamma(1) &= E[X_tX_{t+1}]\\ &= E[(W_t + W_{t-1}+ W_{t-2} + W_{t-3})(W_{t+1} + W_{t}+ W_{t-1} + W_{t-2})]\\ &= E[W_t^2 + W_{t-1}^2+ W_{t-2}^2]\\ &= 3,\\ \gamma(2) &= E[X_tX_{t+2}]\\ &= E[(W_t + W_{t-1}+ W_{t-2} + W_{t-3})(W_{t+2} + W_{t+1}+ W_{t} + W_{t-1})]\\ &= E[W_t^2 + W_{t-1}^2]\\ &= 2,\\ \gamma(3) &= E[X_tX_{t+3}]\\ &= E[(W_t + W_{t-1}+ W_{t-2} + W_{t-3})(W_{t+3} + W_{t+2}+ W_{t+1} + W_{t})]\\ &= E[W_t^2 ]\\ &= 1,\\ \gamma(4) &= E[X_tX_{t+4}]\\ &= E[(W_t + W_{t-1}+ W_{t-2} + W_{t-3})(W_{t+4} + W_{t+3}+ W_{t+2} + W_{t+1})]\\ &= 0, \end{align} exactly as it should be.
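The algebra above can also be confirmed by simulation. A quick sketch with NumPy (the sample size and seed are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
w = rng.standard_normal(n)  # zero-mean, unit-variance white noise

# X_t = W_t + W_{t-1} + W_{t-2} + W_{t-3}
x = w[3:] + w[2:-1] + w[1:-2] + w[:-3]

# Sample autocovariances at lags 0..4 (the process is zero-mean,
# so we simply average the lagged products).
for h in range(5):
    gamma_hat = np.mean(x[: len(x) - h] * x[h:])
    print(h, round(gamma_hat, 2))  # approximately 4, 3, 2, 1, 0
```

With a large sample, the estimates land close to $\gamma(0),\dots,\gamma(4) = 4, 3, 2, 1, 0$, matching the derivation.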