Problem with understanding the expected value $E(y_t)$ in definition of a stationary time series.


I am trying to understand the conditions for a time series $\{y_t\}$ to be (weakly) stationary, i.e.:

  • $E(y_t)=\mu$ is constant for all $t >0,$
  • $Var(y_t) = V$ is constant for all $t >0$ and
  • $cov(y_t, y_s)$ depends only on the lag $|t-s|$, not on $t$ and $s$ themselves.
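To make these conditions concrete for myself, here is a small simulation sketch (my own example, not from the video): an AR(1) process $y_t = \phi y_{t-1} + \varepsilon_t$ with $|\phi| < 1$, started from its stationary distribution, where many independent realizations let me check all three conditions by averaging across realizations at fixed times.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate many realizations of an AR(1) process y_t = phi*y_{t-1} + e_t,
# with e_t ~ N(0, 1), started from the stationary distribution
# N(0, 1/(1 - phi^2)), so all three stationarity conditions hold exactly.
phi, n_series, n_time = 0.7, 20_000, 40
y = np.empty((n_series, n_time))
y[:, 0] = rng.normal(0.0, 1.0 / np.sqrt(1 - phi**2), size=n_series)
for t in range(1, n_time):
    y[:, t] = phi * y[:, t - 1] + rng.normal(size=n_series)

# E(y_t) constant (~0) and Var(y_t) constant (~1/(1 - phi^2)) across t:
print(y[:, 5].mean(), y[:, 30].mean())
print(y[:, 5].var(), y[:, 30].var())

# cov(y_t, y_s) depends only on |t-s|: lag 3 at two different start times
cov_a = np.mean(y[:, 5] * y[:, 8])
cov_b = np.mean(y[:, 25] * y[:, 28])
print(cov_a, cov_b)  # both close to phi^3 / (1 - phi^2)
```

Here the averaging is done *across* the 20,000 realizations at a fixed $t$, which is exactly the step I don't see how to perform with a single observed series.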

I came across a youtube video by ritvikmath explaining this concept and it shows this example:

[Screenshot from the video: a plot of the example time series.]

At 3:00 he says that the mean of the plotted series is constant, and I don't completely understand this.

If there were multiple time series, the expected value of each variable $Y_t$ could be estimated as the mean of the observed $y_t$'s across those series. But how is this done when there is only one series? For each variable $Y_t$ we have only one data point, so I don't understand how it is possible to know $E(y_t)$ here. The same goes for the variance.
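To show the contrast I mean, here is a minimal sketch (my own illustration, assuming the series is Gaussian white noise): with many independent realizations I can average across them at a fixed $t$, but a single realization gives me only one number at that time.

```python
import numpy as np

rng = np.random.default_rng(0)

# Many independent realizations of a stationary series
# (Gaussian white noise, so E(y_t) = 0 and Var(y_t) = 1 for all t).
n_series, n_time = 10_000, 50
y = rng.normal(loc=0.0, scale=1.0, size=(n_series, n_time))

# With many realizations, E(y_t) at a fixed t is estimated by
# averaging across realizations (the "ensemble" mean):
t = 10
ensemble_mean = y[:, t].mean()  # close to the true mean 0

# With a single realization, time t gives only one data point,
# so there is nothing to average over at that fixed t:
single_series = y[0]
print(single_series[t])
print(ensemble_mean)
```

This is the gap in my understanding: the ensemble mean above needs many series, and I don't see what replaces it when only `single_series` is observed.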