Maximum likelihood estimate of Gaussian parameters (1st-order Markov Chain)


Let us assume the availability of a time series $x_1, \ldots, x_N$ (where $x_i \in \mathbb{R}$, $0 \leq x_i \leq 1$).

If we assume each variable to be independent of all previous observations except the most recent, we obtain a first-order Markov chain. Under this model, the joint distribution $p(x_1, \ldots, x_N)$ reads:

$$ p(x_1, \ldots, x_N) = p(x_1) \prod_{n=2}^N p(x_n\,|\,x_{n-1}). $$

Moreover, assume that the observations are generated from a Gaussian distribution $\mathcal{N}(\mu, \sigma^2)$.

I would like to find the maximum likelihood estimate of the parameters $\mu$ and $\sigma^2$. To this end, let us compute the log-likelihood function as follows:

$$ \ln p(x_1, \ldots, x_N) = \ln\left\{\mathcal{N}(x_1\,|\, \mu, \sigma^2)\right\} + \sum_{n=2}^N \ln\left\{p(x_n\,|\,x_{n-1})\right\} $$
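To make the decomposition concrete, here is a minimal sketch of this log-likelihood as code: the marginal Gaussian term for $x_1$ plus the sum of conditional terms. The conditional log-density `cond_logpdf` is left as a caller-supplied function, since its form is exactly what is still undetermined at this point.

```python
import math

def gauss_logpdf(x, mu, var):
    # log N(x | mu, var)
    return -0.5 * (math.log(2 * math.pi * var) + (x - mu) ** 2 / var)

def chain_loglik(xs, mu, var, cond_logpdf):
    """ln p(x_1, ..., x_N) = ln N(x_1 | mu, var)
       + sum_{n=2}^N cond_logpdf(x_n, x_{n-1})."""
    ll = gauss_logpdf(xs[0], mu, var)      # marginal term for x_1
    for prev, cur in zip(xs, xs[1:]):
        ll += cond_logpdf(cur, prev)       # conditional terms
    return ll
```

As a sanity check, passing a conditional that ignores $x_{n-1}$ recovers the i.i.d. log-likelihood.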

In order to complete the computation, I need to find the distribution of $(x_n\,|\,x_{n-1})$. Intuitively, I would assume $(x_n\,|\,x_{n-1})$ to be normally distributed with mean $\bar{\mu}$ and variance $\bar{\sigma}^2$, both depending on the value of the observation $x_{n-1}$.

How can I explicitly express the distribution of $(x_n\,|\,x_{n-1})$? Should I choose a joint distribution for $x_n$ and $x_{n-1}$ (i.e., express $x_n$ as a function of $x_{n-1}$)?
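For what it's worth, one hypothetical choice along these lines is the AR(1) parameterization $x_n = a\,x_{n-1} + c + \varepsilon_n$ with $\varepsilon_n \sim \mathcal{N}(0, s^2)$, so that $(x_n\,|\,x_{n-1}) \sim \mathcal{N}(a\,x_{n-1} + c,\; s^2)$. Conditioning on $x_1$, the MLE for $(a, c, s^2)$ then reduces to ordinary least squares of $x_n$ on $x_{n-1}$ (the names `a`, `c`, `s2` below are my own, not from the question):

```python
def fit_ar1(xs):
    """Conditional MLE for x_n | x_{n-1} ~ N(a*x_{n-1} + c, s2),
    treating x_1 as fixed; equivalent to OLS of x_n on x_{n-1}."""
    prev, cur = xs[:-1], xs[1:]
    n = len(cur)
    mp = sum(prev) / n                      # mean of regressors
    mc = sum(cur) / n                       # mean of responses
    cov = sum((p - mp) * (q - mc) for p, q in zip(prev, cur))
    var_p = sum((p - mp) ** 2 for p in prev)
    a = cov / var_p                         # slope estimate
    c = mc - a * mp                         # intercept estimate
    s2 = sum((q - (a * p + c)) ** 2 for p, q in zip(prev, cur)) / n
    return a, c, s2
```

Whether this is the intended parameterization is exactly my question; it is just one way to tie $\bar{\mu}$ to $x_{n-1}$.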

Thanks.