Continuous-time Markov Chain forward/backward equations and MLE


[Problem statement posted as an image: a continuous-time Markov chain on states $\{1, 2, 3\}$ in which every off-diagonal transition rate $q_{ij}$ equals $\lambda$.]

I have two questions:

1) Using Kolmogorov's forward and backward equations, show that $p_{11}(t) + p_{21}(t) + p_{31}(t) = 1$ and $p_{21}(t) = p_{31}(t)$ where $p_{ij}(t) = P(X(t) = j | X(0) = i)$.

2) Write the likelihood function and obtain the maximum likelihood estimator (MLE) of $\lambda$ in terms of the total number of transitions $n$ over a given interval of time $t$.

My attempt:

For 1), I can show the first part but not the second, that is, for all $t \ge 0$, the forward equations for $p_{11}$, $p_{21}$, and $p_{31}$ are as follows: \begin{gather} p_{11}'(t) = \lambda p_{12}(t) + \lambda p_{13}(t) - 2\lambda p_{11}(t) \cdots (1)\\ p_{21}'(t) = \lambda p_{22}(t) + \lambda p_{23}(t) - 2\lambda p_{21}(t) \\ p_{31}'(t) = \lambda p_{32}(t) + \lambda p_{33}(t) - 2\lambda p_{31}(t) \end{gather}

The backward equations for $p_{11}$, $p_{21}$, and $p_{31}$ are as follows: \begin{gather} p_{11}'(t) = \lambda p_{21}(t) + \lambda p_{31}(t) - 2\lambda p_{11}(t) \cdots (2) \\ p_{21}'(t) = \lambda p_{11}(t) + \lambda p_{31}(t) - 2\lambda p_{21}(t) \\ p_{31}'(t) = \lambda p_{11}(t) + \lambda p_{21}(t) - 2\lambda p_{31}(t) \end{gather}

Subtracting $(1)$ from $(2)$ and dividing through by $\lambda$ yields \begin{gather} 0 = p_{21}(t) - p_{12}(t) + p_{31}(t) - p_{13}(t) \end{gather}

Noting that $1-p_{11}(t) = p_{12}(t) + p_{13}(t)$, we have: \begin{gather*} 0 = p_{21}(t) + p_{31}(t) - (1-p_{11}(t)) \\ \implies p_{11}(t) + p_{21}(t) + p_{31}(t) = 1 \end{gather*}

However, I'm not sure how to show $p_{21}(t) = p_{31}(t)$ by algebraic manipulation. (I can show it by solving the differential equations explicitly, but the question asks for a derivation through manipulation.)
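One manipulation that avoids solving the full $3 \times 3$ system (a sketch, not necessarily the intended route): set $f(t) = p_{21}(t) - p_{31}(t)$ and subtract the backward equation for $p_{31}$ from the one for $p_{21}$:

```latex
\begin{align*}
f'(t) &= p_{21}'(t) - p_{31}'(t) \\
      &= \bigl(\lambda p_{11}(t) + \lambda p_{31}(t) - 2\lambda p_{21}(t)\bigr)
       - \bigl(\lambda p_{11}(t) + \lambda p_{21}(t) - 2\lambda p_{31}(t)\bigr) \\
      &= -3\lambda\, f(t).
\end{align*}
```

Since $f(0) = p_{21}(0) - p_{31}(0) = 0 - 0 = 0$, the scalar equation $f' = -3\lambda f$ forces $f(t) = f(0)\,e^{-3\lambda t} \equiv 0$, i.e. $p_{21}(t) = p_{31}(t)$ for all $t \ge 0$. This still solves an ODE, but only a one-dimensional one, not the full system.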

For 2), I just want to confirm whether my working is correct.

For $i, j \in \{1, 2, 3\}$ with $i \neq j$, let $n_{ij}$ denote the number of transitions from $i$ to $j$ and $\tau_i$ the total time spent in state $i$. Writing $\nu_i = \sum_{j \neq i} q_{ij}$ for the rate of leaving state $i$ (here $\nu_i = 2\lambda$ for every $i$), the likelihood function is: \begin{align*} L(\lambda) & = \prod_{i \neq j} q_{ij}^{n_{ij}} \exp\left(-\sum_{i=1}^3 \tau_i \nu_i \right) \\ & = \lambda^{n_{12} + n_{13} + n_{21} + n_{23} + n_{31} + n_{32}} \exp\left(-2\lambda \left(\tau_1 + \tau_2 + \tau_3 \right) \right) \end{align*} Denote by $n$ the total number of transitions over a given interval of time $t$; then clearly $\sum_{i \neq j} n_{ij} = n$ and $\sum_{i=1}^3 \tau_i = t$, hence the likelihood is: \begin{align*} L(\lambda) & = \lambda^n \exp\left(-2\lambda t\right) \end{align*} The log-likelihood is given by, \begin{align*} l(\lambda) & = n\ln(\lambda) - 2\lambda t \end{align*} To find the MLE of $\lambda$, note that $\frac{dl}{d\lambda} = \frac{n}{\lambda} - 2t$, hence $\frac{dl}{d\lambda} = 0 \implies \hat{\lambda}_{MLE} = \frac{n}{2t}$
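As a quick check of $\hat{\lambda} = \frac{n}{2t}$, here is a minimal simulation sketch (names are illustrative). It relies only on the fact that every state is left at total rate $2\lambda$, so the holding times are $\mathrm{Exp}(2\lambda)$ regardless of the current state and the state itself need not be tracked:

```python
import random

def simulate_transition_count(lam: float, t: float, rng: random.Random) -> int:
    """Count the transitions of the chain in [0, t]: accumulate
    Exp(2*lam) holding times until the clock passes t."""
    n, clock = 0, 0.0
    while True:
        clock += rng.expovariate(2 * lam)  # expovariate takes the rate
        if clock > t:
            return n
        n += 1

rng = random.Random(0)
lam, t, reps = 1.5, 10.0, 2000
estimates = [simulate_transition_count(lam, t, rng) / (2 * t) for _ in range(reps)]
mean_est = sum(estimates) / reps
print(mean_est)  # should be close to the true lam = 1.5
```

Averaged over many replications, the estimator concentrates near the true $\lambda$, consistent with $E[n] = 2\lambda t$.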


Accepted answer:

The total number of transitions over a given interval of time of length $t$ is Poisson with parameter $2\lambda t$, hence $$ L_n(\lambda)=\mathrm e^{-2\lambda t}\frac{(2\lambda t)^n}{n!}. $$ Maximizing this over $\lambda$ yields the value you indicate, that is, $$ \widehat\lambda(n,t)=\frac{n}{2t}. $$ Note: a more common situation is when the total number of transitions over a given interval is unknown, but one observes independent copies of the Markov chain at times $0$ and $t$ and counts the number of copies whose values at times $0$ and $t$ differ.
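The Poisson claim itself can be checked by simulation: since every state is left at total rate $2\lambda$, the jump times form a Poisson process of rate $2\lambda$, and the empirical distribution of the count over $[0, t]$ should match the Poisson$(2\lambda t)$ pmf. A minimal sketch (names are illustrative):

```python
import math
import random

def count_transitions(rate_out: float, t: float, rng: random.Random) -> int:
    """Number of jumps in [0, t] when every holding time is Exp(rate_out)."""
    n, clock = 0, 0.0
    while True:
        clock += rng.expovariate(rate_out)
        if clock > t:
            return n
        n += 1

def poisson_pmf(k: int, mu: float) -> float:
    return math.exp(-mu) * mu**k / math.factorial(k)

lam, t, reps = 1.0, 1.0, 20000
rng = random.Random(42)
counts = [count_transitions(2 * lam, t, rng) for _ in range(reps)]

# Compare the empirical frequencies with the Poisson(2*lam*t) pmf.
for k in range(5):
    emp = counts.count(k) / reps
    print(k, round(emp, 3), round(poisson_pmf(k, 2 * lam * t), 3))
```

The empirical frequencies and the theoretical pmf agree to within Monte Carlo error, supporting $n \sim \mathrm{Poisson}(2\lambda t)$.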