A Markov model is used to describe the stochastic gating of particles through channels. A channel is permeable exactly when all four of its gates are open, which can be represented by the following Markov diagram:
$$n_0 \overset{4 \alpha}{\underset{\beta}{\rightleftharpoons}} n_1 \overset{3 \alpha}{\underset{2 \beta}{\rightleftharpoons}} n_2 \overset{2 \alpha}{\underset{3 \beta}{\rightleftharpoons}} n_3 \overset{ \alpha}{\underset{4 \beta}{\rightleftharpoons}} n_4 $$
where $\alpha, \beta$ are transition rates.
I then read the following: "Exact or Markov methods model channel noise as continuous time Markov processes to iterate through the transition-probability matrix of state change to infer the exact number of ion channels opened during each time step."

Suppose we are at time $t$: how can I calculate the population of open channels at $t + \delta t$? I do not really understand the description above.
Let $X(t)$ be the number of open gates at time $t$, i.e. $X(t)=i$ when the process is in state $n_i$ at time $t$. Then $\{X(t):t\geqslant 0\}$ is a continuous-time Markov chain with infinitesimal generator $$ Q = \left( \begin{array}{ccccc} -4 \alpha & 4 \alpha & 0 & 0 & 0 \\ \beta & -3 \alpha -\beta & 3 \alpha & 0 & 0 \\ 0 & 2 \beta & -2 (\alpha +\beta ) & 2 \alpha & 0 \\ 0 & 0 & 3 \beta & -\alpha -3 \beta & \alpha \\ 0 & 0 & 0 & 4 \beta & -4 \beta \\ \end{array} \right). $$ The off-diagonal entries of $Q$ give the rates at which the process moves from its current state to each other state; more precisely, conditioned on $\{X(t)=i\}$, the time to the next transition $$ \inf\{s>t:X(s)\ne X(t)\}-t $$ is exponentially distributed with intensity $$ \sum_{j=0}^4 Q_{ij}\cdot\mathsf 1_{\{Q_{ij}>0\}} = -Q_{ii}. $$ (This follows from the fact that the minimum of finitely many independent exponential random variables is itself exponential, with intensity the sum of their intensities.) Given some initial distribution $\gamma$ of $X(0)$, let $\{T_n: n=0,1,2,\ldots\}$ be the successive jump times of the process; the sequence of states visited, $Y_n = X(T_n)$, is an embedded discrete-time Markov chain with transition probability matrix $$ R = \left( \begin{array}{ccccc} 0 & 1 & 0 & 0 & 0 \\ \frac{\beta }{3 \alpha +\beta } & 0 & \frac{3 \alpha }{3 \alpha +\beta } & 0 & 0 \\ 0 & \frac{\beta }{\alpha +\beta } & 0 & \frac{\alpha }{\alpha +\beta } & 0 \\ 0 & 0 & \frac{3 \beta }{\alpha +3 \beta } & 0 & \frac{\alpha }{\alpha +3 \beta } \\ 0 & 0 & 0 & 1 & 0 \\ \end{array} \right). $$ (Recall that if $\{Z_i : i \in\{1,\ldots,n\}\}$ is a finite collection of independent exponentially distributed random variables with respective intensities $\lambda_i$, then for each $i$ the probability of $\left\{\bigwedge_{j=1}^n Z_j=Z_i\right\}$ is $\frac{\lambda_i}{\sum_{j=1}^n\lambda_j}$.)
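The jump-chain description above can be simulated directly: from state $i$, draw an exponential holding time with intensity $-Q_{ii}$, then pick the next state according to row $i$ of $R$. A minimal sketch in Python (the numeric values of $\alpha$ and $\beta$ here are illustrative assumptions):

```python
import numpy as np

# Illustrative rates -- the values of alpha and beta are assumptions
alpha, beta = 1.0, 0.5

# Infinitesimal generator Q of the 5-state gating chain
Q = np.array([
    [-4*alpha,         4*alpha,               0,               0,       0],
    [    beta, -3*alpha - beta,         3*alpha,               0,       0],
    [       0,          2*beta, -2*(alpha+beta),         2*alpha,       0],
    [       0,               0,          3*beta, -alpha - 3*beta,   alpha],
    [       0,               0,               0,          4*beta, -4*beta],
])

def simulate_ctmc(Q, x0, t_end, rng):
    """Exact (Gillespie-style) simulation: exponential holding times,
    then a jump drawn from the corresponding row of the embedded chain R."""
    t, x = 0.0, x0
    times, states = [0.0], [x0]
    while True:
        rate = -Q[x, x]                  # total exit rate from state x
        if rate == 0:
            break                        # absorbing state (none here)
        t += rng.exponential(1.0 / rate) # holding time ~ Exp(-Q_ii)
        if t > t_end:
            break
        probs = Q[x].clip(min=0) / rate  # row x of the embedded chain R
        x = rng.choice(len(Q), p=probs)
        times.append(t)
        states.append(x)
    return np.array(times), np.array(states)

rng = np.random.default_rng(0)
times, states = simulate_ctmc(Q, x0=0, t_end=10.0, rng=rng)
```

Averaging many such trajectories (or many independent channels) recovers the population-level statistics the quoted passage refers to.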
For each $t>0$, there is similarly a transition matrix $P(t)$ with $P(t)_{ij} = \mathbb P(X(t)=j\mid X(0)=i)$. These matrices satisfy the backward Kolmogorov differential equation $P'(t) = QP(t)$ with initial condition $P(0)=I$ (the identity matrix), which has the unique solution $P(t) = e^{Qt}$, where $$ e^{Qt} := \sum_{n=0}^\infty \frac{(Qt)^n}{n!}. $$ This series converges absolutely for every $t$, since $\left\|\frac{(Qt)^n}{n!}\right\|\leqslant\frac{(\|Q\|\,t)^n}{n!}$ and $\sum_{n=0}^\infty \frac{(\|Q\|\,t)^n}{n!} = e^{\|Q\|\,t}<\infty$. (Moreover, because the rows of $Q$ sum to zero, the rows of $P(t)=e^{Qt}$ sum to one, so each row of $P(t)$ is a probability distribution.) Since the convergence is uniform on compact intervals, we may differentiate term by term: $$ P'(t) = \frac{\mathsf d}{\mathsf dt} \left[\sum_{n=0}^\infty \frac{(Qt)^n}{n!}\right] =\sum_{n=0}^\infty \frac{\mathsf d}{\mathsf dt} \left[\frac{(Qt)^n}{n!}\right] = Qe^{Qt}, $$ verifying that $P'(t)=QP(t)$. In particular, if $p(t)$ denotes the row vector of state probabilities at time $t$, then the distribution at $t+\delta t$ is $$ p(t+\delta t) = p(t)\,e^{Q\,\delta t}, $$ which is exactly the per-time-step update the quoted passage describes.
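In practice the update above amounts to computing $P(\delta t)=e^{Q\,\delta t}$ once and multiplying the current state distribution by it at each step. A minimal sketch in Python, using `scipy.linalg.expm` for the matrix exponential (the rate values are illustrative assumptions):

```python
import numpy as np
from scipy.linalg import expm  # matrix exponential e^{Q dt}

# Illustrative rates -- the values of alpha and beta are assumptions
alpha, beta = 1.0, 0.5

# Infinitesimal generator Q from the answer above
Q = np.array([
    [-4*alpha,         4*alpha,               0,               0,       0],
    [    beta, -3*alpha - beta,         3*alpha,               0,       0],
    [       0,          2*beta, -2*(alpha+beta),         2*alpha,       0],
    [       0,               0,          3*beta, -alpha - 3*beta,   alpha],
    [       0,               0,               0,          4*beta, -4*beta],
])

dt = 0.1
P = expm(Q * dt)  # transition matrix P(dt) = e^{Q dt}

# Propagate a distribution: start with all four gates closed (state n_0)
p0 = np.array([1.0, 0.0, 0.0, 0.0, 0.0])
p_dt = p0 @ P     # distribution of X(dt); entry j is P(X(dt) = n_j)
```

Each row of `P` sums to one, and `expm(Q*(s+t))` equals `expm(Q*s) @ expm(Q*t)`, reflecting the semigroup property $P(s+t)=P(s)P(t)$ of the transition matrices.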