Consider a hidden Markov model $\lambda=\{A,B,q\}$, where $q$ is the initial state distribution, $A$ the transition probability matrix, and $B$ the output probability distributions.
Assume that we have $N$ observations $\{x_n\}$ collected in a sequence $\underline x=(x_1,...,x_n,...,x_N)$ and we have calculated all necessary variables using the forward and backward algorithm. For example, we have the probability of being in state $i$ at instance $n$ given all the observations, $\gamma_{i,n}=P(S_n=i|\underline x,\lambda)$. We also have the forward, backward, and scale variables, $\alpha_{i,n}$, $\beta_{i,n}$, and $c_n$, etc.
Ordinarily, we can use the Viterbi algorithm to calculate the most probable state sequence $\underline i = (i_1,...,i_n,...,i_N)$ given our observations $\underline x$ and HMM $\lambda$, i.e. $\text{argmax}_{\underline i} P(S_1=i_1,...,S_N=i_N|x_1,...,x_N)$.
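For concreteness, the Viterbi recursion mentioned above can be sketched as follows. This is a toy numpy implementation under the assumptions of discrete observations and unscaled probabilities (fine for short sequences); the names `q`, `A`, `B` follow the notation above, everything else is illustrative:

```python
import numpy as np

def viterbi(q, A, B, x):
    # delta[t, i] = max over state paths ending in state i at time t
    #               of the joint probability of path and observations x[:t+1]
    N, K = len(x), len(q)
    delta = np.zeros((N, K))
    back = np.zeros((N, K), dtype=int)  # backpointers
    delta[0] = q * B[:, x[0]]
    for t in range(1, N):
        cand = delta[t - 1][:, None] * A      # cand[i, j]: best path into i, then i -> j
        back[t] = cand.argmax(axis=0)
        delta[t] = cand.max(axis=0) * B[:, x[t]]
    # backtrack from the best final state
    path = [int(delta[-1].argmax())]
    for t in range(N - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]
```

For example, with a "sticky" two-state model where state 0 favors symbol 0 and state 1 favors symbol 1, the observation sequence `[0, 0, 1, 1]` decodes to the path `[0, 0, 1, 1]`.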
However, how can we calculate the probability of an arbitrary state sequence $\underline S^*=(i_n,...,i_{n+M})$ spanning instances $n$ through $n+M$, given all our observations $\underline x=(x_1,...,x_N)$?
That is, how can we obtain the following probability? $$P(\underline S^*=(i_n,...,i_{n+M})|\underline x=(x_1,...,x_n,...,x_N))$$
Is this possible?
Yes, this is possible. Since you say you have all the initial probabilities $\pi$, transition probabilities $\psi$, and emission probabilities $\phi$, the joint probability of an arbitrary state sequence in the middle of the chain, together with the observations, is given by the expression below. We marginalize out every state except the ones we are interested in: each summation runs over all possible values of that state. The bracketed product in the middle is not summed over; these are the states whose probability we want. Dividing the result by $P(\underline x)$ (the same expression with the bracketed states summed over as well) then yields the conditional probability $P(\underline S^*\mid\underline x)$ you asked for.
$$\sum_{i_1}\sum_{i_2}\dots\sum_{i_{n-1}}\sum_{i_{n+M+1}}\dots\sum_{i_N}\pi_1(i_1)\phi_1(i_1)\prod_{u={2}}^{n-1}\psi(i_{u-1}, i_u)\phi_u(i_u)\left[\prod_{t=n}^{n+M}\psi(i_{t-1}, i_t)\phi_t(i_t)\right]\prod_{v={n+M+1}}^N\psi(i_{v-1}, i_v)\phi_v(i_v)$$
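Note that you do not need to evaluate these nested sums explicitly: the prefix sums collapse into the forward variable $\alpha_{i_n,n}$ and the suffix sums into the backward variable $\beta_{i_{n+M},n+M}$, which you said you have already computed. A minimal numpy sketch, assuming discrete observations and unscaled $\alpha,\beta$ (fine for short sequences; the names `q`, `A`, `B` follow your notation, the function names are mine):

```python
import numpy as np

def forward(q, A, B, x):
    # alpha[t, i] = P(x_1..x_{t+1}, S_{t+1} = i)  (0-based t)
    N, K = len(x), len(q)
    alpha = np.zeros((N, K))
    alpha[0] = q * B[:, x[0]]
    for t in range(1, N):
        alpha[t] = (alpha[t - 1] @ A) * B[:, x[t]]
    return alpha

def backward(A, B, x):
    # beta[t, i] = P(x_{t+2}..x_N | S_{t+1} = i)  (0-based t)
    N, K = len(x), A.shape[0]
    beta = np.ones((N, K))
    for t in range(N - 2, -1, -1):
        beta[t] = A @ (B[:, x[t + 1]] * beta[t + 1])
    return beta

def segment_posterior(q, A, B, x, n, states):
    # P(S_n = states[0], ..., S_{n+M} = states[M] | x), n 0-based:
    # alpha absorbs the prefix sums, beta the suffix sums, and we
    # normalize the joint by P(x) to get the conditional.
    alpha, beta = forward(q, A, B, x), backward(A, B, x)
    px = alpha[-1].sum()                      # P(x)
    p = alpha[n, states[0]]
    for k in range(1, len(states)):           # bracketed product
        p *= A[states[k - 1], states[k]] * B[states[k], x[n + k]]
    return p * beta[n + len(states) - 1, states[-1]] / px
```

As a sanity check, summing `segment_posterior` over all assignments of the bracketed states returns 1, and a segment of length one reduces to the smoothed posterior $\gamma_{i,n}$ from the question.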