Suppose we have two possible states of some entity, $A$ and $B$. Say the probability that the initial state is $A$ is $p_i$, and the probability that the initial state is $B$ is $1-p_i$.
After the system has been initialized, it behaves as a Markov chain with: $$P(A \to A) = p_{11}$$ $$P(A \to B) = 1- p_{11}$$ $$P(B \to B) = p_{22}$$ $$P( B \to A) = 1-p_{22}.$$
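These one-step probabilities can be packed into a transition matrix, and two applications of the matrix give the two-step probabilities such as $P(X_3 = B \mid X_1 = A)$. A minimal sketch, using hypothetical numeric values for the symbolic $p_{11}$ and $p_{22}$:

```python
import numpy as np

# Hypothetical values for the symbolic probabilities in the post.
p11, p22 = 0.9, 0.8

# Rows index the current state (A, B); columns the next state (A, B).
P = np.array([[p11, 1 - p11],
              [1 - p22, p22]])

# Squaring the matrix gives the two-step transition probabilities.
P2 = P @ P
print(P2[0, 1])   # P(X3 = B | X1 = A), summed over the hidden middle state
```

Entry $[0, 1]$ of `P2` is exactly $p_{11}(1-p_{11}) + (1-p_{11})p_{22}$, the sum over the two paths $A \to A \to B$ and $A \to B \to B$.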
Now suppose your system is initialized, and you are not able to observe the initial state or the second state. If you observe the third state to be $B$, what is the probability that the initial state was $A$?
Correct me if I'm wrong, but is it not true that the observation of the third state has no effect on the initialization probabilities? And thus the answer is simply $p_i$?
No, the intuition is not correct in general. Write $X_n \in \{A,B\}$ for the state at time $n$. Bayes gives \begin{align*} P(X_1 = A \mid X_3 = B) &= \frac{P(X_3 = B \mid X_1 = A)P(X_1 = A)}{P(X_3 = B)}, \end{align*} but the Markov property does not make $X_3$ independent of $X_1$, so $P(X_3 = B \mid X_1 = A) \ne P(X_3 = B)$ in general. Summing over the hidden middle state, $$P(X_3 = B \mid X_1 = A) = p_{11}(1-p_{11}) + (1-p_{11})p_{22},$$ $$P(X_3 = B \mid X_1 = B) = (1-p_{22})(1-p_{11}) + p_{22}^2,$$ and therefore $$P(X_1 = A \mid X_3 = B) = \frac{p_i\left[p_{11}(1-p_{11}) + (1-p_{11})p_{22}\right]}{p_i\left[p_{11}(1-p_{11}) + (1-p_{11})p_{22}\right] + (1-p_i)\left[(1-p_{22})(1-p_{11}) + p_{22}^2\right]},$$ which equals $p_i$ only when the two conditional likelihoods happen to coincide. For a stark counterexample, take $p_{11} = p_{22} = 1$: the chain never leaves its initial state, so observing $X_3 = B$ forces $X_1 = B$ and the posterior is $0$, not $p_i$.
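A quick numeric sketch, with hypothetical values for the symbolic $p_i$, $p_{11}$, $p_{22}$: compute the posterior $P(X_1 = A \mid X_3 = B)$ exactly via Bayes, then check it by simulating the chain.

```python
import random

# Hypothetical parameter values -- the original post leaves them symbolic.
p_i, p11, p22 = 0.5, 0.9, 0.8

# Two-step likelihoods P(X3 = B | X1 = x), summing over the hidden X2.
like_A = p11 * (1 - p11) + (1 - p11) * p22          # A -> A -> B  or  A -> B -> B
like_B = (1 - p22) * (1 - p11) + p22 * p22          # B -> A -> B  or  B -> B -> B

# Bayes: P(X1 = A | X3 = B)
posterior = p_i * like_A / (p_i * like_A + (1 - p_i) * like_B)

def step(state):
    """Advance the chain one step according to p11 / p22."""
    if state == "A":
        return "A" if random.random() < p11 else "B"
    return "B" if random.random() < p22 else "A"

# Monte Carlo check: condition on the event X3 = B.
random.seed(0)
hits = total = 0
for _ in range(200_000):
    x1 = "A" if random.random() < p_i else "B"
    x3 = step(step(x1))
    if x3 == "B":
        total += 1
        hits += (x1 == "A")

print(posterior)      # exact posterior
print(hits / total)   # simulated estimate, close to the exact posterior
```

With these values the posterior is well below the prior $p_i = 0.5$, since a chain started in $B$ is much more likely to still be in $B$ two steps later.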