Let $(X_t)_{t\in \mathbb{N}}$ be a Markov chain with transition matrix $P$, initial distribution $\mu$ and state space $\chi$.
I know that for $k \in \chi$, $\mathbb{P}(X_{t+1} = k \mid X_t =k) = P(k,k)$. I want to know how to calculate $\mathbb{P}(X_{t+1} = k \mid X_t \neq k)$.
My initial idea was: $$\mathbb{P}(X_{t+1} = k \mid X_t \neq k) = \sum_{x \in \chi,\, x \neq k}\mathbb{P}(X_{t+1} = k \mid X_t = x)\mathbb{P}(X_t = x) =\sum_{x \in \chi,\, x \neq k} P(x,k)\mu_t(x) $$ where $\mu_t$ denotes the distribution of $X_t$, i.e. $\mu_t = \mu P^t$.
Is this correct?
By mistake I ended up calculating $\mathbb{P}(X_{t+1} = k, X_t \neq k)$ instead of $\mathbb{P}(X_{t+1} = k \mid X_t \neq k)$, as geetha290krm noticed. The sum above is the joint probability, so it still needs to be divided by $\mathbb{P}(X_t \neq k)$. The correct answer is: \begin{align} \mathbb{P}(X_{t+1} = k \mid X_t \neq k) &= \dfrac{\mathbb{P}(X_{t+1} =k, X_t \neq k)}{\mathbb{P}(X_t \neq k)}\\ &= \dfrac{\sum_{x \in \chi,\, x \neq k} P(x,k)\mu_t(x)}{1- \mu_t(k)} \end{align}
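As a sanity check, the formula can be verified numerically on a small chain. The matrix `P`, the initial distribution `mu`, and the choices of `t` and `k` below are arbitrary assumptions, not from the question; the point is only that the conditional formula agrees with a direct computation from the joint distribution of $(X_t, X_{t+1})$.

```python
import numpy as np

# Hypothetical 3-state chain; P and mu are made-up examples.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])
mu = np.array([0.2, 0.5, 0.3])  # initial distribution

t, k = 4, 1

# Distribution of X_t: mu_t = mu P^t
mu_t = mu @ np.linalg.matrix_power(P, t)

# Formula from the answer:
others = [x for x in range(3) if x != k]
p_formula = sum(P[x, k] * mu_t[x] for x in others) / (1 - mu_t[k])

# Direct computation: joint[x, y] = P(X_t = x, X_{t+1} = y)
joint = mu_t[:, None] * P
p_direct = joint[others, k].sum() / joint[others, :].sum()

print(p_formula, p_direct)  # the two values coincide
```

Here `joint[others, :].sum()` equals $1 - \mu_t(k)$, so the agreement of the two numbers is exactly the identity derived above.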