Is there any way to find a probability of a Markov chain when the two times are the same?


I am just wondering: suppose I am given $P(X_n = 1 \mid X_n = 0)$, where both states are at the same time index $n$ (for an ordinary one-step Markov chain).

Can this probability be found using the transition matrix?
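For context, here is a minimal sketch (with a hypothetical two-state transition matrix `P`) of how I would read off a one-step probability like $P(X_{n+1} = 1 \mid X_n = 0)$ from the matrix; what I am unsure about is whether a same-time probability can be obtained in a similar way:

```python
import numpy as np

# Hypothetical 2-state transition matrix:
# P[i, j] = P(X_{n+1} = j | X_n = i)
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# The one-step probability P(X_{n+1} = 1 | X_n = 0)
# is just a single entry of the matrix:
p_01 = P[0, 1]
print(p_01)  # 0.3
```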