I'm struggling with a Markov chain, and I hope you can tell me if I'm on the right path.
I want to go bowling with my colleagues, and beforehand one of them tells me:
- Whenever he hits a strike, he’ll hit another strike in the next round with probability $0.8$.
- If he did not hit a strike, then in the next round he will hit a strike with probability $0.6$.
If this is true and his first attempt is a strike, what is the probability that his fourth attempt is also a strike? And what is his long-run percentage of strikes over the entire (very long) evening?
I can see that this involves a Markov chain and that I need to construct a transition probability matrix.
Let's say $i$ can take the values $1$ = hits a strike and $0$ = doesn't hit a strike. Then my matrix would look like
$$P = \begin{pmatrix} 0.4 & 0.6 \\ 0.2 & 0.8 \end{pmatrix},$$
with rows and columns ordered $(0, 1)$.
For the first part, I am asking for $P_{11}^{(3)}$, if I get it right (three transitions take attempt $1$ to attempt $4$). How do I find it, knowing $P$ already?
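As a sanity check, here is a small NumPy sketch of this step; the matrix entries come from the probabilities stated above, and the state ordering $(0, 1)$ is my own convention:

```python
import numpy as np

# One-step transition matrix; rows/columns ordered (0 = no strike, 1 = strike).
P = np.array([[0.4, 0.6],
              [0.2, 0.8]])

# Three transitions take attempt 1 to attempt 4.
P3 = np.linalg.matrix_power(P, 3)

# Probability of a strike on attempt 4, given a strike on attempt 1.
print(P3[1, 1])
```

So the $n$-step probabilities are just entries of the $n$-th matrix power of $P$.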
For the second part, I should find the two unknowns $\pi_0, \pi_1$ using
$$\pi_1 = 0.6\,\pi_0 + 0.8\,\pi_1$$
and
$$\pi_0 + \pi_1 = 1,$$
and then the value we are looking for is namely $\pi_1$?
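A quick numerical check of this second step could solve $\pi P = \pi$ together with the normalization $\pi_0 + \pi_1 = 1$ as an (overdetermined) linear system, using the same assumed state ordering as above:

```python
import numpy as np

P = np.array([[0.4, 0.6],
              [0.2, 0.8]])

# Stationary equations: pi (P - I) = 0, stacked with the row sum(pi) = 1.
A = np.vstack([P.T - np.eye(2), np.ones(2)])
b = np.array([0.0, 0.0, 1.0])

# Least squares handles the extra (redundant) equation.
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi)  # long-run fractions of (no strike, strike)
```

The second component of `pi` would then be the long-run fraction of strikes.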