Suppose a fair die is thrown repeatedly. Let $\{X_{n}\}$ be the process where $X_{n}$ is the time that remains, at instant $n$, until the next 1 comes up.
I am unsure whether this process is a homogeneous Markov chain. My partner says no (that it is not even a Markov chain), because the state $X_{n+1}$ does not depend on $X_{n}$.
But I think it is, because the variables are independent, so:
$P(X_{n+1}|X_{n},X_{n-1},\ldots,X_{0})=$ (by independence) $= P(X_{n+1})= P(X_{n+1}|X_{n})$
So I think each row $i$ of the transition matrix (where rows denote the instant $n$ and columns denote the number of tosses until you get the first 1) is:
$(\frac{5^0}{6^1},\frac{5^1}{6^2},\frac{5^2}{6^3},\frac{5^3}{6^4},\ldots)$ The first entry is $1/6$, the probability of obtaining the first 1 immediately. For the second entry, no 1 appears at time 1 (probability $\frac{5}{6}$) and a 1 appears on the second toss (probability $\frac{1}{6}$), so the entry is $\frac{5}{6} \cdot \frac{1}{6}$, and so on. Every row is the same.
I still have to prove that the entries sum to 1, but I want to know whether I am modeling the problem correctly, or whether the process is not even Markov...
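(For the sum, the entries form a geometric series, so it can be checked directly:

$$\sum_{k=0}^{\infty}\frac{5^{k}}{6^{k+1}}=\frac{1}{6}\sum_{k=0}^{\infty}\left(\frac{5}{6}\right)^{k}=\frac{1}{6}\cdot\frac{1}{1-\frac{5}{6}}=1.)$$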
Thanks
A series of independent random variables is a (trivial) special case of a Markov chain. Remember that the only requirement of a Markov chain is that for all $n \in \mathbb{N}$ you must have:
$$p(X_{n+1} | X_{1}, ..., X_{n}) = p(X_{n+1} | X_{n}).$$
For a series of independent random variables you have:
$$p(X_{n+1} | X_{1}, ..., X_{n}) = p(X_{n+1}) = p(X_{n+1} | X_{n}).$$
Although this is technically a Markov chain, a sequence of independent random variables is far simpler than a general Markov chain whose values are not independent. The results for the former are far simpler than those for the latter, so we do not usually bother to carry over results from the theory of Markov chains. (For every such result there is a much simpler version for sequences of independent random variables.) That is why it is unusual to hear a sequence of independent random variables referred to as a Markov chain, though it is trivially true.
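To see concretely how trivial the chain is, here is a small sketch (illustrative only, using the i.i.d. fair die rolls themselves as the chain): when the variables are independent, every row of the transition matrix is the same marginal distribution, and the matrix is idempotent, i.e. the $n$-step transition probabilities equal the one-step ones.

```python
from fractions import Fraction

# Transition matrix for i.i.d. fair die rolls viewed as a Markov chain:
# by independence, every row is the marginal distribution (1/6, ..., 1/6).
n = 6
row = [Fraction(1, 6)] * n
P = [row[:] for _ in range(n)]

def matmul(A, B):
    """Exact matrix product using Fractions."""
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Each row sums to 1, so P is a valid stochastic matrix.
assert all(sum(r) == 1 for r in P)

# With identical rows, P @ P == P: conditioning on the current state
# gives no information about the next one.
P2 = matmul(P, P)
print(P2 == P)  # True
```

The same identical-rows structure holds for the geometric rows in the question, which is exactly why the Markov property is satisfied vacuously.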