I've just started studying Markov chains this year using my brother's old notes from college and I've been finding them quite useful so far. However, there is one thing that I'm not that sure of.
If I'm given an intensity matrix/transition rate matrix and I want to find the corresponding transition probability matrix, I understand that the probability of jumping from state $x$ to state $y$ is $\frac{q_{xy}}{q_x}$, where $q_{xy}$ is the transition intensity from state $x$ to state $y$ and $q_x = \sum_{y \neq x} q_{xy}$ is the total rate of leaving state $x$. But how do I find the probability of remaining in state $x$?
Since each row of the transition rate matrix sums to 0, the probabilities of jumping from $x$ to all other states in the state space (i.e. not remaining in state $x$) sum to 1 under the $\frac{q_{xy}}{q_x}$ formula, which implies that the probability of remaining in state $x$ is 0. This is something else that I don't understand. Is there another formula that I should be using?
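To make my question concrete, here is a minimal NumPy sketch with a made-up 3-state rate matrix $Q$ (the matrix and state count are just for illustration). It computes the jump probabilities $q_{xy}/q_x$ row by row, and as I described, the diagonal entries come out as 0 and each row still sums to 1:

```python
import numpy as np

# Hypothetical 3-state transition rate matrix Q (each row sums to 0).
Q = np.array([
    [-3.0,  2.0,  1.0],
    [ 1.0, -4.0,  3.0],
    [ 2.0,  2.0, -4.0],
])

# Exit rates q_x = -Q[x, x] = sum of off-diagonal entries in row x.
q = -np.diag(Q)

# Jump probabilities P[x, y] = q_xy / q_x for y != x.
P = Q / q[:, None]
np.fill_diagonal(P, 0.0)  # the x -> x entry is forced to 0 by this construction

print(P)
print(P.sum(axis=1))  # each row sums to 1, leaving no mass for staying in x
```

So every bit of probability is assigned to leaving state $x$, which is exactly what puzzles me.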
Thanks for reading!