I am new to probability calculations. I am using a Markov chain to estimate the probability of failure of a component. However, I am unsure how to calculate the probability of failure given that, at some time in the future, the component is known to have actually failed. Also, how do I calculate the probability of subsequent failures after that point?
Edit: A sample Markov process is shown in the figure below:
Essentially, having defined the Markov states, the associated failure and repair rates, and the initial probability of each state, the Markov process gives the probability that the system is in state $i$ at a given time $t$. In real time, however, the system can only be in one of the four states shown in the figure. Now suppose there is a cost associated with the system being in any of the failed states ($2, 3, 4$), and that this cost is only incurred when the system makes a transition from one state to another. Then, given a finite time for which the system operates, I wish to calculate the expected cost of running the system.
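To make the question concrete, here is a minimal sketch of the kind of calculation I have in mind. The transition probabilities and costs are made-up placeholders, and I use a discrete-time chain for simplicity (my actual model uses failure/repair rates):

```python
import numpy as np

# Hypothetical 4-state discrete-time Markov chain:
# state 1 = working, states 2-4 = failed (0-based indices 0-3).
# All numbers below are placeholders, not my real rates.
P = np.array([
    [0.90, 0.04, 0.03, 0.03],  # from working state 1
    [0.60, 0.40, 0.00, 0.00],  # repair from failed state 2
    [0.50, 0.00, 0.50, 0.00],  # repair from failed state 3
    [0.40, 0.00, 0.00, 0.60],  # repair from failed state 4
])

# Cost paid when the system transitions *into* each state
# (zero for the working state, positive for the failed states)
cost_into = np.array([0.0, 100.0, 150.0, 200.0])

p = np.array([1.0, 0.0, 0.0, 0.0])  # start in the working state
horizon = 50                        # finite operating time (steps)
expected_cost = 0.0

for _ in range(horizon):
    # Probability of transition i -> j in this step is p[i] * P[i, j]
    flow = p[:, None] * P
    # Only actual state changes (off-diagonal transitions) incur a cost
    off_diag = flow - np.diag(np.diag(flow))
    expected_cost += off_diag.sum(axis=0) @ cost_into
    p = p @ P  # propagate the state distribution one step

print(expected_cost)
```

Is accumulating the cost over the transition probabilities like this the right way to get the expected cost, or do I need to condition on the transitions actually happening?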
My confusion is this: if I assume the system definitely makes a transition from one state to another, do I then need to update the probability of being in each of the states? And how do I calculate the probability of subsequent failure after that?
