Determining a probability in a Markov chain question


The weather on any given day in a particular city can be sunny, cloudy, or rainy. It has been observed to be predictable largely on the basis of the weather on the previous day.

Specifically: if it is sunny on one day, it will be sunny the next day $5/6$ of the time, and never cloudy the next day. If it is cloudy on one day, it will be sunny the next day $2/3$ of the time, and never cloudy the next day. If it is rainy on one day, it will be sunny the next day $1/3$ of the time, and cloudy the next day $1/2$ of the time. Using 'sunny', 'cloudy', and 'rainy' (in that order) as the states of the system, set up the transition matrix for a Markov chain describing this system.

Use your matrix to determine the probability that it will rain on Thursday if it is sunny on Sunday.

I have found the matrix; I'm just not sure how to find the probability.

$$P = \begin{pmatrix} 5/6 & 2/3 & 1/3 \\ 0 & 0 & 1/2 \\ 1/6 & 1/3 & 1/6 \end{pmatrix}$$
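This matrix is column-stochastic: column $j$ gives the probabilities for tomorrow's weather given that today is in state $j$, so each column sums to $1$ and one day's evolution is $v_{n+1} = Pv_n$. Since Sunday to Thursday is four transitions, the answer is the rainy component of $P^4 v_0$ with $v_0 = (1, 0, 0)^T$ (sunny on Sunday). A minimal sketch of that computation in exact arithmetic (the state ordering sunny/cloudy/rainy is taken from the problem statement):

```python
from fractions import Fraction as F

# Column-stochastic transition matrix: entry (i, j) is
# P(tomorrow = state i | today = state j), states ordered sunny, cloudy, rainy.
P = [
    [F(5, 6), F(2, 3), F(1, 3)],
    [F(0),    F(0),    F(1, 2)],
    [F(1, 6), F(1, 3), F(1, 6)],
]

def step(P, v):
    """Advance one day: v_next = P v."""
    return [sum(P[i][j] * v[j] for j in range(3)) for i in range(3)]

v = [F(1), F(0), F(0)]   # sunny on Sunday
for _ in range(4):       # Sunday -> Monday -> Tuesday -> Wednesday -> Thursday
    v = step(P, v)

print(v[2])              # probability of rain on Thursday: 13/72
```

Multiplying out by hand gives the same value, $13/72 \approx 0.181$; using `Fraction` just avoids floating-point rounding along the way.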