This video discusses Bayes' theorem and Hidden Markov Models.
If a random day (day_0) is sunny, the next day (day_1) is sunny with probability 80% and rainy with probability 20%.
If a random day (day_0) is rainy, the next day (day_1) is sunny with probability 40% and rainy with probability 60%.
Given the above, the lecturer concludes that, without any other information, a random day is rainy with probability 1/3 and sunny with probability 2/3.
Is this reasonable? If so, how does it come about?
I did a little calculation for this.
$S_1 = 0.8 S_0 + 0.4 R_0$
$R_1 = 0.2 S_0 + 0.6 R_0$
$R_0 + S_0 = 1$
$R_1 + S_1 = 1$
where $R_0$ is the probability that a random day (day_0) is rainy and $S_0$ the probability that it is sunny; $R_1$ and $S_1$ are the corresponding probabilities for the following day (day_1).
How can I arrive at the probabilities 1/3 and 2/3?
So, substituting $R_0 = 1 - S_0$, you have that $$S_1=0.8 S_0+0.4(1-S_0)\\=0.4S_0+0.4$$
So what about the next day? $$S_2=0.4S_1+0.4=0.16S_0+0.56\\S_3=0.4S_2+0.4=0.064S_0+0.624\\S_4=0.4S_3+0.4=0.0256S_0+0.6496$$
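This iteration is easy to check numerically. Here is a short Python sketch (the function name is my own) that applies $S_{n+1}=0.4S_n+0.4$ repeatedly, starting from a few different values of $S_0$:

```python
def iterate_sunny(s0, n):
    """Apply S_{k+1} = 0.4 * S_k + 0.4 a total of n times, starting from S_0 = s0."""
    s = s0
    for _ in range(n):
        s = 0.4 * s + 0.4
    return s

# Whatever S_0 we start from, the iterates approach the same value.
for s0 in (0.0, 0.5, 1.0):
    print([round(iterate_sunny(s0, n), 4) for n in range(5)])
```

For $S_0 = 1$, for instance, this reproduces $S_2 = 0.16\cdot 1 + 0.56 = 0.72$, matching the closed forms above.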
And so on, ad infinitum. As you can see, the sequence $[S_0, S_1, \ldots, S_n, \ldots]$ converges to a limit, with a now obvious value, which we may call $S_\infty$. $$S_\infty=0.4 S_\infty+0.4\\[2ex]0.6S_\infty=0.4\\[2ex]S_\infty=2/3$$
This is the equilibrium value for the probability that a day is sunny.
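The same equilibrium can be seen by iterating the full two-state transition matrix (this sketch is mine, not from the lecture; `P` is the sunny/rainy matrix from the question):

```python
# Transition matrix: rows are the current state, columns the next state.
P = [[0.8, 0.2],   # from sunny: P(next sunny), P(next rainy)
     [0.4, 0.6]]   # from rainy: P(next sunny), P(next rainy)

pi = [1.0, 0.0]  # start with a surely-sunny day; any starting distribution works
for _ in range(100):
    pi = [pi[0] * P[0][0] + pi[1] * P[1][0],   # P(sunny) tomorrow
          pi[0] * P[0][1] + pi[1] * P[1][1]]   # P(rainy) tomorrow

print(pi)  # approaches [2/3, 1/3]
```

Because the second eigenvalue of $P$ is $0.4$, the distance to the equilibrium shrinks by a factor of $0.4$ each day, so convergence is very fast.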