A group of $3$ consecutive states is inspected. The first state is known to be Bad; find the expected number of these $3$ states that will be Good.
We need to find $E(G)$ knowing that: $$\mbox{S}_{1}=\left[ \begin{array}{c} 0 \\ 1 \end{array} \right]$$ $$Pr\left( \mbox{B|G} \right)=0.25$$ $$T=\left[ \begin{array}{cc} 0.75 & 0.55 \\ 0.25 & 0.45 \end{array} \right]$$
Attempt:
$$\mbox{S}_{1}+\mbox{S}_{2}+\mbox{S}_{3}=T^{0}\left[ \begin{array}{c} 0 \\ 1 \end{array} \right]+T^{1}\left[ \begin{array}{c} 0 \\ 1 \end{array} \right]+T^{2}\left[ \begin{array}{c} 0 \\ 1 \end{array} \right]=\left[ \begin{array}{c} 1.21 \\ 1.79 \end{array} \right]$$
Now here, I thought, shouldn't this give us $\left[ \begin{array}{c} Pr\left( G \right) \\ Pr\left( B \right) \end{array} \right]$? The values are greater than $1$, however, and I realised that the answer was in fact $E(G)=1.21$.
Why isn't $\mbox{S}_{1}+\mbox{S}_{2}+\mbox{S}_{3}=\left[ \begin{array}{c} Pr\left( G \right) \\ Pr\left( B \right) \end{array} \right]$ and why is it $=\left[ \begin{array}{c} \mbox{E}\left( G \right) \\ \mbox{E}\left( B \right) \end{array} \right]$?
(Also is there another approach without using matrices?)
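For reference, here is a quick numerical check of the matrix computation above. This is only a minimal sketch assuming numpy is available, using the $T$ and starting vector given in the question:

```python
import numpy as np

# Transition matrix as given in the question: columns index the current state,
# rows index the next state, in the order (G, B).
T = np.array([[0.75, 0.55],
              [0.25, 0.45]])

# Starting distribution S_1: the first state is known to be Bad.
s1 = np.array([0.0, 1.0])

# S_1 + S_2 + S_3 = T^0 s1 + T^1 s1 + T^2 s1
total = s1 + T @ s1 + T @ (T @ s1)
print(total)  # [1.21 1.79]
```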
The conditional expectation of the number of Good states in $3$ steps, given that the first state is Bad, is indeed $1.21$, as I will show below.
This is to answer the parenthetic question: "[...] is there another approach without using matrices?"
The states of this Markov process are $\{G,B\}$. Let the random variables $\sigma_i$ be defined as follows
$$\sigma_i=\begin{cases} 1,&\text{ if the process is in state G at the } i^{\text{th}} \text{ step}\\ 0,&\text{ if the process is in state B at the } i^{\text{th}} \text{ step.} \end{cases}$$
We know that $\sigma_1=0$ and our task is to calculate the following conditional expectation
$$E[\sigma_1+\sigma_2+\sigma_3\mid \sigma_1=0]=E[\sigma_2+\sigma_3\mid \sigma_1=0].$$
The state transition matrix of this Markov process (the transpose of the OP's $T$, with rows indexed by the current state) is $$T^T= \begin{bmatrix} p_{G,G} & p_{G,B} \\ p_{B,G} & p_{B,B} \end{bmatrix} =\begin{bmatrix} 0.75 & 0.25 \\ 0.55 & 0.45 \end{bmatrix} $$
where $p_{X,Y}$ are the state transition probabilities. For example,
$$p_{G,B}=P(\sigma_i=0\mid \sigma_{i-1}=1),$$
that is, the probability that the process jumps from state Good to state Bad.
Starting in state $B$ we have four possibilities for the next two steps: $BB$, $BG$, $GB$, and $GG$; that is, the possible values of $\sigma_2+\sigma_3$ are $0$, $1$, and $2$.
So, the conditional expectation we seek is
$$E[\sigma_2+\sigma_3\mid \sigma_1=0]=1\cdot P(\sigma_2+\sigma_3=1\mid \sigma_1=0)+2\cdot P(\sigma_2+\sigma_3=2\mid \sigma_1=0).$$
Now,
$$P(\sigma_2+\sigma_3=1\mid \sigma_1=0)=P(\sigma_2=0\cap \sigma_3=1\mid \sigma_1=0)+P(\sigma_2=1\cap \sigma_3=0\mid \sigma_1=0)=$$ $$\frac{P(\sigma_2=0\cap \sigma_3=1\cap \sigma_1=0)}{P(\sigma_1=0)}+\frac{P(\sigma_2=1\cap \sigma_3=0\cap \sigma_1=0)}{P(\sigma_1=0)}=$$ $$=P(\sigma_3=1\mid \sigma_2=0)P(\sigma_2=0\mid \sigma_1=0)+P(\sigma_3=0\mid \sigma_2=1)P(\sigma_2=1\mid \sigma_1=0)=$$
$$=p_{B,G}p_{B,B}+p_{G,B}p_{B,G}=0.55\cdot 0.45+0.25\cdot 0.55=0.385.$$
Then $$P(\sigma_2+\sigma_3=2\mid \sigma_1=0)=P(\sigma_2=1\cap\sigma_3=1\mid\sigma_1=0)=p_{B,G}p_{G,G}=0.55\cdot0.75=0.4125.$$
So,
$$E[\sigma_2+\sigma_3\mid \sigma_1=0]=$$ $$=1\cdot(p_{B,G}p_{B,B}+p_{G,B}p_{B,G})+2\cdot p_{B,G}p_{G,G}=1\cdot0.385+2\cdot0.4125=1.21. \tag 1$$
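If it helps, the same number can be obtained by brute-force enumeration of the four paths, with no matrices at all. A minimal sketch (the dictionary `p` of transition probabilities is my own naming, not part of the original problem):

```python
from itertools import product

# Transition probabilities p[(X, Y)] = P(next state is Y | current state is X),
# matching the p_{X,Y} defined above.
p = {('G', 'G'): 0.75, ('G', 'B'): 0.25,
     ('B', 'G'): 0.55, ('B', 'B'): 0.45}

expected_good = 0.0
# The first state is Bad; enumerate the four continuations BB, BG, GB, GG.
for path in product('GB', repeat=2):
    states = ('B',) + path
    prob = p[(states[0], states[1])] * p[(states[1], states[2])]
    expected_good += prob * states.count('G')

print(expected_good)  # 1.21 (up to floating-point rounding)
```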
Considering the "matrix solution" given by the OP:
$$T= \begin{bmatrix} p_{G,G} & p_{B,G} \\ p_{G,B} & p_{B,B} \end{bmatrix} $$
and
$$T^2=\begin{bmatrix} p_{G,G}^2+p_{B,G}p_{G,B} & p_{G,G}p_{B,G}+p_{B,B}p_{B,G} \\ p_{G,B}p_{G,G}+p_{B,B}p_{G,B} & p_{G,B}p_{B,G}+p_{B,B}^2 \end{bmatrix}.$$
With this
$$\mbox{S}_{1}+\mbox{S}_{2}+\mbox{S}_{3}=T^{0}\left[ \begin{array}{c} 0 \\ 1 \end{array} \right]+T^{1}\left[ \begin{array}{c} 0 \\ 1 \end{array} \right]+T^{2}\left[ \begin{array}{c} 0 \\ 1 \end{array} \right]=$$ $$=\begin{bmatrix}0+p_{B,G}+p_{G,G}p_{B,G}+p_{B,G}p_{B,B}\\ 1+p_{B,B}+p_{G,B}p_{B,G}+p_{B,B}^2\end{bmatrix}.$$
But what is $0+p_{B,G}+p_{G,G}p_{B,G}+p_{B,G}p_{B,B}$? I claim that this is $$E[\sigma_1+\sigma_2+\sigma_3\mid \sigma_1=0]=E[\sigma_2+\sigma_3\mid \sigma_1=0].$$
Since $p_{G,B}+p_{G,G}=1$ we can say that
$$p_{B,G}+p_{G,G}p_{B,G}+p_{B,G}p_{B,B}=$$ $$=p_{B,G}(p_{G,B}+p_{G,G})+p_{G,G}p_{B,G}+p_{B,G}p_{B,B}=$$ $$=p_{B,G}p_{G,B}+p_{B,G}p_{G,G}+p_{G,G}p_{B,G}+p_{B,G}p_{B,B}=$$ $$=p_{B,G}p_{G,B}+2p_{B,G}p_{G,G}+p_{B,G}p_{B,B}$$ just like in $(1)$.
The other component, giving the expected number of Bad states, can be checked in the same way.
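For completeness, the algebraic identity above can also be checked symbolically. A minimal sketch assuming sympy; the symbol names are just placeholders for the $p_{X,Y}$:

```python
import sympy as sp

pGG, pGB, pBG, pBB = sp.symbols('p_GG p_GB p_BG p_BB')

# First component of S_1 + S_2 + S_3 from the matrix solution ...
matrix_form = pBG + pGG*pBG + pBG*pBB
# ... and the expectation from (1), computed without matrices.
expectation_form = (pBG*pBB + pGB*pBG) + 2*pBG*pGG

# The two expressions agree once p_GB + p_GG = 1 is used.
diff = sp.simplify(matrix_form.subs(pGB, 1 - pGG)
                   - expectation_form.subs(pGB, 1 - pGG))
print(diff)  # 0
```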