Let $$A=(a_{ij})_{3 \times 3}=\begin{pmatrix} 0.5 & 0.5 & 0 \\ 0.5 & 0 & 0.5 \\ 0 & 0.5 & 0.5 \end{pmatrix}$$
where $a_{ij}=\Pr\{X_{t+1}=j \mid X_t=i\}$ and $X_t$ is the state of the Markov chain at time $t$.
$a)$ Show that the Markov chain is ergodic.
$b)$ Compute the stationary probabilities $p_1,p_2,p_3$ for the states $\{1,2,3\}$.
$c)$ If the process starts in state $1$, derive a formula for the expectation of the time $T_3$ at which state $3$ appears for the first time.
I have read the theoretical material multiple times from different sources, but I cannot grasp how to approach this exercise. I would really appreciate an explanation of any of the $3$ parts above, as well as hints on how to proceed with them.
I'll try to get you started with (a) and (b):
(a) A chain is ergodic if some power of its transition matrix has all positive elements. For your chain, the second power has all positive elements.
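You can confirm this claim numerically. Here is a minimal sketch (my own illustration, not part of the original answer) that squares the transition matrix with plain lists and checks that every entry of $A^2$ is positive:

```python
# Transition matrix from the question, as a list of rows.
A = [[0.5, 0.5, 0.0],
     [0.5, 0.0, 0.5],
     [0.0, 0.5, 0.5]]

def matmul(X, Y):
    """Multiply two square matrices given as lists of rows."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A2 = matmul(A, A)          # second power of the transition matrix
print(A2)
# Every entry of A^2 is strictly positive, so the chain is ergodic.
print(all(entry > 0 for row in A2 for entry in row))  # True
```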
(b) You have not shown any work, so I have no context to guess what you know and what method you are expected to use. Three possible approaches:
(i) Notice that the transition matrix is doubly-stochastic (columns as well as rows sum to unity). Thus because the chain is irreducible and ergodic, the stationary distribution is uniform on the three states.
(ii) If a vector $\sigma$ has $\sigma A = \sigma$ then $\sigma$ is a stationary distribution of the chain (and hence also the limiting distribution). Here $\sigma = (1/3, 1/3, 1/3).$ This can be found by solving three equations in three unknowns.
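As a quick sanity check (again my own sketch, not from the answer), you can verify directly that the uniform vector satisfies $\sigma A = \sigma$:

```python
# Transition matrix from the question.
A = [[0.5, 0.5, 0.0],
     [0.5, 0.0, 0.5],
     [0.0, 0.5, 0.5]]
sigma = [1/3, 1/3, 1/3]

# Left-multiply: (sigma A)_j = sum_i sigma_i * a_ij
sigmaA = [sum(sigma[i] * A[i][j] for i in range(3)) for j in range(3)]
print(sigmaA)  # each entry equals 1/3, so sigma A = sigma
```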
(iii) In general, $\sigma$ is proportional to the left eigenvector of $A$ associated with the eigenvalue $1$ (the eigenvalue of largest modulus). So it can be obtained in R (whose `eigen` returns right eigenvectors, hence the transpose `t(A)`) along these lines:

    v <- eigen(t(A))$vectors[, 1]   # eigenvector for the eigenvalue 1
    sigma <- Re(v) / sum(Re(v))     # normalize so the entries sum to 1