We had the ergodic theorem for Markov chains, stating the following: for a state space $S \subset \mathbb{N}$, an irreducible positive recurrent Markov chain $(X_n)$ with $X_n:\Omega \rightarrow S$ and stationary distribution $\pi$, and every function $f \in L^1(\pi)$ (meaning that $\sum_{s \in S} |f(s)|\pi(s) < \infty$), we have $$\frac{1}{n+1} \sum_{i=0}^{n}f(X_i) \rightarrow E_{\pi}(f)$$ almost surely, for any starting distribution whatsoever.
Since my first version of this question drew a lot of protest, let me state my problem more precisely. After we proved this theorem, our lecturer continued with a remark that I did not understand. First, he told us that for any indicator function $\chi_A$ with $A \subset S$, we have $\frac{1}{n+1} \sum_{i=0}^n \chi_A(X_i) \rightarrow \pi(A)$ almost surely.
From this we concluded that for the starting distribution $P_{X_0} := \delta_x$, for any $x \in S$, we have $P_{X_i} \rightarrow \pi$ in distribution. Here $P_{X_i}$ denotes the pushforward measure, so that $P_{X_i}(s) = P(X_i=s)$ for $s \in S$.
I did not understand this step, but maybe somebody here has an idea: how could this follow, what might our lecturer have missed, or what could be meant here?
Answer: It does not follow. I don't know what your lecturer meant or said, but you cannot draw this conclusion, as Michael's example shows.
Let $(X_n)$ be a Markov chain with state space $\{0,1\}$ and transition matrix $P=\pmatrix{0&1\cr 1&0}$. The unique invariant distribution is $\pi=(1/2,1/2)$, but $\mathbb{P}_0(X_n=0)=\frac{1+(-1)^n}{2}$ alternates between $1$ and $0$, so it does not converge to $\pi(0)=1/2$.
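A quick numerical sketch of this example (just an illustration, not part of the original argument): iterating $\delta_0 P^n$ shows that $\mathbb{P}_0(X_n=0)$ oscillates, while the time averages from the ergodic theorem still converge to $\pi(0)=1/2$.

```python
import numpy as np

# The two-state chain from the example: the transition matrix flips the state.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# Distribution of X_n when started at state 0 is delta_0 @ P^n.
mu = np.array([1.0, 0.0])
probs = []
for n in range(6):
    probs.append(mu[0])  # P_0(X_n = 0)
    mu = mu @ P

print(probs)  # oscillates: [1.0, 0.0, 1.0, 0.0, 1.0, 0.0]

# The Cesaro (time) averages of these probabilities do converge to pi(0) = 1/2,
# consistent with the ergodic theorem.
avg = np.cumsum(probs) / np.arange(1, 7)
print(avg[-1])  # 0.5
```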
What is missing is the additional assumption that the Markov chain is aperiodic. In that case, what your lecturer said is true.
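To see the role of aperiodicity concretely, here is a minimal sketch (my own illustration; the self-loop probability $0.1$ is an arbitrary choice): adding a small holding probability makes the two-state chain aperiodic, and then the marginal distributions $P_{X_n}$ really do converge to $\pi = (1/2, 1/2)$.

```python
import numpy as np

# Aperiodic variant of the two-state chain: self-loop probability 0.1.
# It is still irreducible with stationary distribution (1/2, 1/2).
P = np.array([[0.1, 0.9],
              [0.9, 0.1]])

mu = np.array([1.0, 0.0])  # start in state 0, i.e. P_{X_0} = delta_0
for n in range(50):
    mu = mu @ P

# In fact P_0(X_n = 0) = 1/2 + (-0.8)^n / 2, which converges geometrically.
print(mu[0])  # close to 0.5
```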