I have learned that, for an i.i.d. random process $\{X_k\}$, $k=1, 2, \ldots$, $$ \frac{1}{n}\sum_{k=1}^n{X_k}\longrightarrow EX_1 $$ in probability.
For a stationary random process $\{X_k\}$, $k=1, 2, \ldots$, $$ \frac{1}{n}\sum_{k=1}^n{X_k}\longrightarrow EX_1 $$ with probability $1$.
However, someone told me that for an i.i.d. random process, the sample mean converges to the expected value with probability $1$ as $n$ grows.
I am confused about the relationship between the WLLN and the SLLN. Is there any problem with regarding an i.i.d. random process as also being stationary?
A strict-sense (strong-sense) stationary process $\{X_t\}$ is one whose joint distributions are invariant under time shifts: for any set of times $t_1,\ldots,t_k$ and any shift $\tau$, $$F_X(t_1,\ldots,t_k) = F_X(t_1+\tau,\ldots,t_k+\tau).$$
An i.i.d. process always satisfies this, since its joint distribution at any set of $k$ times is the same product of identical marginals.
If you have an i.i.d. sequence of finite-mean random variables, then the sample average converges to the mean both with probability one (the strong law of large numbers) and in probability (the weak law of large numbers); the strong law implies the weak law, so there is no contradiction.
Not all i.i.d. processes have finite means, though (e.g. a sequence of i.i.d. Cauchy RVs), so the LLNs don't hold for them. And not every stationary process satisfies the conditions for a law of large numbers (in whatever sense: mean square, in probability, almost surely, etc.); for stationary processes, ergodicity is the additional hypothesis usually needed (Birkhoff's ergodic theorem).
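A quick numerical sketch of the contrast (using NumPy, with a seed I picked arbitrarily for reproducibility): the running average of i.i.d. Normal samples settles near the mean, while the running average of i.i.d. Cauchy samples never settles, because the Cauchy distribution has no mean.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Finite-mean case: i.i.d. Normal(0, 1). The running average settles near 0.
normal = rng.standard_normal(n)
normal_avg = np.cumsum(normal) / np.arange(1, n + 1)

# Infinite-mean case: i.i.d. standard Cauchy. The running average keeps
# jumping; in fact the average of n Cauchy samples is itself standard Cauchy.
cauchy = rng.standard_cauchy(n)
cauchy_avg = np.cumsum(cauchy) / np.arange(1, n + 1)

for k in (100, 10_000, 100_000):
    print(f"n={k:>7}:  normal avg = {normal_avg[k-1]: .4f}   "
          f"cauchy avg = {cauchy_avg[k-1]: .4f}")
```

Running this for a few seeds makes the point vividly: the Normal column stabilizes, while the Cauchy column wanders no matter how large $n$ gets.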