Let $\{X_i\}_{i \in \mathbb{N}}$ be a sequence of i.i.d. random variables taking values in the set $\{-1,1\}$ with $P(X_i = 1) = \frac{1}{2}$. Let $S_n = \sum\limits_{i=1}^n X_i$ (with $S_0 = 0$), and for $k \in \mathbb{N}$ define the random process $B^k$ by $$ B^k(t) = \frac{S_{[kt]}}{\sqrt{k}} \qquad t \geq 0 $$
where $[x]$ is the integer part of $x$.
I know that $(B^k(t))_{t \geq 0}$ converges in distribution to standard Brownian motion. Does it also converge pointwise? That is, is there a random process $\{B(t)\}_{t \geq 0}$ such that $$ \forall t \geq 0 \quad \lim_{k \to \infty} B^k(t) = B(t) \quad \text{a.s.} $$ and, moreover, such that $B$ is a standard Brownian motion?
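For intuition, here is a quick simulation sketch (Python with NumPy; `scaled_walk` is just a name I made up) of $B^k(1) = S_k/\sqrt{k}$ along a single sample path: the values keep fluctuating as $k$ grows instead of settling down.

```python
import numpy as np

rng = np.random.default_rng(0)

def scaled_walk(n_steps, rng):
    """Return the array S_k / sqrt(k) for k = 1..n_steps along one sample path."""
    steps = rng.choice([-1, 1], size=n_steps)   # i.i.d. uniform on {-1, 1}
    s = np.cumsum(steps)                        # partial sums S_1, ..., S_n
    k = np.arange(1, n_steps + 1)
    return s / np.sqrt(k)

# One path, inspected at geometrically spaced k = 10^2, ..., 10^6
path = scaled_walk(10**6, rng)
checkpoints = [10**i - 1 for i in range(2, 7)]  # 0-based indices for k = 100, ..., 10^6
print([round(path[i], 3) for i in checkpoints])
```

(The checkpoint values on a single run behave like nearly independent $N(0,1)$ draws, which is exactly the non-convergence the answers below explain.)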
No, it does not. By the Kolmogorov 0-1 law, if the a.s. (or even in-probability) limit exists, it must be constant, which is absurd.
To see this, let's take $t=1$ for simplicity. Suppose that $S_k / \sqrt{k} \to Y$ a.s. Fix $j$ and write, for $k \ge j$,
$$\frac{S_k}{\sqrt{k}} = \frac{X_1 + \dots + X_{j-1}}{\sqrt{k}} + \frac{X_{j} + \dots + X_k}{\sqrt{k}}$$
As $k \to \infty$, the first term goes to zero a.s. (since the numerator does not depend on $k$). So we have
$$Y = \lim_{k \to \infty} \frac{X_{j} + \dots + X_k}{\sqrt{k}}$$
which shows that $Y \in \sigma(X_j, X_{j+1}, \dots)$. But $j$ was arbitrary, so $Y$ is measurable with respect to the tail $\sigma$-field. Kolmogorov's 0-1 law says the tail $\sigma$-field is trivial (every tail event has probability $0$ or $1$), so $Y$ is a.s. equal to a constant. On the other hand, the central limit theorem gives $S_k/\sqrt{k} \Rightarrow N(0,1)$, so $Y$ would have to be standard normal, and a constant cannot be: contradiction.
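The key step above, that the head contribution $(X_1 + \dots + X_{j-1})/\sqrt{k}$ vanishes as $k \to \infty$, is easy to check numerically. A small sketch (Python with NumPy; the variable names are my own):

```python
import numpy as np

rng = np.random.default_rng(1)
steps = rng.choice([-1, 1], size=10**6)  # i.i.d. uniform on {-1, 1}
s = np.cumsum(steps)                     # partial sums S_1, ..., S_n

j = 50  # drop the first j-1 = 49 steps
for k in [10**3, 10**4, 10**5, 10**6]:
    # (X_1 + ... + X_{j-1}) / sqrt(k) = S_{j-1} / sqrt(k), bounded by (j-1)/sqrt(k)
    head = s[j - 2] / np.sqrt(k)
    print(k, abs(head))
```

Since $|X_1 + \dots + X_{j-1}| \le j-1$ is fixed while the denominator grows, the printed magnitudes decrease like $1/\sqrt{k}$.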
Alternatively, the law of the iterated logarithm for the simple random walk shows that, almost surely, $\limsup_k S_k / \sqrt{k} = +\infty$ and $\liminf_k S_k / \sqrt{k} = -\infty$, so $S_k/\sqrt{k}$ cannot converge along any single sample path.
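To see this divergence of the running extremes on a single path, here is an illustrative sketch (Python with NumPy; all names are my own) that tracks $\max_{m \le k} S_m/\sqrt{m}$ and $\min_{m \le k} S_m/\sqrt{m}$:

```python
import numpy as np

rng = np.random.default_rng(2)
steps = rng.choice([-1, 1], size=10**6)                 # i.i.d. uniform on {-1, 1}
scaled = np.cumsum(steps) / np.sqrt(np.arange(1, 10**6 + 1))  # S_k / sqrt(k)

# Running maximum and minimum of S_k / sqrt(k) up to each k
run_max = np.maximum.accumulate(scaled)
run_min = np.minimum.accumulate(scaled)
for k in [10**2, 10**4, 10**6]:
    print(k, round(run_min[k - 1], 3), round(run_max[k - 1], 3))
```

On a finite run the extremes grow only slowly (the iterated-logarithm rate is $\sqrt{2\log\log k}$ for $S_k/\sqrt{2k\log\log k} \to 1$), but they do keep spreading, consistent with $\limsup = +\infty$ and $\liminf = -\infty$.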