Assume a random walk, start at 0, step size Gaussian N(1,σ²).
After many steps (e.g. n = 1,000,000), the total translation distance is distributed as N(n, nσ²).
Here is my confusion: N(n, nσ²) means the probability of getting exactly n as the translation distance is getting lower and lower (since the variance nσ² keeps growing); but according to the law of large numbers, the chance of getting a translation distance of n should be getting bigger and bigger.
I must be missing something obvious but I am kind of stuck...
Many thanks in advance!
The law of large numbers states that the sample average of a sequence of independent and identically distributed random variables converges* to the expected value of the distribution.
There are two main points you're missing. Firstly, you're considering a sum of n variables. To apply the law of large numbers, you need to divide by n and get a sample average $S_n = \frac{1}{n}\sum_{i=1}^{n} X_i$, where $X_i \sim \mathcal{N}(1, \sigma^2)~\text{i.i.d.}$ So you'll get $S_n \sim \mathcal{N}(1, \frac{\sigma^2}{n})$, whose variance approaches zero as $n \to \infty$. Secondly, the law of large numbers is about the expectation. If you consider the probability of getting exactly a given translation distance $r$ in your random walk, it will always be zero, since the distribution is continuous. This also holds for the sample average for any finite $n$.
* There are different modes of "convergence" depending on the specific form of the law of large numbers (e.g. weak law vs. strong law), but that's beside the point of this answer.
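To see both effects concretely, here is a quick NumPy simulation (a sketch: the step distribution comes from the question, while the value of σ and the number of trials are arbitrary choices). The standard deviation of the sum grows like σ√n, while the standard deviation of the sample average shrinks like σ/√n:

```python
import numpy as np

# Simulate many random walks whose steps are drawn from N(1, sigma^2)
# and compare the spread of the sum versus the sample average.
rng = np.random.default_rng(0)
sigma = 2.0       # arbitrary step-size standard deviation
trials = 1_000    # number of independent walks per value of n

for n in (100, 1_000, 10_000):
    steps = rng.normal(loc=1.0, scale=sigma, size=(trials, n))
    sums = steps.sum(axis=1)   # ~ N(n, n * sigma^2): spread grows with n
    means = sums / n           # ~ N(1, sigma^2 / n): spread shrinks with n
    print(f"n={n:>6}  std(sum)={sums.std():8.2f}  std(mean)={means.std():.4f}")
```

So the raw translation distance spreads out ever more widely around n, even as the average step size converges to 1 — both statements are true, they just describe different quantities.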