Law of large numbers and "Random Walk" example


On Wikipedia, the following is written about the law of large numbers: "It is a theorem that describes the result of performing the same experiment a large number of times. According to the law, the average of the results obtained from a large number of trials should be close to the expected value and tends to become closer to the expected value as more trials are performed."

As an example of this theorem we were given that of "Random Walk" where:

$S_i$ for $i=1,...,N$ are N continuous random variables, independent of each other, and in our example they represent N steps.

Then we are given : $x=\sum_{i=1}^NS_i$.

At this point I have two questions:

  1. If I try to understand this example with the definition above in mind, is it correct to say the following: one experiment (the quantity we measure in this experiment) is represented here by $S_i$ in an abstract way. A person can perform a random walk several times (each time taking N steps), and in each of those random walks an arbitrary $i$th step will be characterized by a different value of the length covered (assuming that the length covered is one of the physical quantities that can be measured for an arbitrary step; other quantities might be the energy spent on the step, the speed, etc.). This holds for all N steps. Would this be a correct interpretation of the first part of the definition above for this example?
  2. What is $x$, and what would be the correct way to describe it? Would it be correct to say that $x$ is a continuous random variable which represents the total length covered after N steps?

After calculations we also get:

$\bar x=\sum_{i=1}^N \bar S_i$. How would one describe in words what this is, and how is it derived?

Also $\sigma^2_x=\sum_{i=1}^N \sigma_{\bar S_i}^2$, where $\sigma^2$ denotes the variance of a random variable. The same questions for this: how would one describe in words what this is, and how is it derived?

And finally, the above definition says:"...According to the law, the average of the results obtained from a large number of trials should be close to the expected value and tends to become closer to the expected value as more trials are performed."

My question is: where exactly in this example do we see this happening? At no point in our lecture was the argument made that we measure the total length so many times that the average becomes close to the expected value. Moreover, the average is the expected value, or at least its formula is that of the expected value.

Unless one can argue that in this example the expected value being measured, or the one we are interested in, is that of the total length covered, found via $\bar x=\sum_{i=1}^N \bar S_i$, where each of the N averages of the N independent continuous variables is, because of multiple measurements, equal to the expected value of the corresponding variable.


For question 1, I think your understanding is correct, but you may not have read the definition of a random variable. It is close to what you describe.

A random variable is a mathematical formalization of a quantity or object which depends on random events. It is a mapping or a function from possible outcomes in a sample space to a measurable space, often the real numbers.

For question 2, the answer is yes, as long as you regard $S_i$ as the length of the $i$th step and not some other kind of quantity (like energy).

In the case of a one-dimensional random walk, a length with a direction (i.e. a signed length, or a vector) is the proper interpretation, since one can go forward or backward.

Because I know nothing about your lecture, I cannot fully follow the latter part of your question. I will just post something I know that seems similar to what you describe.

The sample mean $\bar{x}$ (or $\bar{S}$), which is itself a random variable, is usually defined as

$$\bar{x} = \frac{1}{N} \sum_{i=1}^{N} S_i$$

because $\{S_i\}$ are called samples.

I am not sure about what your notation $\bar{S}_i$ means.

The variance of the sample mean may be written as

$$\sigma_{\bar{x}}^2 = \mathrm{var} \left[\frac{1}{N} \sum_{i=1}^{N} S_i\right]$$

Usually, by the time someone learns about the law of large numbers, random-variable distributions (expectation, mean, variance) have already been introduced, so I assume you know something about them.

Assume that the $\{S_i\}$ are independent and follow the same distribution with mean (or expectation) $\mu$ and variance $\sigma^2 < \infty$. The simplest version of the law of large numbers says

$$\bar{x} \to \mu \quad (\text{in probability, as } N \to \infty)$$

and, for every fixed $N$,

$$\sigma_{\bar{x}}^2 = \frac{\sigma^2}{N},$$

which tends to $0$ as $N \to \infty$.

These can be derived from the linearity of expectation and the definition of variance. You may as well try it yourself.
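As a sketch of how the variance identity falls out (the middle step uses independence, which removes the cross terms in the variance of the sum):

$$\sigma_{\bar{x}}^2 = \mathrm{var}\left[\frac{1}{N}\sum_{i=1}^{N} S_i\right] = \frac{1}{N^2}\sum_{i=1}^{N}\mathrm{var}\left[S_i\right] = \frac{N\sigma^2}{N^2} = \frac{\sigma^2}{N}.$$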

If you wonder what you would see in this example, just flip a coin 20 times: on heads, move one step forward; on tails, move one step backward. The final point (or the vector from the start point) is $x$, and $x/20$ is $\bar{x}$. As you increase the number of flips $N$, $\bar{x} = x/N$ becomes more and more likely to be close to $0$ (assuming the coin is fair), even though $x$ itself still wanders away from the start point.
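A quick sketch of this experiment in Python (assuming a fair coin; `random_walk_mean` is just an illustrative helper name):

```python
import random

random.seed(1)

def random_walk_mean(n_flips):
    """Flip a fair coin n_flips times: heads = +1 step, tails = -1 step.
    Return the sample mean x_bar = (final position) / n_flips."""
    x = sum(random.choice([-1, 1]) for _ in range(n_flips))
    return x / n_flips

# As the number of flips grows, x_bar concentrates around mu = 0.
for n in (20, 2000, 200000):
    print(n, random_walk_mean(n))
```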

The sample variance of $\{S_i\}$,

$$\frac{1}{N - 1} \sum_{i=1}^N(S_i - \bar{x})^2$$

would be close to $\sigma^2 = 1$.
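A sketch of that check, again with fair-coin $\pm 1$ steps (so each $S_i$ has variance exactly $1$):

```python
import random

random.seed(2)

N = 10000
steps = [random.choice([-1, 1]) for _ in range(N)]  # the samples S_i

x_bar = sum(steps) / N
# Sample variance with the usual 1/(N-1) normalization
sample_var = sum((s - x_bar) ** 2 for s in steps) / (N - 1)
print(sample_var)  # close to sigma^2 = 1
```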

If you calculate $\bar{x}$ many times (denote the number of repetitions by $M$),

$$\frac{1}{M} \sum_{i=1}^{M} \bar{x}_i \approx 0$$

$$\frac{1}{M - 1} \sum_{i=1}^M(\bar{x}_i - \frac{1}{M} \sum_{i=1}^{M} \bar{x}_i)^2 \approx \frac{1}{N}$$

Here $N$ is the number of samples used to calculate $\bar{x}$ each time.
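A minimal simulation of this repeated procedure, assuming fair-coin $\pm 1$ steps and hypothetical sizes $N = 50$, $M = 5000$:

```python
import random

random.seed(3)

N = 50     # steps per walk
M = 5000   # number of repeated walks

# One x_bar per walk: final position divided by N
x_bars = [sum(random.choice([-1, 1]) for _ in range(N)) / N for _ in range(M)]

mean_of_means = sum(x_bars) / M
var_of_means = sum((b - mean_of_means) ** 2 for b in x_bars) / (M - 1)

print(mean_of_means)  # close to mu = 0
print(var_of_means)   # close to sigma^2 / N = 1/50 = 0.02
```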