Weak law of large numbers - problems


I am currently studying the weak law of large numbers and I think I understand the concept behind it: if we take a sample that is big enough, the sample mean of the random variables $X_1,\dots,X_n$ converges to the population mean. That would be very convenient, because most of the time the population is too big to be observed directly.

My problem is that I cannot find a proper example or problem that uses this theorem. All I can find is its proof, which is not required at this point in my course because my professor has not yet introduced Chebyshev's inequality...

Could you post some small problems here that I could work on to grasp and practice the mechanics behind the concept?

Thank you very much


There are 2 best solutions below


Let's take a fair coin and set $H=1$ and $T=0$.

The random variable $X$ has the distribution

$$P(X=x)= \begin{cases} \frac{1}{2}, & \text{if $x=0$} \\ \frac{1}{2}, & \text{if $x=1$} \end{cases}$$

We have $\mathbb{E}[X]=\frac{1}{2}$ and $\mathbb{V}[X]=\frac{1}{4}$.

Now let's take the random variable $\overline{X}_n$, the sample mean. You surely know that

$\mathbb{E}[\overline{X}_n]=\frac{1}{2}$ and $\mathbb{V}[\overline{X}_n]=\frac{1}{4n}$

Thus the mean of the sample mean is $\frac{1}{2}$, and its variance goes to zero as $n \rightarrow +\infty$.

So you can see that the sample mean "converges" in probability to the population mean $\frac{1}{2}$, since its variance gets arbitrarily close to zero.


Formally, these conditions on the mean and variance (the mean equals the population mean and the variance goes to zero) are necessary and sufficient for convergence in $L^2$, which in turn implies convergence in probability.
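To see this convergence concretely, here is a quick Python simulation of the coin example (a sketch I have added to the answer; the seed is an arbitrary choice for reproducibility):

```python
import random

random.seed(0)  # arbitrary seed, only for reproducibility

def sample_mean(n):
    """Average of n fair coin flips, with H = 1 and T = 0."""
    return sum(random.randint(0, 1) for _ in range(n)) / n

# The sample mean should drift toward E[X] = 1/2 as n grows,
# since Var(X_bar_n) = 1/(4n) -> 0.
for n in (10, 100, 10_000, 1_000_000):
    print(n, sample_mean(n))
```

With a million flips the estimate is typically within a fraction of a percent of $\frac{1}{2}$, in line with the $\frac{1}{4n}$ variance.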


Really great that you are seeking to play around with the concepts! I have a couple of examples to show you here. You will need Chebyshev's inequality: if $\mathbb{E}(X) = \mu$ and $\operatorname{Var}(X) = \sigma^2$, then $P(|X-\mu|>\epsilon) \le \frac{\sigma^2}{\epsilon^2}$
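Before tackling the problems, the inequality itself can be sanity-checked numerically. Here is a small sketch using a fair six-sided die (my own toy example, not taken from the exercises below), comparing the exact tail probability with the Chebyshev bound:

```python
from fractions import Fraction

# Fair six-sided die: an assumed toy distribution, not from the exercises.
outcomes = range(1, 7)
mu = Fraction(sum(outcomes), 6)                           # 7/2
var = sum((Fraction(x) - mu) ** 2 for x in outcomes) / 6  # 35/12

eps = 2
# Exact P(|X - mu| > 2): only the outcomes 1 and 6 qualify.
exact = Fraction(sum(1 for x in outcomes if abs(x - mu) > eps), 6)
bound = var / eps ** 2  # Chebyshev: sigma^2 / eps^2

# The exact tail probability (1/3) is indeed below the bound (35/48).
assert exact <= bound
print(exact, bound)
```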

$\textbf{My fisherman father:}$
My father loves to fish (he really does, by the way!). Each day he goes to Monk lake; if it's sunny he fishes for 9 hours, and if it is raining he fishes for 6 hours instead. The probability that he catches a fish in any given hour, independently of the other hours, is $\frac{1}{3}$, and the probability that he doesn't is $\frac{2}{3}$. Let's say it rains half of the time and is sunny the other half. Let the variables $X_W, X_M, X_Y, X_D$ denote respectively the average number of fish he catches per day over a week, month, year and decade of fishing (i.e. $X_Y = \frac{1}{365} \cdot$ #Fish he caught that year). Can you tell me:

  1. The expected value of each of the $X_i$; call this $\mu_i$ for $i = W,M,Y,D$.
  2. The variance of each $X_i$; call this $\sigma^2_i$.
  3. An upper bound (using Markov's/Chebyshev's inequality) on $P( |X_i - \mu_i|>0.01)$, i.e. the probability that $X_i$ differs from $\mu_i$ by more than $0.01$.
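If you want to check your answers numerically, here is a Python sketch I have added (the closed-form values come from the laws of total expectation and total variance; they are my own computation, so treat them as something to verify by hand rather than gospel):

```python
import random

random.seed(1)  # arbitrary seed

P_CATCH = 1 / 3
HOURS = {"sunny": 9, "rainy": 6}

def fish_in_a_day():
    """Fish caught in one day: 9 or 6 Bernoulli(1/3) hours, weather 50/50."""
    weather = random.choice(["sunny", "rainy"])
    return sum(random.random() < P_CATCH for _ in range(HOURS[weather]))

def average_over(days):
    return sum(fish_in_a_day() for _ in range(days)) / days

# Law of total expectation: E[D] = 0.5*(9/3) + 0.5*(6/3) = 2.5 fish per day.
mu = 0.5 * 3 + 0.5 * 2
# Law of total variance: E[Var(D|W)] + Var(E[D|W])
#   = 0.5*2 + 0.5*(4/3) + 0.25 = 23/12.
var_day = (0.5 * 9 * P_CATCH * (1 - P_CATCH)
           + 0.5 * 6 * P_CATCH * (1 - P_CATCH)
           + 0.25)

# The average over n days has mean mu and variance var_day / n, so
# Chebyshev bounds P(|X_i - mu| > 0.01) by (var_day / n) / 0.01**2
# (capped at 1, since a probability can never exceed 1).
for label, days in [("week", 7), ("month", 30), ("year", 365), ("decade", 3650)]:
    print(label, var_day / days, min(1.0, (var_day / days) / 0.01 ** 2))
```

Note how loose Chebyshev is here: with $\epsilon = 0.01$ the bound is trivial (above 1) even for a decade of fishing.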

$\textbf{Terrible tennis players:}$
I am bad at tennis, very bad. Let us say that each time I play my girlfriend I only have a 20% chance of winning the match (believe me, this is a generous estimate). She agrees to the following bet: each time we play a match, if I win I get £10, and if I lose I have to pay her £3. Let us say we have played $1000$ games.

  1. What was the expected average amount of money it cost me to play each game, i.e. what was the expected average I lost per game?
  2. What is the variance of this random quantity?
  3. Again using Markov's/Chebyshev's inequality, what is an upper bound on the probability that I either made money or lost more than £400?


Now do these questions if we played 10000, 100000 and 1 million games.
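A sketch (mine, not part of the original solution) for checking the tennis numbers: the per-game winnings have a simple two-point distribution, and over 1000 games "made money or lost more than £400" corresponds, on my reading of the question, to the per-game average straying at least £0.4 from its mean of −£0.4:

```python
import random

random.seed(2)  # arbitrary seed

P_WIN, WIN, LOSS = 0.2, 10, -3  # pounds won/lost per game

# Per-game winnings: two-point distribution.
mu = P_WIN * WIN + (1 - P_WIN) * LOSS                       # -0.4: 40p lost per game on average
var = P_WIN * WIN ** 2 + (1 - P_WIN) * LOSS ** 2 - mu ** 2  # 27.04

def average_winnings(n):
    return sum(WIN if random.random() < P_WIN else LOSS for _ in range(n)) / n

# Over n games the average has variance var / n.  Keeping the deviation
# threshold at 0.4 pounds per game (my assumption for the n > 1000 cases),
# Chebyshev gives the bound (var / n) / 0.4**2.
for n in (1000, 10_000, 100_000, 1_000_000):
    print(n, min(1.0, (var / n) / 0.4 ** 2))
```

For $n = 1000$ the bound works out to $27.04/1000/0.16 = 0.169$, and it shrinks by a factor of 10 with each tenfold increase in the number of games.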

$\textbf{Farmer Groggstern and his wild cabbage}$
Farmer Groggstern grows cabbage, a lot of it. The weight of each cabbage he grows is perfectly modelled by a normal distribution with mean $2$ kg and variance $0.5^2$ kg$^2$. He decides to harvest 500 cabbages to make soup for his entire village, coincidentally of population 500 people. He turns all the cabbage into soup with perfectly efficient mass transfer and adds no extra ingredients, yuck! (i.e. 1 kg of cabbage makes 1 kg of soup.) He then divides all the soup into 500 equal portions.

  1. What is the expected amount of soup each person gets?
  2. What is the variance of this random quantity?
  3. What is an upper bound for the probability that everyone gets between 1.9 and 2.1 kg of soup?
  4. What is an upper bound for the probability that everyone gets between 1.8 and 2.2 kg of soup?
  5. What is a range within which I am 99.99% confident that everyone's soup mass lies?

Now do these questions if he harvests 1000, 2000 and 5000 cabbages for 1000, 2000 and 5000 people respectively.
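One more sketch I have added, for the soup problem: since every portion is the total harvest weight divided by $n$, a portion has mean 2 kg and variance $0.5^2/n$ kg$^2$, and Chebyshev bounds the chance of a portion landing outside a symmetric interval around 2 kg:

```python
import random

random.seed(3)  # arbitrary seed

MU, SIGMA = 2.0, 0.5  # per-cabbage weight in kg: N(2, 0.5^2)

def portion(n):
    """One soup portion when n cabbages are split evenly among n people."""
    return sum(random.gauss(MU, SIGMA) for _ in range(n)) / n

# A portion has mean MU and variance SIGMA^2 / n, so Chebyshev bounds
# P(|portion - 2| > eps) by (SIGMA**2 / n) / eps**2.
for n in (500, 1000, 2000, 5000):
    var_portion = SIGMA ** 2 / n
    print(n,
          var_portion / 0.1 ** 2,   # bound for straying outside 1.9-2.1 kg
          var_portion / 0.2 ** 2)   # bound for straying outside 1.8-2.2 kg
```

For question 5, set the Chebyshev bound to $0.0001$ and solve $\sigma^2/(n\epsilon^2)=0.0001$ for $\epsilon$; Chebyshev is quite loose here, and the normality of the total would give a much tighter interval.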

I hope these helped you! If you want answers to any of these questions, please comment below! Oskar :)