Understanding a $\sigma$-field


Problem

Consider $n$ Bernoulli trials, each with success probability $p$. My task is to find the conditional expectation of a success in the first trial, given the total number of successes in all $n$ trials.

My attempt

The problem is quite easy. I define two variables:

  1. $S_n$ - the number of successes in all $n$ trials,
  2. $X$ - the indicator of a success in the first trial (so $X \in \{0, 1\}$).

I would like to find the value of $\mathbb{E}(X | \sigma(S_n))$.
Now I would like to do something with $\sigma(S_n)$. I define a new sequence of events: $$A_k = \{S_n = k\}, \quad k = 0, \ldots, n. \tag{1}$$ Since the $A_k$ form a partition of $\Omega$ and $S_n$ is constant on each $A_k$, we have $\sigma(S_n) = \sigma(A_0, \ldots, A_n)$.
The remaining calculation is easy: by symmetry of the trials, the answer is $\mathbb{E}(X \mid \sigma(S_n)) = \frac{S_n}{n}$.
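As a sanity check, one can verify $\mathbb{E}(X \mid S_n = k) = \frac{k}{n}$ by exact enumeration of all $2^n$ outcomes. The sketch below uses hypothetical example values $n = 4$ and $p = \frac{1}{3}$; exact rational arithmetic avoids floating-point noise.

```python
from itertools import product
from fractions import Fraction

n, p = 4, Fraction(1, 3)  # hypothetical example values; any n and p in (0, 1) work

cond_exp = {}  # will hold E[X | S_n = k] for each k
for k in range(n + 1):
    prob_k = Fraction(0)  # P(S_n = k)
    num_k = Fraction(0)   # E[X * 1{S_n = k}]
    for outcome in product([0, 1], repeat=n):
        # probability of this particular sequence of trials
        prob = Fraction(1)
        for bit in outcome:
            prob *= p if bit else 1 - p
        if sum(outcome) == k:
            prob_k += prob
            num_k += outcome[0] * prob  # X is the first coordinate
    cond_exp[k] = num_k / prob_k

# Each conditional expectation equals k/n, matching E(X | sigma(S_n)) = S_n / n.
print(cond_exp)
```

Running this confirms that the conditional expectation depends only on $S_n$, not on $p$, which is exactly why the answer is $\sigma(S_n)$-measurable.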

What I don't understand

I don't really understand what $\sigma(A_0, \ldots, A_n)$ looks like. It seems to me that it contains many events that are completely impossible. Let me give an example.
Let's fix $n=1$. That gives us: $$A_0, A_1$$ and $$\sigma(A_0, A_1) = \{\emptyset, \Omega = A_0 \cup A_1, A_0, A_1 \}.$$ Am I correct? How can I interpret $A_0 \cup A_1$?
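The σ-algebra generated by a finite partition can be written out explicitly: it consists of all unions of partition blocks, giving $2^{m}$ sets for $m$ blocks. A small sketch for the $n = 1$ case above (block labels are illustrative stand-ins for the events $A_0$ and $A_1$):

```python
from itertools import combinations

# For n = 1 the partition of the sample space is {A_0, A_1}.
# sigma(A_0, A_1) consists of all unions of these blocks.
blocks = ["A0", "A1"]

sigma = []
for r in range(len(blocks) + 1):
    for combo in combinations(blocks, r):
        sigma.append(set(combo))  # the union of the chosen blocks

# sigma is [set(), {'A0'}, {'A1'}, {'A0', 'A1'}]:
# the empty set, A_0, A_1, and Omega = A_0 union A_1 -- four sets in total.
print(sigma)
```

For general $n$ the same construction on the partition $\{A_0, \ldots, A_n\}$ yields $2^{n+1}$ sets, which is the whole of $\sigma(S_n)$.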

Best answer

If $S_1$ can't take any values other than $0$ and $1$ (even on null sets) then you are correct.

$A_0 \cup A_1$ is the event that either $S_1 = 0$ or $S_1 = 1$ occurs (or both, which in this case isn't possible). You can always think of unions as "or" and of intersections as "and".

In general, $\sigma(S_n)$ is the smallest $\sigma$-algebra with respect to which $S_n$ is measurable; informally, it contains exactly the information carried by $S_n$.