Conditioning on an event through conditioning on a sigma-algebra


Let's consider a probability space $(\Omega, \mathcal{F}, P)$. Kolmogorov's definition of conditional expectation takes a sub-sigma-algebra of $\mathcal{F}$ as the condition. Can we define the conditional expectation given an "event", usually defined via Bayes' theorem in elementary courses, using Kolmogorov's conditional expectation?

Here is a more specific question. I have no idea how to define conditional expectation given an event, but conditional expectation given the value of a random variable is possible. For a random variable $Y$, $E(X|Y)$ is defined as $E(X|\sigma(Y))$; this is a function defined on $\Omega$, but there exists a function $h$ such that $E(X|\sigma(Y)) = h \circ Y$, so we can define $E(X|Y = y)$ as $h(y)$. Can I consider this as $E(X|A)$, where $A = Y^{-1}(\{y\})$ is an event? I don't think so. Consider another random variable $Z$ with $A = Z^{-1}(\{z\})$ but $\sigma(Y) \ne \sigma(Z)$. In general $E(X|Y = y) \ne E(X|Z = z)$ (though this discrepancy can only occur when $P(A) = 0$, as in the Borel–Kolmogorov paradox; when $P(A) > 0$, both equal $E(X 1_A)/P(A)$). So the notion of $E(X|A)$ is not well defined without specifying which random variable ($Y$ or $Z$) is under consideration. Clearly this is not a path to conditioning on an event, which should not need to be specified through a random variable, in terms of Kolmogorov's conditional expectation. Is there a way to work around this problem? Is there some way to condition directly on a sigma-algebra without a random variable?
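To illustrate the positive-probability case on a concrete finite space (a sketch; the space, the variables $Y$, $Z$, and the helper names are all made up for illustration): two different random variables whose level sets give the *same* event $A$ with $P(A) > 0$ yield the same value $h(y) = E[X 1_A]/P(A)$, even though $\sigma(Y) \ne \sigma(Z)$.

```python
# Sketch (all names hypothetical): on a finite space, E[X | Y = y] for an
# event A = {Y = y} with P(A) > 0 reduces to E[X 1_A] / P(A), so it depends
# on A alone, not on how Y behaves outside A.
from fractions import Fraction

omega = [0, 1, 2, 3]
P = {w: Fraction(1, 4) for w in omega}   # uniform probability on four points
X = {0: 1, 1: 3, 2: 10, 3: 20}           # an arbitrary random variable

def cond_exp_given_event(A):
    """E[X | A] = E[X 1_A] / P(A), defined whenever P(A) > 0."""
    pA = sum(P[w] for w in A)
    return sum(X[w] * P[w] for w in A) / pA

# Two random variables whose level sets both produce the event A = {0, 1},
# yet sigma(Y) != sigma(Z) (Y separates 2 and 3, Z does not):
Y = {0: 5, 1: 5, 2: 6, 3: 7}
Z = {0: 5, 1: 5, 2: 6, 3: 6}

def cond_exp_given_value(V, v):
    """h(v), where E[X | sigma(V)] = h o V: average X over the atom {V = v}."""
    A = [w for w in omega if V[w] == v]
    return cond_exp_given_event(A)

print(cond_exp_given_value(Y, 5))   # same value for both:
print(cond_exp_given_value(Z, 5))   # it depends only on the event A = {0, 1}
```

So the ambiguity the question worries about is genuinely confined to events of probability zero, where the atom-averaging above divides by zero and the sigma-algebra matters.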

I think that the notion of conditional expectation (or probability) given an event is natural, and I guess that it should have some clear mathematical connection to conditional expectation given a sigma-algebra. What's the connection?

1 Answer

When the random variable is a simple function $$ X=\sum_{i=1}^n 1_{A_i} $$ and the $\sigma$-algebra we are conditioning on is finite, say for simplicity ${\cal B}=\{\emptyset,B,B^c,\Omega\}$ with $0<\mathbb P(B)<1$, then the relationship between the two notions of conditional probability/expectation is $$ \mathbb E[X|{\cal B}]=\sum_{i=1}^n\mathbb P(A_i|B)1_B+\sum_{i=1}^n\mathbb P(A_i|B^c)1_{B^c}\,, $$ where the conditional probabilities on the right are the elementary ones defined through events: $$ \mathbb P(A_i|B)=\frac{\mathbb P(A_i\cap B)}{\mathbb P(B)}\,,\quad \mathbb P(A_i|B^c)=\frac{\mathbb P(A_i\cap B^c)}{\mathbb P(B^c)}\,. $$
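The displayed identity can be checked numerically on a small finite space (a sketch; the sample space and the sets $A_1, A_2, B$ below are arbitrary choices, not from the answer): compare $E[X|\sigma(\{B\})]$ computed by averaging $X$ over each atom against the right-hand side built from elementary conditional probabilities.

```python
# Numerical check of the answer's identity (illustrative setup, made-up sets):
# with X = 1_{A1} + 1_{A2} and the sigma-algebra generated by B,
# E[X | B](w) equals sum_i P(A_i | B) on B and sum_i P(A_i | B^c) on B^c.
from fractions import Fraction

omega = [0, 1, 2, 3, 4, 5]
P = {w: Fraction(1, 6) for w in omega}   # uniform probability
A1, A2 = {0, 1, 2}, {1, 3, 5}            # the sets A_i defining X
B = {0, 1, 4}
Bc = set(omega) - B

def prob(S):
    return sum(P[w] for w in S)

def cond_prob(A, C):
    """Elementary conditional probability P(A | C) = P(A & C) / P(C)."""
    return prob(A & C) / prob(C)

X = {w: (w in A1) + (w in A2) for w in omega}   # the simple function 1_{A1} + 1_{A2}

def cond_exp(w):
    """Left-hand side: E[X | sigma({B})](w), averaging X over w's atom."""
    atom = B if w in B else Bc
    return sum(X[u] * P[u] for u in atom) / prob(atom)

def formula(w):
    """Right-hand side: sum_i P(A_i | B) on B, sum_i P(A_i | B^c) on B^c."""
    C = B if w in B else Bc
    return cond_prob(A1, C) + cond_prob(A2, C)

assert all(cond_exp(w) == formula(w) for w in omega)
print("identity verified on all sample points")
```

Exact rational arithmetic (`fractions.Fraction`) avoids any floating-point tolerance in the comparison.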