I have a couple of problems understanding the conditional expectation as a random variable. Consider a fair die roll as a random variable $X$. Let $C$ be the event that the die shows a one and consider $\mathcal{A}=\sigma(C)$. We have $E[X|C]=1$, $E[X|C^c]=4$, and thus one version of the expected value of $X$ conditioned on $\mathcal{A}$ is $E[X|\mathcal{A}]=1\cdot \chi_C+4\cdot\chi_{C^c}$. Now this is a function from $\Omega$ to $\mathbb{R}$. I can't grasp how to interpret $E[X|\mathcal{A}](\omega)$, since in general $E[X|\mathcal{A}]$ is characterized by having certain integrals over the $\sigma$-field, which defines it only up to null sets. Of course, in this case there is no ambiguity, but in general $E[X|\mathcal{A}]$ is an equivalence class. I do understand that $E[X|\mathcal{A}]$ is basically an averaging of $X$ over $C$ and $C^c$. But how do I interpret $E[X|\mathcal{A}](\omega)$ for a specific $\omega$?
Interpretation of conditional expectation as a random variable
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
In short, $E(X|\mathcal{A})(\omega)$ is the value of the random variable $E(X|\mathcal{A})$ when outcome $\omega$ occurs.
It sounds like you are specifically looking at sub-$\sigma$-algebras that are generated by partitions of the entire collection of outcomes: $\Omega=\bigcup_i A_i$ where the $A_i$ are pairwise disjoint and $\mathcal{A}=\sigma(A_1,A_2,...)$. To interpret $E(X|\mathcal{A})(\omega)$ in this setting, you simply find the element $B$ of the partition such that $\omega\in B$. Note that there is exactly one such $B$. Then we get that $E(X|\mathcal{A})(\omega)=E(X|B)$.
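This "constant on each cell of the partition" recipe is easy to compute on a finite probability space. Here is a minimal sketch (the helper `cond_exp_given_partition` is my own name, not standard library code), applied to the fair-die example with the partition $\{C, C^c\}$:

```python
from fractions import Fraction

def cond_exp_given_partition(X, P, partition):
    """Return the map omega -> E(X | A)(omega), where A = sigma(partition).

    X: dict mapping each outcome omega to the value X(omega).
    P: dict mapping each outcome to its probability.
    partition: disjoint sets of outcomes whose union is the sample space.
    """
    result = {}
    for B in partition:
        pB = sum(P[w] for w in B)
        avg = sum(X[w] * P[w] for w in B) / pB  # E(X | B)
        for w in B:
            result[w] = avg  # E(X | A) is constant on each cell B
    return result

# Fair die, C = {die shows 1}, partition {C, C^c}:
omega = range(1, 7)
X = {w: Fraction(w) for w in omega}
P = {w: Fraction(1, 6) for w in omega}
E = cond_exp_given_partition(X, P, [{1}, {2, 3, 4, 5, 6}])
print(E[1], E[4])  # prints: 1 4
```

Note that the output is indexed by individual outcomes $\omega$, yet it only takes one value per partition cell, which is exactly the point of the question.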
Let $\mathcal{A}=\sigma(C)$ for some event $C\subset\Omega$. For your example, either $\omega\in C$ or $\omega\in C^c$ since $\Omega=C\cup C^c$. Therefore
$$E(X|\mathcal{A})(\omega)=\left\{ \begin{aligned} &E(X|C)&\text{if} &\quad \omega\in C\\ &E(X|C^c) &\text{if} &\quad \omega\in C^c. \end{aligned} \right. $$
One way to think about it is that, as an observer, you cannot distinguish between the outcomes in event $C$. If $\omega_1$ and $\omega_2$ are both in $C$, then they look identical to you even if $X(\omega_1)\neq X(\omega_2)$. If I tell you that the outcome of the experiment is a particular $\tilde{\omega}\in C$, then what you hear me say is "event $C$ occurred". Even though I'm saying the specific outcome name "$\tilde{\omega}$", you hear only "$C$". So you calculate the expectation of $X$ given the information you heard. You know for certain that event $C$ has occurred, therefore you calculate $E(X|C)$.
$$E(X|C)=\frac{E(X1_C)}{P(C)}=\frac{1}{P(C)}\int_C X P(d\omega) \quad \text{so long as}\quad P(C)\neq0$$
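This ratio formula can be checked directly on the fair die (a finite sum stands in for the integral; the helper name `cond_exp` is just for illustration):

```python
from fractions import Fraction

# Fair die: each outcome w in {1,...,6} has probability 1/6 and X(w) = w.
p = Fraction(1, 6)
C = {1}
Cc = {2, 3, 4, 5, 6}

def cond_exp(event):
    """E(X | event) = E(X * 1_event) / P(event), assuming P(event) != 0."""
    p_event = len(event) * p                   # P(C)
    e_x_indicator = sum(w * p for w in event)  # E(X * 1_C), finite sum
    return e_x_indicator / p_event

print(cond_exp(C), cond_exp(Cc))  # prints: 1 4
```

These are exactly the two values $1$ and $4$ from the question, so the piecewise formula above reproduces the asker's version $1\cdot\chi_C+4\cdot\chi_{C^c}$.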
Example:
Let $X$ be uniform on $\{1,2,3\}$, and $$\mathcal{A}=\sigma(\{1\})=\{\emptyset,\Omega,\{1\},\{2,3\}\}.$$ (Aside: Just to be careful, we have to remember that $\{1\}=\{\omega\in\Omega : X(\omega)=1\}$.)
We want to examine the random variable $E(X|\mathcal{A})$. Think of this random variable as representing your best estimate of $X$ when your own ability to distinguish between different outcomes of the experiment of checking the (random) value of $X$ is limited by the events in $\mathcal{A}$.
If the value of $X$ is $1$, you can see it perfectly, and you output $$E(X|\mathcal{A})(1)=E(X|\{1\})=1$$ because $\{1\}\in\mathcal{A}$. However, if $X$ takes on the value $2$ or $3$, you cannot distinguish between them so $$E(X|\mathcal{A})(2)=E(X|\mathcal{A})(3)=E(X|\{2,3\})=2.5.$$
Note that $E(X|\mathcal{A})(\omega)= E(X|\{2,3\})$ whether $\omega=2$ or $\omega=3$.