Let $(\Omega, \mathcal{F}, \mathbb{P})$ be a probability space, and let $X, Y, Z:\Omega\rightarrow[-\infty,\infty]$ be independent random variables, such that $X, Y$ are $\mathbb{P}$-integrable and identically distributed. By symmetry considerations, intuitively $E(X|X+Z) = E(Y|Y+Z)$. Can this intuition be given a valid formal statement? For instance, is it true that $E(X|X+Z) \overset{d}{=} E(Y|Y+Z)$? If so, how can this be proved rigorously?
Rigorous proof that $E(X|X+Z) = E(Y|Y+Z)$ when $X, Y, Z$ are independent and $X\overset{d}{=} Y$
I will show that $$ \mathbb{E}(X|X+Z) \overset{d}{=} \mathbb{E}(Y|Y+Z)\tag{*} $$ by proving the more general claim:

Claim. For $i \in \{1,2\}$, let $(\Omega_i, \mathcal{F}_i, \mathbb{P}_i)$ be a probability space and let $U^i = (U^i_1, U^i_2)$ be a random vector on $\Omega_i$ such that $U^i_1$ is $\mathbb{P}_i$-integrable. If $U^1 \overset{d}{=} U^2$, then $\mathbb{E}_{\mathbb{P}_1}(U^1_1|U^1_2) \overset{d}{=} \mathbb{E}_{\mathbb{P}_2}(U^2_1|U^2_2)$.

We can obtain $(*)$ from the claim by setting $$ \begin{align*} \Omega_1 &= \Omega_2 = \Omega,\\ \mathcal{F}_1 &= \mathcal{F}_2 = \mathcal{F},\\ \mathbb{P}_1 &= \mathbb{P}_2 = \mathbb{P},\\ U^1_1 &= X,\\ U^1_2 &= X + Z,\\ U^2_1 &= Y,\\ U^2_2 &= Y + Z. \end{align*} $$
Note that since (a) $X, Z$ are independent, (b) $Y, Z$ are independent, and (c) $X \overset{d}{=} Y$, we have $(X,Z) \overset{d}{=}(Y,Z)$, and therefore $X+Z \overset{d}{=} Y+Z$ and $(X, X+Z) \overset{d}{=} (Y, Y+Z)$, so the hypothesis of the claim is satisfied.
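Before turning to the proof, $(*)$ can be sanity-checked numerically in a special case where the conditional expectations have a closed form. The sketch below (an illustration only, not part of the argument) assumes $X, Y, Z$ are i.i.d. standard normal, in which case $\mathbb{E}(X|X+Z) = (X+Z)/2$ and $\mathbb{E}(Y|Y+Z) = (Y+Z)/2$; it compares empirical quantiles of the two:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# X, Y, Z independent; X and Y identically distributed (standard normal here).
X = rng.standard_normal(n)
Y = rng.standard_normal(n)
Z = rng.standard_normal(n)

# For X, Z i.i.d. N(0,1), the conditional expectation has the closed form
# E(X | X+Z) = (X+Z)/2, and likewise E(Y | Y+Z) = (Y+Z)/2.
cx = (X + Z) / 2
cy = (Y + Z) / 2

# Both should be N(0, 1/2); compare a few empirical quantiles.
qs = np.quantile(cx, [0.1, 0.5, 0.9])
qt = np.quantile(cy, [0.1, 0.5, 0.9])
print(np.max(np.abs(qs - qt)))  # small, since the two laws agree
```

The agreement of the empirical quantiles (up to Monte Carlo error) is consistent with the claimed equality in distribution.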
Proof of the claim
Since $\mathbb{E}_{\mathbb{P}_i}(U^i_1|U^i_2)$ is $\sigma(U^i_2)$-measurable, the Doob–Dynkin lemma yields Borel functions $\alpha_1, \alpha_2:[-\infty,\infty]\rightarrow[-\infty,\infty]$ such that $\mathbb{E}_{\mathbb{P}_i}(U^i_1|U^i_2) = \alpha_i\circ U^i_2$. In particular, $\alpha_i\circ U^i_2$ is $\mathbb{P}_i$-integrable, and hence $\alpha_i$ is $\mathbb{P}_{U^i_2}$-integrable.
For every Borel set $A \subseteq \mathbb{R}$, define $g_A:\mathbb{R}^2\rightarrow\mathbb{R}$ by $g_A(x,y) = x\mathbb{1}_A(y)$. Then $U^i_1\mathbb{1}_{\{U^i_2\in A\}} = U^i_1\big(\mathbb{1}_A\circ U^i_2\big) = g_A\circ U^i$.
Then, for every Borel set $A \subseteq \mathbb{R}$ we have $$ \begin{align*} \int_A \alpha_i\ d\mathbb{P}_{U^i_2} &= \int_{\{U^i_2 \in A\}}\alpha_i\circ U^i_2\ d\mathbb{P}_i\\ &= \int_{\{U^i_2 \in A\}}\mathbb{E}_{\mathbb{P}_i}(U^i_1|U^i_2)\ d\mathbb{P}_i\\ &= \int_{\{U^i_2 \in A\}}U^i_1\ d\mathbb{P}_i\\ &= \int U^i_1\mathbb{1}_{\{U^i_2 \in A\}}\ d\mathbb{P}_i\\ &= \int g_A\circ U^i\ d\mathbb{P}_i\\ &= \int g_A\ d\mathbb{P}_{U^i}. \end{align*} $$
Since $U^1 \overset{d}{=} U^2$, we have $\mathbb{P}_{U^1} = \mathbb{P}_{U^2}$, and therefore $\int_A \alpha_1\ d\mathbb{P}_{U^1_2} = \int_A \alpha_2\ d\mathbb{P}_{U^2_2}$ for every Borel set $A \subseteq \mathbb{R}$. Since $U^1_2 \overset{d}{=} U^2_2$, we have $\mathbb{P}_{U^1_2} = \mathbb{P}_{U^2_2}$, and therefore for every Borel set $A \subseteq \mathbb{R}$ we have $\int_A (\alpha_1 - \alpha_2)\ d\mathbb{P}_{U^i_2} = 0$, which implies that $\alpha_1 - \alpha_2 = 0$ $\mathbb{P}_{U^i_2}$-a.s., i.e. that $\alpha_1 = \alpha_2$ $\mathbb{P}_{U^i_2}$-a.s.
Now, let $A \subseteq \mathbb{R}$ be a Borel set. Then $$ \begin{align*} \mathbb{P}_i\big(\mathbb{E}_{\mathbb{P}_i}(U^i_1|U^i_2) \in A\big) &= \mathbb{P}_i\big(\alpha_i\circ U^i_2 \in A\big)\\ &= \mathbb{P}_{U^i_2}(\alpha_i \in A)\\ &= \mathbb{P}_{U^i_2}\big(\{\alpha_i \in A\}\cap\{\alpha_1 = \alpha_2\}\big), \end{align*} $$ where the last equality holds because $\{\alpha_1 = \alpha_2\}$ has full $\mathbb{P}_{U^i_2}$-measure.
Since $\{\alpha_1 \in A\}\cap\{\alpha_1 = \alpha_2\} = \{\alpha_2 \in A\}\cap\{\alpha_1 = \alpha_2\}$ and $\mathbb{P}_{U^1_2} = \mathbb{P}_{U^2_2}$, it follows that $\mathbb{P}_1\big(\mathbb{E}_{\mathbb{P}_1}(U^1_1|U^1_2) \in A\big) = \mathbb{P}_2\big(\mathbb{E}_{\mathbb{P}_2}(U^2_1|U^2_2) \in A\big)$ for every Borel set $A \subseteq \mathbb{R}$. In other words, $\mathbb{E}_{\mathbb{P}_1}(U^1_1|U^1_2) \overset{d}{=} \mathbb{E}_{\mathbb{P}_2}(U^2_1|U^2_2)$, Q.E.D.
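The pivotal step of the proof, $\alpha_1 = \alpha_2$ $\mathbb{P}_{U^i_2}$-a.s., can be illustrated exactly in a discrete setting. The example below (my own construction, with assumed uniform distributions) builds $U^1$ and $U^2$ on two differently parametrized finite sample spaces with the same joint law, and checks that the two regression functions coincide on the support of the second coordinate:

```python
from fractions import Fraction
from itertools import product

def regression(pairs):
    """Exact alpha(s) = E(first | first + second = s) over equally likely pairs."""
    num, cnt = {}, {}
    for a, b in pairs:
        s = a + b
        num[s] = num.get(s, Fraction(0)) + Fraction(a)
        cnt[s] = cnt.get(s, 0) + 1
    return {s: num[s] / cnt[s] for s in num}

# Space 1: outcomes (x, z), x uniform on {1,...,6}, z uniform on {1,2,3};
# U^1 = (X, X+Z) with X(x,z) = x, Z(x,z) = z.
space1 = [(x, z) for x, z in product(range(1, 7), range(1, 4))]

# Space 2: outcomes (i, j), i in {0,...,5}, j in {0,1,2}, with Y = 6 - i and
# Z' = 3 - j. Y and Z' have the same (uniform) laws, so U^2 = (Y, Y+Z')
# satisfies U^1 =d U^2 even though the sample spaces are parametrized differently.
space2 = [(6 - i, 3 - j) for i, j in product(range(6), range(3))]

a1 = regression(space1)  # alpha_1
a2 = regression(space2)  # alpha_2
print(a1 == a2)  # True: alpha_1 = alpha_2 everywhere on the support of the sum
```

Here the a.s. equality of the proof becomes an exact equality of finitely many rational values, since the laws are purely atomic.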
Remark
The above proof shows that if $(\Omega_1, \mathcal{F}_1, \mathbb{P}_1) = (\Omega_2, \mathcal{F}_2, \mathbb{P}_2)$ and if $U^1_2 = U^2_2$ $\mathbb{P}$-a.s., then $\mathbb{E}(U^1_1|U^1_2) = \mathbb{E}(U^2_1|U^2_2)$ $\mathbb{P}$-a.s. ($\mathbb{P} = \mathbb{P}_1 = \mathbb{P}_2$.)
Indeed, under these conditions define $$ \begin{align*} N_0 &= \{U^1_2 \neq U^2_2\},\\ N_i &= \big\{U^i_2 \in \{\alpha_1 \neq \alpha_2\}\big\},\quad i \in \{1,2\}\\ N &= N_0 \cup N_1 \cup N_2. \end{align*} $$ Then $$ \begin{align*} \mathbb{P}(N_0) &= 0,\\ \mathbb{P}(N_i) &= \mathbb{P}_{U^i_2}(\alpha_1 \neq \alpha_2) = 0,\quad i \in \{1,2\}\\ \mathbb{P}(N) &= 0, \end{align*} $$ and for every $\omega \in \Omega\setminus N$ we have $$ \big(\mathbb{E}(U^1_1|U^1_2)\big)(\omega) = (\alpha_1\circ U^1_2)(\omega) = (\alpha_2\circ U^1_2)(\omega) = \big(\mathbb{E}(U^2_1|U^2_2)\big)(\omega). $$
In particular, $\mathbb{E}(X|X+Y) = \mathbb{E}(Y|X+Y)$ $\mathbb{P}$-a.s.: apply the remark with $U^1 = (X, X+Y)$ and $U^2 = (Y, Y+X)$, noting that $(X,Y) \overset{d}{=} (Y,X)$ because $X, Y$ are independent and identically distributed, so $U^1 \overset{d}{=} U^2$, while $U^1_2 = U^2_2$ everywhere.
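This corollary can also be checked exactly in a discrete case. The sketch below (an illustration under an assumed dice distribution, not part of the argument) takes $X, Y$ i.i.d. uniform on $\{1,\dots,6\}$ and verifies that $\mathbb{E}(X|X+Y=s) = \mathbb{E}(Y|X+Y=s) = s/2$ for every attainable $s$:

```python
from fractions import Fraction
from itertools import product

def cond_exp(which, outcomes):
    """Exact E(coordinate `which` | sum = s) over equally likely outcomes."""
    num, cnt = {}, {}
    for omega in outcomes:
        s = sum(omega)
        num[s] = num.get(s, Fraction(0)) + Fraction(omega[which])
        cnt[s] = cnt.get(s, 0) + 1
    return {s: num[s] / cnt[s] for s in num}

# All 36 equally likely outcomes of two independent fair dice.
outcomes = list(product(range(1, 7), repeat=2))

ex = cond_exp(0, outcomes)  # E(X | X+Y = s)
ey = cond_exp(1, outcomes)  # E(Y | X+Y = s)

# The two conditional expectations agree, and both equal s/2 by symmetry.
print(ex == ey and all(ex[s] == Fraction(s, 2) for s in ex))  # True
```

Here the a.s. equality of the remark holds at every outcome, since the sample space is finite and the regression functions agree on the whole support of $X+Y$.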