On the existence of a random variable with constraints

Let $X$ and $Y$ be two random variables with a given joint copula. Is it always true that there exists another random variable $Z$, independent of $Y$, such that the vectors $(X,Y)$ and $(X,Z)$ have the same law?
It depends on what you mean by "exist". Taken completely literally, no. For instance, if the sample space is $\Omega = \{0,1,2,3\}$ with each point having probability $1/4$, and $X$ and $Y$ are the random variables
$$X(0) = X(1) = 0, X(2) = X(3) = 1, \\ Y(0) = Y(2) = 0, Y(1) = Y(3) = 1,$$ then any random variable $Z$ such that $(X,Y)$ and $(X,Z)$ have the same law must satisfy $Z=Y$.
However, what I suspect you mean to ask is "given some joint law $\mu$, can we always find random variables $X$, $Y$ and $Z$ defined on some shared probability space such that $(X,Y)$ and $(X,Z)$ both have joint law $\mu$, and $Y$ and $Z$ are independent?"
If you do not know measure theory, the TL;DR version is: yes, this is always possible. If you are interested in the measure-theoretic proof, read on.
To prove this, let $\lambda$ be the marginal distribution of $X$ and let $\nu_x$ be the conditional distribution of $Y$ given $X=x$. (What I am really doing here is disintegrating the measure $\mu$; see the Wikipedia article on disintegration of measures for details.) Specifically, we have
$$ \mu(A\times B) = \int_A \nu_x(B)\lambda(dx) $$
for all Borel sets $A,B$. Now, all we have to do is consider the Borel probability measure $\mathbb P$ on $\mathbb R^3$ which satisfies
$$ \mathbb P(A\times B\times C) = \int_A \nu_x(B)\nu_x(C) \lambda(dx).$$
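For a discrete joint law, the disintegration and the measure $\mathbb P$ can be written out explicitly. The sketch below (the pmf `mu` is a made-up illustrative example, not something from the question) computes the marginal $\lambda$, the conditional laws $\nu_x$, and the pmf of $\mathbb P$ on triples.

```python
from collections import defaultdict

# Made-up discrete joint law mu on {0,1} x {0,1}; any finite pmf would do.
mu = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}

# lambda: marginal law of the first coordinate.
lam = defaultdict(float)
for (x, y), p in mu.items():
    lam[x] += p

# nu_x: conditional law of the second coordinate given that the first is x.
nu = defaultdict(dict)
for (x, y), p in mu.items():
    nu[x][y] = p / lam[x]

# P on triples: P({(x, y, z)}) = lambda(x) * nu_x(y) * nu_x(z).
P = {
    (x, y, z): lam[x] * nu[x][y] * nu[x][z]
    for x in lam
    for y in nu[x]
    for z in nu[x]
}

print(sum(P.values()))  # should be 1.0, up to floating-point error
```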
The joint law of the first and second components, and likewise of the first and third components, is $\mu$, and the second and third components are independent. Explicitly, take $\Omega = \mathbb R^3$ with its Borel $\sigma$-field and the probability measure $\mathbb P$ defined above, and define
$$ X(x,y,z) = x, \qquad Y(x,y,z) = y, \qquad Z(x,y,z) = z. $$
Then by construction, $(X,Y)$ and $(X,Z)$ both have law $\mu$, and $Y$ and $Z$ are independent.
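As a quick sanity check of the pairwise laws, one can also sample from $\mathbb P$ directly: draw $x$ from $\lambda$, then draw $y$ and $z$ from $\nu_x$. The Monte Carlo sketch below (again with a made-up discrete $\mu$) compares the empirical laws of $(X,Y)$ and $(X,Z)$ with $\mu$.

```python
import random
from collections import Counter

# Made-up discrete joint law mu on {0,1} x {0,1}.
mu = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}

lam = {x: mu[(x, 0)] + mu[(x, 1)] for x in (0, 1)}                  # marginal of X
nu = {x: {y: mu[(x, y)] / lam[x] for y in (0, 1)} for x in (0, 1)}  # nu_x

def draw(pmf):
    """Draw one value from a finite pmf given as {value: probability}."""
    return random.choices(list(pmf), weights=list(pmf.values()))[0]

n = 200_000
law_xy, law_xz = Counter(), Counter()
for _ in range(n):
    x = draw(lam)      # X ~ lambda
    y = draw(nu[x])    # Y ~ nu_x
    z = draw(nu[x])    # Z ~ nu_x, drawn afresh for the same x
    law_xy[(x, y)] += 1
    law_xz[(x, z)] += 1

# Both empirical pairwise laws should be close to mu, up to sampling error.
print({k: round(v / n, 3) for k, v in sorted(law_xy.items())})
print({k: round(v / n, 3) for k, v in sorted(law_xz.items())})
```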