How to use advanced probability theory to explain "independence" of two random variables?

Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)

We know from elementary probability theory that two random variables $X,Y$ are called independent if $F_X(x)F_Y(y)=F_{X,Y}(x,y)$, where $F_X,F_Y,F_{X,Y}$ denote the corresponding (joint) CDFs. But in the measure-theoretic setting, suppose $X$ and $Y$ are defined on a probability space $(\Omega,\mathcal{F},\mathcal{P})$. If we take $\Omega=[0,1]$, and suppose $\omega_1\in\Omega=[0,1]$ with $X(\omega_1)=1$, how should $Y$ be defined on this $(\Omega,\mathcal{F},\mathcal{P})$ in order to make $X$ and $Y$ independent? Can you give me an example, or some related theorems, to help me understand this deeply? Thank you so much!
Here is an answer from a general measure-theoretic point of view.
If you have real-valued random variables $X,Y$ on $\Omega$, you can use them to obtain two (probability) measures on ${\bf R}$, namely $X[\mathcal P]$ and $Y[\mathcal P]$. At the same time, $(X,Y)$ is an ${\bf R}^2$-valued random variable on $\Omega$, which gives us a (probability) measure $(X,Y)[\mathcal P]$ on ${\bf R}^2$.
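A remark on notation, since the bracket notation varies between texts: $X[\mathcal P]$ is the pushforward (image) measure of $\mathcal P$ under $X$, often written $\mathcal P\circ X^{-1}$ or $\mathcal P_X$ and called the distribution, or law, of $X$. Explicitly, $$ X[\mathcal P](B)=\mathcal P(X^{-1}[B])=\mathcal P(X\in B)\qquad\text{for every Borel set }B\subseteq{\bf R}. $$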
We say that $X,Y$ are independent if $(X,Y)[\mathcal P]=X[\mathcal P]\otimes Y[\mathcal P]$; in other words, $(X,Y)[\mathcal P]$ is the product measure of $X[\mathcal P]$ and $Y[\mathcal P]$. This means that for any intervals $I,J\subseteq {\bf R}$ you have $$ \mathcal P((X,Y)^{-1}[I\times J])=\mathcal P(X^{-1}[I])\cdot \mathcal P(Y^{-1}[J]). $$ (The factorization on rectangles holds by the defining property of the product measure; conversely, the rectangles $I\times J$ form a $\pi$-system generating the Borel $\sigma$-algebra of ${\bf R}^2$, so by the uniqueness part of Dynkin's $\pi$-$\lambda$ theorem, factorization on rectangles already pins down the product measure.) Since $(X,Y)^{-1}[I\times J]=X^{-1}[I]\cap Y^{-1}[J]$, this is equivalent to saying that for any intervals $I,J$ you have $$ \mathcal P(X\in I\land Y\in J)= \mathcal P(X\in I)\cdot \mathcal P(Y\in J), $$ i.e. the events $X\in I$ and $Y\in J$ are independent. It is easy to see that you can take half-lines instead of intervals here, so in fact this is equivalent to saying that for all real numbers $a,b$ the events $X\leq a$ and $Y\leq b$ are independent, which is exactly the CDF condition you have given: $F_{X,Y}(a,b)=F_X(a)F_Y(b)$.
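To address the concrete setup of the question, with $\Omega=[0,1]$: take $\mathcal F$ to be the Borel $\sigma$-algebra and $\mathcal P$ to be Lebesgue measure (the question does not fix $\mathcal P$, so this particular choice is an assumption). A standard example takes $X$ and $Y$ to be the first two binary digits of $\omega$: $$ X(\omega)=\mathbf 1_{[1/2,\,1]}(\omega),\qquad Y(\omega)=\mathbf 1_{[1/4,\,1/2)\cup[3/4,\,1]}(\omega). $$ Each of the four events $\{X=i\}\cap\{Y=j\}$ is a union of intervals of total length $1/4=\mathcal P(X=i)\cdot\mathcal P(Y=j)$; for instance, $\{X=1\}\cap\{Y=1\}=[3/4,1]$ has measure $\frac14=\frac12\cdot\frac12$. Since $X$ and $Y$ take only the values $0$ and $1$, checking these four rectangles is enough, so $X$ and $Y$ are independent. Note also that independence is a property of the measures involved, not of individual sample points: fixing $\omega_1$ (say $\omega_1=0.9$, so that $X(\omega_1)=1$ as in the question) determines both $X(\omega_1)$ and $Y(\omega_1)$, and this is in no conflict with independence.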
The first definition generalises immediately to random variables taking values in an arbitrary measurable space (except that there you may have no notion of a cumulative distribution function, and you may have to consider more complicated sets than intervals), and in fact to measurable functions on any measure space. It also works in essentially the same way for more than two variables (even infinitely many).
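As a quick numerical sanity check of the binary-digit example above (not part of the answer's argument, and assuming numpy is available), here is a minimal Monte Carlo sketch in Python:

```python
import numpy as np

# Monte Carlo check of independence for the binary-digit example:
# under P = Lebesgue measure on [0, 1], the first two binary digits
# of omega should be independent Bernoulli(1/2) random variables.
rng = np.random.default_rng(0)
omega = rng.uniform(0.0, 1.0, size=1_000_000)

X = (omega >= 0.5).astype(int)            # first binary digit of omega
Y = np.floor(4 * omega).astype(int) % 2   # second binary digit of omega

# P(X=1, Y=1) should be close to P(X=1) * P(Y=1) = 1/2 * 1/2 = 1/4.
p_joint = np.mean((X == 1) & (Y == 1))
p_prod = np.mean(X == 1) * np.mean(Y == 1)
print(f"P(X=1, Y=1) = {p_joint:.4f}, P(X=1)*P(Y=1) = {p_prod:.4f}")
```

Both printed values should be close to $0.25$; their agreement reflects $\mathcal P(X=1,\,Y=1)=\mathcal P(X=1)\cdot\mathcal P(Y=1)$ up to sampling error.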