I know that uncorrelated variables need not be independent, e.g. $X$ and $|X|$ for $X \sim \operatorname{Unif}(-1, 1)$. But I'd like to make a stronger claim than this. My intuition tells me that outside of tiny finite probability spaces, independence is not only not guaranteed, but in fact is extremely rare, even among uncorrelated variables. Is there a rigorous sense in which this is true? In particular, suppose we fix a variable $X$ over a large space. Is there a natural measure under which the set of variables independent of $X$ is meagre, but the set of variables uncorrelated with $X$ is not?
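As a quick, exact finite analogue of that counterexample (my own sketch, not part of the question): take $X$ uniform on $\{-1, 0, 1\}$ and $Y = |X|$. Working in exact rational arithmetic, the covariance is exactly zero, yet the joint pmf is not the product of the marginals:

```python
from fractions import Fraction

# Discrete analogue of the Unif(-1, 1) example: X uniform on {-1, 0, 1}, Y = |X|.
xs = [-1, 0, 1]
pmf = {(x, abs(x)): Fraction(1, 3) for x in xs}  # joint pmf of (X, Y)

def E(g):
    """Expectation of g(X, Y) under the joint pmf."""
    return sum(p * g(x, y) for (x, y), p in pmf.items())

cov = E(lambda x, y: x * y) - E(lambda x, y: x) * E(lambda x, y: y)
print("Cov(X, Y) =", cov)  # exactly 0: X and Y are uncorrelated

# Marginals of X and Y
pX = {x: sum(p for (a, _), p in pmf.items() if a == x) for x in xs}
pY = {y: sum(p for (_, b), p in pmf.items() if b == y) for y in [0, 1]}

# Independence fails: P(X=0, Y=0) = 1/3, but P(X=0) * P(Y=0) = 1/9.
print(pmf[(0, 0)], "vs", pX[0] * pY[0])
```

The same calculation works for any symmetric distribution of $X$: symmetry kills the covariance, while $Y = |X|$ is a deterministic (hence maximally dependent) function of $X$.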
2026-04-03 02:34:40
Are most uncorrelated variables dependent?
43 Views, asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
1
There is 1 best solution below.
Partial answer: we consider probability distributions on a finite sample space $D^2$, that is, functions $f(x,y)$ from $D^2$ to $\mathbb{R}$ with $f(x,y)\geq0$ for all $x$ and $y$, and $$\sum_{x\in D}\sum_{y\in D}f(x,y)=1.$$
If $|D|=n$, then the space of all such $f$ is an $(n^2-1)$-dimensional simplex $F$. The constraint that $X$ and $Y$ be uncorrelated is a single scalar constraint on elements of $F$, namely that the correlation $r(f)=0$ (equivalently, that the covariance vanishes), so I'd expect the space of all uncorrelated distributions to be a surface of dimension $n^2-2$ within the simplex.
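To see "one scalar equation, one lost degree of freedom" in action, here is a small sketch (my own illustration, not from the answer): blend a positively correlated pmf with a negatively correlated one, and solve the single equation $\operatorname{cov}(f_t)=0$ for the single parameter $t$ by bisection.

```python
import numpy as np

D = np.array([-1.0, 0.0, 1.0])  # support; f is a 3x3 joint pmf on D x D

def cov(f):
    """Covariance of (X, Y) when (X, Y) ~ f, a joint pmf matrix on D x D."""
    px, py = f.sum(axis=1), f.sum(axis=0)
    return D @ f @ D - (D @ px) * (D @ py)

# A positively correlated pmf (mass on the diagonal) and a negatively
# correlated one (mass on the anti-diagonal); both are valid distributions.
f_pos = np.eye(3) / 3.0
f_neg = np.fliplr(np.eye(3)) / 3.0

def f(t):
    return (1.0 - t) * f_pos + t * f_neg  # still a pmf for t in [0, 1]

# cov(f(0)) > 0 > cov(f(1)), so the one equation cov(f(t)) = 0
# pins down the one free parameter t.
lo, hi = 0.0, 1.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if cov(f(mid)) > 0:
        lo = mid
    else:
        hi = mid
t_star = 0.5 * (lo + hi)
print(t_star, cov(f(t_star)))  # by symmetry t_star = 1/2, covariance ~ 0
```

The resulting $f_{1/2}$ is uncorrelated but visibly not independent (it puts no mass on cells like $(-1, 0)$), matching the codimension-1 picture: a generic one-parameter family crosses the uncorrelated surface at isolated points.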
The constraint that $X$ and $Y$ be independent is much stricter - it places (I think) $(n-1)^2$ independent constraints on $f$, namely that
$$f(x_i,y_j)=\left(\sum_{x\in D}f(x,y_j)\right)\times\left(\sum_{y\in D}f(x_i,y)\right)$$
for all $i$ and $j$.
Yes, that looks like $n^2$ constraints, but (I think) $(n-1)^2$ of them imply the rest. In any case, it's a lot more than $1$ constraint on $f$, so I'd expect the space of distributions for which $X$ and $Y$ are independent to be a surface of dimension $n^2-1 - (n-1)^2 = 2n-2$ within the simplex.
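That constraint count can be sanity-checked numerically (my own sketch, not part of the answer): at a strictly positive product distribution, the Jacobian of the $n^2$ independence equations, restricted to perturbations that preserve total mass $1$, should have rank exactly $(n-1)^2$.

```python
import numpy as np

def independence_residual(f):
    """g(f)_{ij} = f_{ij} - (i-th row sum) * (j-th column sum), flattened.
    f is a product distribution exactly when every entry of g(f) is zero."""
    r = f.sum(axis=1, keepdims=True)
    c = f.sum(axis=0, keepdims=True)
    return (f - r * c).ravel()

def constraint_rank(n, eps=1e-6):
    """Numerical rank of the independence constraints at a generic product
    pmf, along directions that keep the total probability mass fixed."""
    rng = np.random.default_rng(0)
    p = rng.uniform(0.5, 1.5, n); p /= p.sum()
    q = rng.uniform(0.5, 1.5, n); q /= q.sum()
    f0 = np.outer(p, q)          # interior point of the independence surface
    base = independence_residual(f0)
    cols = []
    for k in range(1, n * n):    # directions e_0 - e_k span the sum-zero subspace
        d = np.zeros(n * n)
        d[0], d[k] = eps, -eps
        cols.append((independence_residual(f0 + d.reshape(n, n)) - base) / eps)
    return np.linalg.matrix_rank(np.column_stack(cols), tol=1e-4)

print(constraint_rank(3), (3 - 1) ** 2)
print(constraint_rank(4), (4 - 1) ** 2)
```

The $2n-2$ null directions of this Jacobian are exactly the tangent directions to the independence surface (moving the two marginals), which is another way of seeing the dimension count $n^2-1-(n-1)^2=2n-2$.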
Therefore the set of uncorrelated distributions (dimension $n^2-2$) has far higher dimension than the set of independent distributions (dimension $2n-2$), and the gap widens as $n$ grows. In this dimensional sense, among joint distributions on a large finite space, independence is indeed vastly rarer than uncorrelatedness.