Does independence of two random variables imply uncorrelatedness?

There are many materials about the reverse question: "Does uncorrelatedness tell us something about independence?" But how does one answer the question I've posed, and why? Is there some simple counterexample?
I will try to give an intuitive answer.
To say that two random variables $X$ and $Y$ are uncorrelated means that $\mathbb{E}[XY] = \mathbb{E}[X]\,\mathbb{E}[Y]$. On the other hand, saying $X$ and $Y$ are independent means that their joint distribution is the product of their marginal distributions, $p(X,Y) = p(X) \cdot p(Y)$; or, equivalently, that the conditional distribution of either variable given the other equals its marginal distribution, e.g. $p(Y|X) = p(Y)$.
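For a direct computation (a sketch, assuming for concreteness that $X$ and $Y$ have a joint density and that all the expectations involved are finite), independence lets the joint density factor inside $\mathbb{E}[XY]$:
$$\mathbb{E}[XY] = \iint xy\,p(x,y)\,dx\,dy = \iint xy\,p(x)\,p(y)\,dx\,dy = \left(\int x\,p(x)\,dx\right)\left(\int y\,p(y)\,dy\right) = \mathbb{E}[X]\,\mathbb{E}[Y],$$
so $\operatorname{Cov}(X,Y) = \mathbb{E}[XY] - \mathbb{E}[X]\,\mathbb{E}[Y] = 0$. The argument below reaches the same conclusion via the contrapositive.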
Now, suppose that $X$ and $Y$ are correlated. Then there must be values $x_{1}$ and $x_{2}$ for which $\mathbb{E}[Y|X=x_1] \neq \mathbb{E}[Y|X=x_2]$.*
Since these conditional expectations are unequal, it follows that $p(Y|X=x_1) \neq p(Y|X=x_2)$. But if $X$ and $Y$ were independent, we would have $p(Y|X=x_1) = p(Y) = p(Y|X=x_2)$. Thus correlation implies dependence and, by the contrapositive, independence implies uncorrelatedness.
* This is easiest to see by supposing instead that $\mathbb{E}[Y|X=x]$ is constant in $x$ and working out $\mathbb{E}[XY]$ by conditioning on $X$.
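In more detail (a sketch of the computation the footnote alludes to): if $\mathbb{E}[Y|X]$ were equal to some constant $c$, the tower property would give
$$\mathbb{E}[Y] = \mathbb{E}\big[\mathbb{E}[Y|X]\big] = c, \qquad \mathbb{E}[XY] = \mathbb{E}\big[X\,\mathbb{E}[Y|X]\big] = c\,\mathbb{E}[X] = \mathbb{E}[X]\,\mathbb{E}[Y],$$
so $X$ and $Y$ would be uncorrelated. Hence, if $X$ and $Y$ are correlated, $\mathbb{E}[Y|X=x]$ must take at least two different values, which is the claim marked * above.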