Uncorrelated but not independent random variables.

Let $X$ and $Y$ be random variables with the following joint distribution on $[0, 1] \times [0, 1]$: $f(x, y) = 0$ in the white areas, $f(x, y) = 2$ in the black areas. Am I right that these two variables are uncorrelated but not independent?
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)

They are indeed not independent as you stated. We know that two random variables $X, Y$ (for which the probability densities $f_X$, $f_Y$ and the joint probability density $f_{X,Y}$ exist) are independent iff:
$f_{X,Y}(x, y) = f_X(x)f_Y(y)$ for all $x, y$
And here it is easy to find a counterexample:
$f_{X,Y}(1/3, 1/3) = 0$
$f_X(1/3) = 1$, $f_Y(1/3) = 1$
So the condition for the variables to be independent does not hold.
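This check can be sketched in code. Since the original figure is not reproduced here, the black-cell layout below is an assumption: a guess at a checkerboard-style pattern that is consistent with the stated facts (uniform marginals and $f_{X,Y}(1/3, 1/3) = 0$); your figure may differ.

```python
from fractions import Fraction as F

# Hypothetical black-cell layout on the 4x4 grid of side-1/4 cells.
# (The original figure is not reproduced, so this pattern is an assumption
# chosen to match uniform marginals and f(1/3, 1/3) = 0.)
BLACK = {(1, 2), (2, 1), (1, 3), (3, 1), (2, 4), (4, 2), (3, 4), (4, 3)}

def cell(t):
    """Index (1..4) of the side-1/4 strip containing t in [0, 1]."""
    return min(int(t * 4) + 1, 4)

def f_joint(x, y):
    """Joint density: 2 on black cells, 0 on white cells."""
    return 2 if (cell(x), cell(y)) in BLACK else 0

def f_marginal(t):
    # Each strip meets exactly two black cells of the other axis,
    # so the marginal density is 2 * (2 * 1/4) = 1 everywhere on [0, 1].
    return 1

x = y = F(1, 3)
print(f_joint(x, y), f_marginal(x) * f_marginal(y))  # 0 vs 1: not independent
```

The point is only that the joint density vanishes at a point where the product of the marginals does not, which is exactly the counterexample above.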
However, when we talk about correlation we usually mean the Pearson correlation coefficient, which measures the *linear* relationship between two random variables. Here there is no linear relationship between the two variables, so the correlation is $0$. The bottom row of the well-known gallery of correlation examples shows several obviously dependent random variables (like in your example) whose correlation is nevertheless $0$.
In real-world examples the Pearson coefficient works well in many cases. When it does not, other correlation measures can be used, such as Spearman's rank correlation coefficient or Kendall's rank correlation coefficient (although there is no guarantee that they will always detect a relationship between your variables).
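A quick numerical sketch of both points, using plain NumPy (the sample size and seed are arbitrary choices): for $Y = X^2$ with $X$ symmetric around $0$, the Pearson coefficient is near $0$ despite perfect dependence, and the Spearman coefficient misses it too because the relationship is not monotone; for the monotone $Y = X^3$, Spearman is $1$.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 200_000)
y = x**2  # perfectly dependent on x, but neither linearly nor monotonically

def pearson(a, b):
    return np.corrcoef(a, b)[0, 1]

def spearman(a, b):
    # Rank-transform (no ties for continuous samples), then take Pearson.
    ra = np.argsort(np.argsort(a))
    rb = np.argsort(np.argsort(b))
    return pearson(ra, rb)

print(pearson(x, y))      # close to 0: no linear relationship
print(spearman(x, y))     # also close to 0: relationship is not monotone
print(spearman(x, x**3))  # 1: monotone relationship, ranks coincide
```

This illustrates the caveat above: rank-based coefficients fix the "nonlinear but monotone" case, not the general one.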
Edit:
As requested, here is how the Pearson coefficient can be calculated for this example. The formula is:
$\rho_{X, Y} = \frac{\operatorname{cov}(X, Y)}{\sigma_X \sigma_Y} = \frac{E[XY] - E[X]E[Y]}{\sigma_X \sigma_Y}$
Most parts of the formula are easy to calculate:
Since the marginals are uniform on $[0, 1]$:
$E[X] = E[Y] = 1/2$
$\sigma_X = \sigma_Y = \sqrt{1/12}$
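(The standard deviation follows from the usual one-line computation for a uniform marginal:
$\operatorname{Var}(X) = E[X^2] - E[X]^2 = \int_0^1 x^2\, dx - \left(\tfrac{1}{2}\right)^2 = \tfrac{1}{3} - \tfrac{1}{4} = \tfrac{1}{12}$,
so $\sigma_X = \sqrt{1/12}$.)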
The only difficult part is the calculation of $E[XY]$:
$E[XY] = \int_0^1 \int_0^1 xy\, f_{X,Y}(x, y)\, dx\, dy$
The problem is that $f_{X,Y}$ is not continuous on $[0, 1] \times [0, 1]$: it is piecewise constant. So to calculate this integral we split it into the eight squares where the probability density function is non-zero (where it equals $2$) and add the pieces:
$\int_0^1 \int_0^1 xy\, f_{X,Y}(x, y)\, dx\, dy = \\ \int_{1/4}^{1/2} \int_0^{1/4} xy\, f_{X,Y}(x, y)\, dx\, dy + \int_0^{1/4} \int_{1/4}^{1/2} xy\, f_{X,Y}(x, y)\, dx\, dy + \\ \int_{1/4}^{1/2} \int_{1/2}^{3/4} xy\, f_{X,Y}(x, y)\, dx\, dy + \int_{1/2}^{3/4} \int_{1/4}^{1/2} xy\, f_{X,Y}(x, y)\, dx\, dy + \\ \int_{0}^{1/4} \int_{3/4}^{1} xy\, f_{X,Y}(x, y)\, dx\, dy + \int_{3/4}^{1} \int_{0}^{1/4} xy\, f_{X,Y}(x, y)\, dx\, dy + \\ \int_{1/2}^{3/4} \int_{3/4}^{1} xy\, f_{X,Y}(x, y)\, dx\, dy + \int_{3/4}^{1} \int_{1/2}^{3/4} xy\, f_{X,Y}(x, y)\, dx\, dy$
Notice that, by the symmetry of your PDF, the two integrals on each line above have the same value, so you only need to compute four integrals instead of eight. There may be a slicker way to evaluate the original integral, but I cannot think of one.
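The square-by-square computation is easy to carry out exactly in code. On each black square the integrand is $2xy$, so every piece factors as $2 \left(\int x\, dx\right)\left(\int y\, dy\right)$ over the two strips. Since the figure is not reproduced here, the black-square pattern below is an assumption: one guess consistent with uniform marginals and zero correlation, which need not match your figure cell for cell.

```python
from fractions import Fraction as F

# Hypothetical black-square pattern (the original figure is not reproduced);
# (i, j) indexes the 4x4 grid of side-1/4 cells, i along x, j along y.
BLACK = [(1, 2), (2, 1), (1, 3), (3, 1), (2, 4), (4, 2), (3, 4), (4, 3)]

def strip_moment(i):
    """Exact value of the integral of t dt over the i-th strip [(i-1)/4, i/4]."""
    a, b = F(i - 1, 4), F(i, 4)
    return (b**2 - a**2) / 2

# E[XY] = sum over black cells of 2 * (int x dx) * (int y dy)
e_xy = sum(2 * strip_moment(i) * strip_moment(j) for i, j in BLACK)
e_x = e_y = F(1, 2)  # uniform marginals on [0, 1]
cov = e_xy - e_x * e_y
print(e_xy, cov)  # 1/4 and 0, so the Pearson coefficient is 0
```

With this pattern $E[XY] = 1/4 = E[X]E[Y]$, so the covariance, and hence $\rho_{X,Y}$, is $0$, matching the claim in the question.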