Correlation between three variables

Here is my question: if A and B have a correlation of 0.8, and B and C have a correlation of 0.6, what is the range of the correlation between A and C?
Asked by Febin George (https://math.techqa.club/user/febin-george/detail)
Let $A=(a_1,\ldots,a_n)$.
Let $\bar a=(a_1+\cdots+a_n)/n$.
Let $A_\text{corrected} = (a_1,\ldots,a_n) - (\bar a,\ldots,\bar a)$.
Let $\displaystyle \operatorname{ss}_A^2 = \langle A_\text{corrected},A_\text{corrected}\rangle$, where $\langle\cdot, \cdot\rangle$ is the dot product (the usual inner product) and $\operatorname{ss}$ stands for "sum of squares". And let $\operatorname{ss}_A = \sqrt{\operatorname{ss}_A^2}$.
Then $\displaystyle \left\langle \frac{A_\text{corrected}}{\operatorname{ss}_A}, \frac{A_\text{corrected}}{\operatorname{ss}_A} \right\rangle = 1$.
And similarly for $B$ and $C$.
Then the correlation is $\displaystyle \operatorname{corr}(A,B) = \left\langle \frac{A_\text{corrected}}{\operatorname{ss}_A}, \frac{B_\text{corrected}}{\operatorname{ss}_B} \right\rangle = \cos\theta_{AB}$, where $\theta_{AB}$ is the angle between these two unit vectors.
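As a quick numerical illustration (not part of the original answer; the data below is made up), here is a minimal NumPy sketch showing that this centered-and-normalized dot product agrees with the usual Pearson correlation:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=100)                # hypothetical sample data
B = 0.8 * A + 0.6 * rng.normal(size=100)

def corr_as_cosine(x, y):
    """Correlation as the cosine of the angle between the mean-centered vectors."""
    x_c = x - x.mean()                  # the "corrected" (centered) vector
    y_c = y - y.mean()
    return np.dot(x_c, y_c) / (np.linalg.norm(x_c) * np.linalg.norm(y_c))

print(corr_as_cosine(A, B))             # cosine-of-the-angle formula
print(np.corrcoef(A, B)[0, 1])          # NumPy's Pearson correlation -- same number
```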
Then $\theta_{AB} = \arccos 0.8$ and $\theta_{BC} = \arccos 0.6$.
The correlation $\cos\theta_{AC}$ must lie between $\cos(\theta_{AB} + \theta_{BC})$ and $\cos(\theta_{AB} - \theta_{BC})$, because the angle $\theta_{AC}$ can be anything from $|\theta_{AB} - \theta_{BC}|$ up to $\theta_{AB} + \theta_{BC}$. Notice that $\theta_{AB}+\theta_{BC} = 90^\circ$ because $0.8^2 + 0.6^2 = 1$ (the Pythagorean theorem), so $\cos(\theta_{AB}+\theta_{BC}) = 0$, while $\cos(\theta_{AB}-\theta_{BC}) = \cos\theta_{AB}\cos\theta_{BC} + \sin\theta_{AB}\sin\theta_{BC} = (0.8)(0.6) + (0.6)(0.8) = 0.96$.
This puts the correlation between $0$ and $0.96$: not a very narrow range, but it does rule out a negative correlation between $A$ and $C$.
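To make those endpoints explicit, here is a short check of the arithmetic (a sketch in NumPy, using nothing beyond the angles defined above):

```python
import numpy as np

theta_AB = np.arccos(0.8)               # angle between the centered A and B vectors
theta_BC = np.arccos(0.6)               # angle between the centered B and C vectors

lower = np.cos(theta_AB + theta_BC)     # A and C as far apart as possible
upper = np.cos(theta_AB - theta_BC)     # A and C as close together as possible
print(lower, upper)                     # approximately 0.0 and 0.96
```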
Here I've dealt only with a finite sample in which each point gets a weight of $1/n$, but the same approach works for discrete distributions on infinite sets and for continuous (and other) distributions; for continuous distributions one uses integrals rather than sums.
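For example, for a continuous joint density $f_{X,Y}$ the analogue of the formula above is
$$\operatorname{corr}(X,Y) \;=\; \frac{\displaystyle\iint (x-\mu_X)(y-\mu_Y)\, f_{X,Y}(x,y)\,dx\,dy}{\sigma_X\,\sigma_Y},$$
with the means $\mu$ and standard deviations $\sigma$ likewise defined by integrals.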