Does $Corr(X, Y)$ give the slope of the line representing the linear relationship between the two?
i.e.
total trials $=100$
$N_R = 100 - N_B$
$N_B = 100 - N_R$
Thus, $Corr(X, Y) = -1$
If so, can the same be said for $Cov(X, Y)$?
---
No. Correlation measures how well the points align on a line: the closer to $\pm 1$ (the extreme values), the better the alignment. The correlation, the covariance, and the slope share the same sign, but that is all.
---
No: the slope can be greater than $1$ in absolute value, while the correlation cannot.
In simple linear regression, the slope is given by $\frac{Cov(X,Y)}{Var(X)} = r_{xy}\frac{s_y}{s_x}$, which is not, in general, equal to the correlation $r_{xy}$.
---
Nope! It's just a measure of how linear your data are -- if the regression line fits the data very well then $Cor(X,Y) \approx 1$ or $-1$.
One way to see this is to consider a dataset where your second feature is perfectly determined by the first, e.g. some data like $\{(x_i,2x_i)\}_{i=1}^n$. Clearly these data are perfectly correlated -- they all lie on the line $y = 2x$, and so
$$ Cor(X,Y) = \frac{Cov(X,2X)}{\sqrt{Var(X)Var(2X)}} = \frac{2Var(X)}{\sqrt{4Var(X)Var(X)}} = 1 $$
But the regression line (also $y = 2x$) has slope 2, and so the correlation of your covariates doesn't equal the slope of your regression line.
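The $\{(x_i, 2x_i)\}$ example above can be verified directly with the standard-library `statistics` module (Python 3.10+):

```python
import statistics

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2 * xi for xi in x]  # perfectly determined: y = 2x

r = statistics.correlation(x, y)
slope = statistics.covariance(x, y) / statistics.variance(x)
print(r, slope)  # correlation is 1, but the slope is 2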
Correlation between two variables is a measure of the strength of a linear relationship. It is always a value between $-1$ and $1$ inclusive, or in rare cases it is undefined (e.g. when all the points lie on a perfectly horizontal or vertical line, in which case one of the variables is constant and has zero variance).
The slope of a line of best fit, or regression line, is different from correlation, but they do share the same sign, which can be seen if you look at the formulas for both.