My question in full: in a regression model with more than one independent variable, why is the standardized regression coefficient of the $x$ we are interested in not the same as the correlation coefficient between that $x$ and $y$?
From Wikipedia: $$ \hat {\beta_i} = {\rm cor}(Y_i, X_i) \cdot \frac{ {\rm SD}(Y_i) }{ {\rm SD}(X_i) } $$
So $$ {\rm cor}(Y_i, X_i) = \hat {\beta_i} \cdot \frac{ {\rm SD}(X_i) }{ {\rm SD}(Y_i) } $$
The formula for the standardized regression coefficient is also $$ \text{standardizedBeta} = \hat {\beta_i} \cdot \frac{ {\rm SD}(X_i) }{ {\rm SD}(Y_i) }, $$ so shouldn't it be $$ \text{standardizedBeta} = {\rm cor}(Y_i, X_i)\,? $$ Or is there something I missed?
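To make the discrepancy concrete, here is a minimal numeric check (a sketch in Python with numpy; the simulated data, coefficients, and variable names are my own, not from any source). With a single predictor the standardized slope matches the correlation; with a second, correlated predictor it no longer does:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Two correlated predictors and a response.
x1 = rng.normal(size=n)
x2 = 0.6 * x1 + rng.normal(size=n)        # x2 is correlated with x1
y = 1.0 + 2.0 * x1 + 1.5 * x2 + rng.normal(size=n)

# Simple regression of y on x1 alone:
# the standardized slope matches cor(x1, y).
slope = np.polyfit(x1, y, 1)[0]
print(slope * x1.std() / y.std(), np.corrcoef(x1, y)[0, 1])  # equal

# Multiple regression of y on x1 and x2:
# the standardized coefficient of x1 no longer matches cor(x1, y).
X = np.column_stack([np.ones(n), x1, x2])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
print(beta[1] * x1.std() / y.std(), np.corrcoef(x1, y)[0, 1])  # differ
```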
Note that $$ \hat{\beta}_1 = r_{X,Y}\frac{s_Y}{s_X} $$ only holds for the simple model with an intercept term, i.e., for $$ y_i = \beta_0 + \beta_1x_i + \epsilon_i. $$ For multiple regression with predictors $\{x_1,...,x_p\}$ the estimated coefficients are $$ \hat{\beta} = (X'X)^{-1}X'y, $$ where the row of $X'X$ corresponding to predictor $k$ is $$ \left(\sum_i x_{ki},\ \sum_i x_{ki}x_{1i},\ \ldots,\ \sum_i x_{ki}x_{pi}\right), $$ and $X'y$ is of the following form $$ \begin{pmatrix} \sum_i y_i \\ \sum_i x_{1i}y_i \\ \vdots \\ \sum_i x_{pi}y_i \\ \end{pmatrix}. $$ Each $\hat{\beta}_k$ therefore depends on the correlations among all the predictors, not just on ${\rm cor}(Y, X_k)$; the standardized coefficient reduces to the correlation only in the special case where the predictors are mutually uncorrelated. Hence I'm not sure you can derive any insightful features by writing out the $\hat{\beta}_k$ estimator explicitly in a multiple linear regression.
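As a small sanity check of that special case (a sketch assuming numpy; the simulated data are my own): when the predictors are generated independently, the standardized coefficients computed from $(X'X)^{-1}X'y$ approximately recover the marginal correlations.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000

# Predictors generated independently, so they are (nearly) uncorrelated.
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 0.5 + 2.0 * x1 - 1.0 * x2 + rng.normal(size=n)

# Compute beta_hat = (X'X)^{-1} X'y explicitly.
X = np.column_stack([np.ones(n), x1, x2])
beta = np.linalg.solve(X.T @ X, X.T @ y)

# With (nearly) uncorrelated predictors the standardized coefficients
# approximately recover the marginal correlations.
for k, x in enumerate([x1, x2], start=1):
    print(beta[k] * x.std() / y.std(), np.corrcoef(x, y)[0, 1])
```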