Normalizing features for regression


The context of this question is regularization in regression, where features are typically standardized before fitting.

If we have $$\hat{u} = \alpha_1z_1 + \alpha_2z_2$$

where $$z_i = \frac{x_i - \bar{x_i}}{\sigma_{x_i}}$$

and we want to express $$\hat{y} = \beta_1x_1 + \beta_2x_2$$

Is it true that $\beta_i = \frac{\alpha_i - \bar{x_i}}{\sigma_{x_i}}$?

I am having trouble solving for the $\beta_i$ via a system of equations.

Edit: I should also add that $$\hat{u} = \frac{\hat{y}- \bar{y}}{\sigma_y}$$

Accepted answer:

Given $$\hat{u} = \alpha_1z_1 + \alpha_2z_2$$ and that $$\hat{u} = \frac{\hat{y}- \bar{y}}{\sigma_y}$$

we have that

$$\hat{y} - \bar{y} = (\frac{\alpha_1\sigma_y}{\sigma_{x_1}}x_1 + \frac{\alpha_2\sigma_y}{\sigma_{x_2}}x_2) - (\frac{\alpha_1\sigma_y}{\sigma_{x_1}}\bar{x_1} + \frac{\alpha_2\sigma_y}{\sigma_{x_2}}\bar{x_2})$$

Matching the coefficients on each $x_j$ gives

$$\beta_j = \frac{\alpha_j\sigma_y}{\sigma_{x_j}}$$

The remaining constant terms collect into an intercept $\beta_0 = \bar{y} - \beta_1\bar{x}_1 - \beta_2\bar{x}_2$. The model $\hat{y} = \beta_1x_1 + \beta_2x_2$ as written omits this intercept, so it implicitly assumes the data are centered (or that an intercept is carried separately). Note that the conjecture $\beta_i = \frac{\alpha_i - \bar{x_i}}{\sigma_{x_i}}$ is not correct: the means enter the intercept, not the slopes.
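The back-transformation can be checked numerically. Below is a minimal sketch using synthetic data (the feature scales and coefficients are illustrative, not from the question): fit the standardized model by least squares, recover the original-scale slopes via $\beta_j = \alpha_j\sigma_y/\sigma_{x_j}$, and compare with a direct fit on the raw data that includes an intercept.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: two features on different scales and a linear response.
x = rng.normal(size=(200, 2)) * [2.0, 5.0] + [1.0, -3.0]
y = 3.0 * x[:, 0] - 0.5 * x[:, 1] + rng.normal(size=200)

# Standardize features and response: z_i = (x_i - mean) / std, u = (y - mean) / std.
z = (x - x.mean(axis=0)) / x.std(axis=0)
u = (y - y.mean()) / y.std()

# Fit u_hat = alpha_1 z_1 + alpha_2 z_2 by least squares.
# No intercept is needed here because z and u are centered.
alpha, *_ = np.linalg.lstsq(z, u, rcond=None)

# Back-transform the slopes: beta_j = alpha_j * sigma_y / sigma_{x_j}.
beta = alpha * y.std() / x.std(axis=0)

# Direct fit on the raw data, with an intercept column.
X = np.column_stack([np.ones(len(x)), x])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

print(beta)      # slopes recovered from the standardized fit
print(coef[1:])  # slopes from the raw-data fit; these should agree
```

The intercept absorbed by centering can also be recovered as $\bar{y} - \beta_1\bar{x}_1 - \beta_2\bar{x}_2$, which matches the intercept of the raw-data fit.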