I'm having serious doubts about the formula for the standard deviation (in linear regression). My teacher and my syllabus say it's (for $\sigma_x$):
$$ \sigma_x = \sqrt{\dfrac{\displaystyle\sum (x-\bar{x})^2}{n}}$$
But my intuition tells me it's not. It tells me it should be:
$$ \sigma_x = \dfrac{\sqrt{\displaystyle\sum (x-\bar{x})^2}}{n}$$
It also worked in one exercise I did. I tried to Google this trivial question, but searching for "standard deviation" gives extremely long explanations of what it is and a thousand different formulas.
For the variance, we do indeed divide $\sum(x-\overline{x})^2$ by $n$ (or, for certain purposes, by $n-1$).
What that means is that for the standard deviation, we divide the square root of $\sum(x-\overline{x})^2$ by $\sqrt{n}$, not by $n$: recall that the standard deviation is the square root of the variance, so $\sigma_x = \sqrt{\frac{\sum(x-\overline{x})^2}{n}} = \frac{\sqrt{\sum(x-\overline{x})^2}}{\sqrt{n}}$.
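To see the difference numerically, here is a small sketch (the data set is made up for illustration) comparing the textbook formula, its equivalent $\sqrt{\sum}/\sqrt{n}$ form, and the $\sqrt{\sum}/n$ variant:

```python
import math

# Made-up sample data for illustration
xs = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
n = len(xs)
xbar = sum(xs) / n                      # sample mean
ss = sum((x - xbar) ** 2 for x in xs)   # sum of squared deviations

sigma = math.sqrt(ss / n)               # textbook: sqrt of the variance
sigma_alt = math.sqrt(ss) / math.sqrt(n)  # same value, divided by sqrt(n)
sigma_wrong = math.sqrt(ss) / n         # dividing by n gives a different number

print(sigma)        # 2.0
print(sigma_alt)    # 2.0
print(sigma_wrong)  # ~0.707
```

The first two agree exactly; dividing by $n$ instead of $\sqrt{n}$ shrinks the result by a factor of $\sqrt{n}$.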