x and y are two columns of financial data which have been standardized. Assuming one implements a simple linear regression on x and y, is it possible to observe a slope greater than 1?
I ran some numbers in Excel and cannot get the slope to ever exceed 1. Can someone please explain the mathematical reason why this is impossible?
If "standardised" means the usual location-and-scale transformation forcing each column to have mean $0$ and standard deviation $1$,
then the simple least-squares regression line has zero intercept and a slope equal to the sample correlation $r_{xy}$, which must lie in the interval $[-1,1]$.
Had you not standardised, the slope would have been $r_{xy} \dfrac{s_y}{s_x}$, but standardising makes $s_y = s_x = 1$, leaving just $r_{xy}$.
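A minimal numerical check with NumPy (the data here are simulated, not your Excel columns): after standardising, the least-squares slope coincides with the correlation coefficient, so its absolute value cannot exceed 1.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated stand-ins for two related financial series
x = rng.normal(size=1000)
y = 2.0 * x + rng.normal(size=1000)

# Standardise: subtract the mean, divide by the standard deviation
xs = (x - x.mean()) / x.std()
ys = (y - y.mean()) / y.std()

# Simple least-squares slope: cov(xs, ys) / var(xs)
slope = np.cov(xs, ys, bias=True)[0, 1] / xs.var()
r = np.corrcoef(x, y)[0, 1]

print(slope, r)  # the two values agree, and |slope| <= 1
```

Note that the raw slope on the unstandardised data would be about $2$ here; standardising rescales it down to $r_{xy}$.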