Let $\mathbf{Y} = (Y_1,...,Y_n)$ and $\mathbf{X} = (X_1,...,X_n)$ be two vectors of observations from variables $Y$ and $X$ respectively.
First, a linear regression model is fitted to relate the variables $Y$ and $X$. From $\mathbf{Y}$ and $\mathbf{X}$ we obtain a slope estimate $\hat{m}$ and an intercept estimate $\hat{b}$, where $(\hat{m},\hat{b})$ is the least-squares estimator (LSE) of $(m,b)$ in the following model: $$ Y = mX + b \ \ \ \ \ \ \ (1). $$ In a second step, if $\hat{b}$ is smaller than a certain threshold $b^*$, we assume that the intercept $b$ is null, so that the fitted values $\hat{Y}$ of $Y$ are simply: \begin{equation} \hat{Y} = \hat{m}X \ \ \ \ \ \ \ (2). \end{equation}
The goal is to find the (relative) standard deviation of the $\hat{Y}$ of equation (2), given that the slope was obtained from the LSE of equation (1). Does anyone have an idea how to proceed?
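While waiting for a closed-form answer, one way to get a numerical handle on the problem is a Monte Carlo sketch: repeatedly simulate data from model (1), fit the LSE each time, and look at the spread of $\hat{Y} = \hat{m}x_0$ at a chosen point $x_0$. All the concrete values below ($m$, $b$, $\sigma$, the design points, $x_0$, the number of replications) are hypothetical placeholders, not from the question:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical true model Y = m*X + b + noise (values are illustrative only)
m_true, b_true, sigma = 2.0, 0.1, 0.5
x = np.linspace(1.0, 10.0, 30)   # fixed design points for X
x0 = 5.0                         # point at which Y_hat = m_hat * x0 is evaluated

n_sims = 5000
y_hat = np.empty(n_sims)
for i in range(n_sims):
    # simulate one sample from model (1)
    y = m_true * x + b_true + rng.normal(0.0, sigma, size=x.size)
    # LSE of model (1): np.polyfit returns [slope, intercept]
    m_hat, b_hat = np.polyfit(x, y, 1)
    # prediction of equation (2): slope only, intercept treated as null
    y_hat[i] = m_hat * x0

sd = y_hat.std(ddof=1)           # standard deviation of Y_hat
rel_sd = sd / y_hat.mean()       # relative standard deviation
print(f"sd(Y_hat) ~ {sd:.4f}, relative sd ~ {rel_sd:.4f}")
```

For fixed design points, the classical result $\operatorname{Var}(\hat{m}) = \sigma^2 / \sum_i (X_i - \bar{X})^2$ gives $\operatorname{sd}(\hat{m}x_0) = |x_0|\,\sigma / \sqrt{\sum_i (X_i - \bar{X})^2}$ for the unconditional case, which the simulation should roughly reproduce; the harder part of the question, how the selection event $\hat{b} < b^*$ changes this, could be explored by adding that condition to the loop.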