Suppose there are $n$ data points $(x_i,y_i)$, $i=1,\dots,n$, sampled from a line in 2D modelled by $y = m_n x + b_n$, where $m_n \sim \mathcal{N}(0,\sigma^2_m)$ and $b_n \sim \mathcal{N}(0,\sigma^2_b)$. How can I estimate $\sigma_m$ and $\sigma_b$ very roughly, but in a deterministic way?
I already know the answer is not unique: $\sigma_m$ could be zero, in which case $\sigma_b$ would simply be the variance of the data. But what is the most naive (even inaccurate) way to obtain a reasonable, non-zero value for both?
Generally, this seems like a problem where a variant of the bootstrap would be quite reasonable. Given that $E[b_n] = 0$, I would recommend performing the following steps:

1. Simulate the data generating process many times, obtaining a dataset $(Y^n, X^n)$ each time.
2. Fit a linear model $\hat{Y} = \hat{m}X$ on each of the simulated datasets, and save each $\hat{m}$ for later.
3. Calculate the sample variance of the distribution of $\hat{m}$'s that you have generated.
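The steps above can be sketched in a few lines of NumPy. Since you only observe one dataset in practice, the sketch below re-draws datasets by resampling the observed pairs with replacement (a nonparametric stand-in for re-running the data generating process); the synthetic "observed" data and the slope/intercept values used to create it are purely illustrative assumptions, not part of the question:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical observed data: a single draw from an assumed noisy line.
true_m, true_b = 1.5, -0.5  # illustrative values only
x = np.linspace(-2.0, 2.0, 50)
y = true_m * x + true_b + rng.normal(scale=0.1, size=x.size)

n_boot = 2000
m_hats = np.empty(n_boot)
for i in range(n_boot):
    # Step 1: re-draw a dataset (here: resample pairs with replacement).
    idx = rng.integers(0, x.size, size=x.size)
    xb, yb = x[idx], y[idx]
    # Step 2: fit y = m*x through the origin (using E[b_n] = 0),
    # via the closed-form least-squares slope.
    m_hats[i] = np.dot(xb, yb) / np.dot(xb, xb)

# Step 3: sample standard deviation of the m-hats estimates sigma_m.
sigma_m_hat = m_hats.std(ddof=1)
print(sigma_m_hat)
```

The same loop, with an intercept included in the fit and the intercept estimates saved instead, gives a matching rough estimate of $\sigma_b$.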