I have a problem I haven't managed to research properly, so I hope you can at least point me in the right direction (or maybe even provide an answer right away?).
I know how linear regression works: it finds a straight line that minimizes the sum of the squared residuals of the given data points (ordinary least squares). Now, how does one proceed when every data point has its own deviation (error bar) attached? I have a set of data points where each point comes with a unique deviation. The linear regression should still work as usual, but how do I obtain the resulting error bars on the fitted regression line? Also, since the deviations vary from point to point, I'd assume some sort of weighted method would make more sense, so that points with small deviations are prioritized over those with large ones. Which weighting scheme would make the most sense in my case?
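To make my setup concrete, here is a minimal sketch (with made-up numbers, assuming NumPy) of the plain unweighted fit I currently do; note that the per-point deviations `sigma` are available but go completely unused, which is exactly what I'd like to change:

```python
import numpy as np

# Made-up example data: x, y, and a unique deviation sigma_i per point
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([0.1, 1.9, 4.2, 5.8, 8.1])
sigma = np.array([0.1, 0.3, 0.2, 0.5, 0.1])  # currently ignored by the fit

# Plain least squares: minimizes sum_i (y_i - (a*x_i + b))^2
a, b = np.polyfit(x, y, deg=1)
print(a, b)  # slope and intercept, with no error bars and no weighting
```

My question is essentially how to bring `sigma` into this fit and how to get uncertainties on `a` and `b` back out.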
Thanks a lot in advance for any replies!