I built a regression model estimating $y(x)$.
I found that $\alpha = 8.2$ and $\beta = 7.5$, and I am trying to compare my results to the actual $y(x)$ given in the data in order to draw a conclusion about the error in my regression model.
I tried building a graph with trendlines that compares the two, but I'm having difficulty drawing any conclusions.
The equations for the trendlines are as follows:
$y=0.5x+340$ for the actual y(x)
$y=1.55x+350$ for the regression model
How can I compare my regression model with the actual data to estimate the error?
Short answer: there is no single way of looking at error; it depends on your application. But there are a few popular ways of quantifying it, the most common of which is the Mean Squared Error (MSE).
Let $\bar{y}_i = \bar{y}(x_i)$ be the prediction and $y_i = y(x_i)$ be the actual value. Then
$$\text{MSE} = \frac{1}{N}\sum_{i=1}^N (\bar{y}_i - y_i)^2$$
Another common choice is the Mean Absolute Error (MAE):
$$\text{MAE} = \frac{1}{N}\sum_{i=1}^N |\bar{y}_i - y_i|$$
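Both metrics are a few lines of plain Python. Here is a minimal sketch; the data values below are made up purely for illustration:

```python
y_actual = [1.0, 2.0, 3.0]   # actual y(x_i) from the data (hypothetical values)
y_pred   = [1.5, 2.5, 2.0]   # model predictions ybar(x_i) (hypothetical values)

n = len(y_actual)
# MSE = (1/N) * sum((ybar_i - y_i)^2)
mse = sum((p - a) ** 2 for p, a in zip(y_pred, y_actual)) / n
# MAE = (1/N) * sum(|ybar_i - y_i|)
mae = sum(abs(p - a) for p, a in zip(y_pred, y_actual)) / n

print(mse)  # 0.5
print(mae)
```

Note that you would compute these over the residuals $\bar{y}_i - y_i$ at each observed $x_i$ in your data, not from the two fitted trendline equations alone.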
There are many other error metrics (e.g. RMSE, MAPE, $R^2$); you can search online for one that suits your application.