I'm attempting to plot the error of a simulation against an analytical solution, and I'm curious whether the problem I'm seeing is intrinsic to my model or to the way I am plotting the results.
The idea is that my log-log plot should produce a straight line with a slope of 1, showing that the rate of convergence is first order. I am currently referring to *Finite Element Analysis: From Concepts to Applications* by David S. Burnett, Chapter 9.
My analytical solution is as follows:
$$ f(x) = \frac{3\times 10^{8}\, x}{6}\,(0.01 - x) + (600 - 300)\,\frac{x}{0.01} + 300 $$
Where $x$ is the position along a 1-D bar of length 0.01 m. I then compute the error as:
$$ \text{Error} = \log_{10}\left(\frac{U-\tilde{U}}{U}\right) $$
Where $U$ is the analytical solution and $\tilde{U}$ is the simulated solution.
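In code, the analytical solution and the error I'm computing look roughly like this (a minimal sketch; the function names are just illustrative):

```python
import math

def analytical(x):
    """Analytical solution along the 1-D bar (length 0.01 m)."""
    return (3e8 * x) / 6 * (0.01 - x) + (600 - 300) * x / 0.01 + 300

def log_error(u_sim, x):
    """log10 of the relative error at position x."""
    u = analytical(x)
    return math.log10((u - u_sim) / u)

x = 0.00252242
print(analytical(x))             # ~1318.752467 (t_sol below)
print(log_error(375.672600, x))  # ~-0.145615 (t_err for the coarsest model)
```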
Below is a table of results from my simulation, evaluated at position $x = 0.00252242$. `pos_random` is the simulated solution at that position, `t_sol` is the analytical solution, and `t_err` is the error.
|   | model_id | log_model_id | pos_random  | t_sol       | t_err      | slope    |
|---|----------|--------------|-------------|-------------|------------|----------|
| 1 | 1        | 0.00000      | 375.672600  | 1318.752467 | -0.145615  | 0.000000 |
| 1 | 2        | -0.30103     | 1006.277600 | 1318.752467 | -0.625348  | 1.593640 |
| 1 | 4        | -0.60206     | 1315.975100 | 1318.752467 | -2.676530  | 6.813878 |
| 1 | 8        | -0.90309     | 1317.376350 | 1318.752467 | -2.981508  | 1.013115 |
| 1 | 16       | -1.20412     | 1318.076975 | 1318.752467 | -3.290543  | 1.026592 |
| 1 | 10000    | -4.00000     | 1318.752467 | 1318.752467 | -10.251518 | 2.489726 |
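For reference, the `slope` column is the local slope between consecutive points on the log-log plot, i.e. the difference quotient of `t_err` with respect to `log_model_id`. A minimal sketch reproducing it from the table values:

```python
# Values copied from the table above
log_model_id = [0.00000, -0.30103, -0.60206, -0.90309, -1.20412, -4.00000]
t_err = [-0.145615, -0.625348, -2.676530, -2.981508, -3.290543, -10.251518]

# Local slope between consecutive refinement levels
slopes = [(t_err[i] - t_err[i - 1]) / (log_model_id[i] - log_model_id[i - 1])
          for i in range(1, len(t_err))]
print(slopes)  # matches the slope column (first row of the table has no predecessor)
```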
This data produces the plot below:
As you can see, the plot is not a straight line. This contradicts the text I am reading, and it does not indicate first-order convergence, since the slope should be near 1. Is there something wrong with my approach, or is this behavior likely intrinsic to my model design? Any help is greatly appreciated. If this is the wrong exchange, please let me know.
