I'm trying to calculate the box dimension of the boundary of the Mandelbrot set, but I'm getting the wrong numbers. I'm using the Wikipedia definition of the box dimension, dim_box(S) = lim_{ε→0} log(N(ε)) / log(1/ε), where ε is the box side length and N(ε) is the number of boxes of side ε required to cover S.
When I get down to box side ε = 0.01 (i.e. every pixel on the boundary of the Mandelbrot set is one box), the box count is 17,964, which gives dim_box ≈ 2.13, which looks correct.
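To spell out that single-scale estimate (any log base works, since it cancels in the ratio):

```python
import math

# N(eps) = 17964 boxes at eps = 0.01
d = math.log(17964) / math.log(1 / 0.01)
print(d)  # ≈ 2.13
```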
However, as far as I understand it, you can't really compute dim_box from a single very small value of ε; it should instead be the trend seen in a linear regression when plotting log(N(ε)) on the y-axis against log(1/ε) on the x-axis for many values of ε. When I do this, I have the points
(log(1/2),log(45))
(log(1/1),log(122))
(log(1/0.5),log(315))
(log(1/0.01),log(17964))
which, when plotted, give this result (plot of points).
However, this slope ends up being approximately 1.2, whereas by definition the box dimension should be at least 2, since dim_H ≤ dim_box and the Hausdorff dimension of the boundary of the Mandelbrot set is 2 (Shishikura's theorem).
Would it be valid to just use the smallest value of ε, or is it required to have all the values leading up to it and do the linear regression?
Did I make any mistakes in my calculations that are giving me this inaccuracy?
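In case it matters, here is a minimal sketch of the kind of box counting I mean. The escape-time test and the corner-sampling boundary criterion (a box counts if its corners disagree about escaping) are simplifications, not necessarily exactly what my actual code does:

```python
import math

def escapes(c, max_iter=200):
    # Escape-time test: True if the orbit of 0 under z -> z^2 + c leaves |z| <= 2.
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return True
    return False

def count_boundary_boxes(eps, max_iter=200):
    # Cover [-2.5, 1.5] x [-2, 2] with boxes of side eps; count a box as a
    # boundary box when its four corner samples disagree about escaping.
    # (Corner sampling is crude and can miss thin filaments, so this
    # under-counts N(eps) at small eps.)
    nx = math.ceil(4.0 / eps)
    ny = math.ceil(4.0 / eps)
    count = 0
    for i in range(nx):
        x = -2.5 + i * eps
        for j in range(ny):
            y = -2.0 + j * eps
            corners = [escapes(complex(x + dx, y + dy), max_iter)
                       for dx in (0.0, eps) for dy in (0.0, eps)]
            if any(corners) and not all(corners):
                count += 1
    return count

print(count_boundary_boxes(0.1, max_iter=50))
```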