Suppose I have a function $f(x)$ that is to be integrated over the interval $[a,b]$ using step size $h$, where $b = a + 2n\cdot h$ and $n \in \mathbb{N}$.
I am unable to understand how exactly the ordinary Simpson's rule has an order of accuracy of 3 in some cases but 4 in others (supposedly, the higher order holds when the points are distributed symmetrically in the interval). I tried looking this up online, but either there is no explicit proof or it isn't mentioned at all. Is there a book or other resource that explains this well?
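For concreteness, here is a quick numerical experiment I tried (a minimal sketch using the standard composite Simpson's rule, with $\int_0^\pi \sin x\,dx = 2$ as the test integral): halving $h$ reduces the error by roughly $2^4$, which is the fourth-order behaviour I would like to see explained.

```python
import math

def composite_simpson(f, a, b, n):
    """Composite Simpson's rule on [a, b] with 2n subintervals of width h = (b - a) / (2n)."""
    h = (b - a) / (2 * n)
    s = f(a) + f(b)
    for i in range(1, 2 * n):
        # Interior points alternate with weights 4 (odd index) and 2 (even index)
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

# Errors at two step sizes, h and h/2, for integrating sin on [0, pi]
e1 = abs(composite_simpson(math.sin, 0.0, math.pi, 8) - 2.0)
e2 = abs(composite_simpson(math.sin, 0.0, math.pi, 16) - 2.0)

# Observed order of accuracy: log2 of the error ratio when h is halved
order = math.log(e1 / e2, 2)
print(order)  # close to 4 for this symmetric composite rule
```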