Suppose I have a function $f:[a, b] \to \mathbb{R}$ whose values I know at evenly spaced grid points $x_0, x_1, \ldots, x_{n - 1}$, where $x_i = a + i\Delta x$ and $\Delta x = (b - a)/(n - 1)$. Call the values the function takes at these points $f_i = f(x_i)$. Say we want to estimate $\left. d^4 f/dx^4 \right|_i$. In general, we would Taylor expand $f$ about some number of neighboring grid points to create a system of equations, then combine them to cancel the lower-derivative terms (and some higher-derivative terms, depending on what order we care to make the approximation). A second-order central difference approximation yields:

$$ \left. \frac{d^4 f}{dx^4} \right|_i = \frac{f_{i - 2} - 4f_{i - 1} + 6f_i - 4f_{i + 1} + f_{i + 2}}{\Delta x^4} + \mathcal{O}(\Delta x^2) $$

Now, if we have Neumann boundary conditions, it seems we have one of two options at the $i = 0$ and $i = 1$ points: either we can sample more points to the right and redo the Taylor expansion procedure (noting that we do not have to cancel the first-derivative term at the boundary, because we know its value), or we can estimate the values the function would take outside the interval by applying a finite difference scheme for the first derivative at $x = a$:

$$ \left. \frac{d f}{dx}\right|_i = \frac{f_{i + 1} - f_{i - 1}}{2 \Delta x} + \mathcal{O}(\Delta x^2) $$

If this derivative is zero, we get $f_{i + 1} = f_{i - 1}$, which for $i = 0$ gives $f_{-1} = f_{1}$. In this way, we have added "ghost points" to our grid, and we may use the central finite difference scheme to estimate the fourth derivative at $i = 1$. I assume something similar can be done to estimate $f_{-2}$ to second order.
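To make the ghost-point procedure concrete, here is a minimal sketch in Python/NumPy. The test function $f(x) = \cos(\pi x)$ on $[0, 1]$ is my own choice (it satisfies the homogeneous Neumann conditions $f'(0) = f'(1) = 0$ and has a known fourth derivative $\pi^4 \cos(\pi x)$); the ghost values use the even extension $f_{-1} = f_1$, $f_{-2} = f_2$ implied by the zero-derivative condition:

```python
import numpy as np

# Test problem: f(x) = cos(pi x) on [0, 1] has f'(0) = f'(1) = 0
# (homogeneous Neumann) and an exact fourth derivative pi^4 cos(pi x).
n = 101
x = np.linspace(0.0, 1.0, n)
dx = x[1] - x[0]
f = np.cos(np.pi * x)

# Ghost points from the Neumann condition f'(a) = 0: the central
# difference (f_1 - f_{-1}) / (2 dx) = 0 gives f_{-1} = f_1, and the
# analogous wider stencil gives f_{-2} = f_2 (an even extension).
f_ext = np.concatenate(([f[2], f[1]], f))  # [f_{-2}, f_{-1}, f_0, f_1, ...]

# Second-order central stencil for the fourth derivative at i = 0 and
# i = 1; indices shift by 2 in the extended array.
d4 = np.empty(2)
for i in range(2):
    j = i + 2  # index into f_ext corresponding to grid index i
    d4[i] = (f_ext[j - 2] - 4 * f_ext[j - 1] + 6 * f_ext[j]
             - 4 * f_ext[j + 1] + f_ext[j + 2]) / dx**4

exact = np.pi**4 * np.cos(np.pi * x[:2])
print(np.max(np.abs(d4 - exact)))  # O(dx^2) truncation error
```

Note that for this particular test function the even extension is exact ($\cos$ is even about $x = 0$), so the printed error is purely the stencil's truncation error; for a generic function the ghost values would carry their own $\mathcal{O}(\Delta x^2)$ error as well.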
I have two questions about this: first, am I understanding the semi-rigorous underpinnings of this correctly, or is there a different way to understand adding these ghost points? Second, which method ought one choose in a given situation, and why? I have found the ghost-point method to be notably more accurate, though I couldn't tell you why.