I have been trying to apply the finite difference method to a diffusion equation (heat, population, etc.) with a generating (source) term at a specific point inside the material. My problem is specifically with defining a conservative Neumann (no-flux) boundary condition at the edges, say $x=0$ and $x=L$ for 1D diffusion. In other words, for heat diffusion I am requiring that all of the heat generated in the material stays inside forever (perfectly insulated walls).
However, when trying to find a representation for the second derivative at the boundary I ran into an interesting problem.
Using the standard finite difference method at the boundary $x=0$, where $f'(0)=0$, I have:
$$ f''(x)=\frac {f'(x+h)-f'(x)}{h}=\frac {f'(x+h)}{h}=\frac {f(x+h)-f(x)}{h^2}, $$
where $x=0$ and the last step approximates $f'(x+h)$ by the one-sided difference over $[x,\,x+h]$.
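As a quick sanity check on that chain of differences, the two forms agree numerically; a minimal sketch, using $\cos$ as a stand-in test function with zero slope at $x=0$:

```python
import numpy as np

x, h = 0.0, 0.1
f = np.cos                           # test function with f'(0) = 0

# one-sided difference for f'(x+h); f'(x) itself is 0 by the boundary condition
fprime_xh = (f(x + h) - f(x)) / h
chained   = (fprime_xh - 0.0) / h    # (f'(x+h) - f'(x)) / h
collapsed = (f(x + h) - f(x)) / h**2 # single collapsed expression
# chained and collapsed coincide, as the algebra above says they must
```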
However, using the Taylor expansion method, I have:
$$ af(x)+bf(x+h)=(a+b)f(x)+hbf'(x)+b\frac {h^2}{2}f''(x)+O(h^3) $$
With the requirements that:
$$ a+b=0,\qquad f'(x)=0,\qquad b\,\frac{h^2}{2}=1 \;\Rightarrow\; b=\frac {2}{h^2},\quad a=-\frac {2}{h^2} $$
Gives the following results for the second derivative:
$$ f''(x)=\frac {2}{h^2}(f(x+h)-f(x))+O(h) $$
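The coefficient conditions can also be checked mechanically by solving the small linear system they define; a quick sketch (the matrix rows encode the constant-term and $f''$-coefficient conditions, with $f'(x)=0$ already imposed; the step size $h$ here is arbitrary):

```python
import numpy as np

h = 0.1  # arbitrary step size, for illustration only

# Conditions on a*f(x) + b*f(x+h), with f'(x) = 0 already imposed:
#   constant term vanishes:      a + b        = 0
#   f'' coefficient equals one:  b * h**2 / 2 = 1
A = np.array([[1.0, 1.0],
              [0.0, h**2 / 2.0]])
a, b = np.linalg.solve(A, np.array([0.0, 1.0]))
# a == -2/h**2 and b == 2/h**2, matching the stencil above
```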
Notice that there is a factor of two difference between the two methods.
In numerical simulations I've found that only the first version keeps all of the heat in the material, whereas the second one slowly removes heat from the system.
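The simulations I ran boil down to something like the following sketch: explicit FTCS on a uniform grid, a unit source at the middle node each step, the no-flux boundary applied with either stencil, and total heat measured as the plain sum of node values (the names `simulate`, `boundary_factor`, and the grid parameters here are illustrative choices of mine, not anything canonical):

```python
import numpy as np

def simulate(boundary_factor, n=51, steps=2000, r=0.25, source=1.0):
    """Explicit FTCS diffusion with a point source at the middle node.
    The no-flux boundary is discretized as
        u_0 += boundary_factor * r * (u_1 - u_0)
    (and mirrored at the far end): boundary_factor = 1 is the standard
    stencil above, boundary_factor = 2 the Taylor-expansion one.
    Returns the drift of the plain nodal sum relative to injected heat."""
    u = np.zeros(n)
    mid = n // 2
    injected = 0.0
    for _ in range(steps):
        new = u.copy()
        new[1:-1] += r * (u[2:] - 2.0 * u[1:-1] + u[:-2])  # interior update
        new[0]    += boundary_factor * r * (u[1] - u[0])   # no-flux at x = 0
        new[-1]   += boundary_factor * r * (u[-2] - u[-1]) # no-flux at x = L
        new[mid]  += source          # heat generated at one interior point
        injected  += source
        u = new
    return u.sum() - injected        # 0 for an exactly conservative scheme

drift1 = simulate(1)  # standard stencil
drift2 = simulate(2)  # Taylor stencil
```

With these (illustrative) parameters, the `boundary_factor = 1` run keeps the plain sum fixed up to rounding error, while the `boundary_factor = 2` run drifts away from the injected total, which is the non-conservative behavior I described.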
Is there something wrong with my treatment of the Taylor method? I've tried going to higher-order terms in the Taylor method and still end up with a non-conservative result.