In my textbook's statement of the second derivative test for local extrema, it says that $f''$ needs to be continuous near a critical point $c$.
What I figured out is that it is enough for it to be continuous at the point $c$ itself, not in a whole neighborhood of it. Suppose $f''$ is continuous at $c$ with $f''(c) = a > 0$; this means $\lim\limits_{x\to c}f''(x)$ exists and equals $f''(c)$ (for this limit to make sense, $f''$ must at least be defined near $c$). I choose $\varepsilon$ small enough that $f''(c)-\varepsilon>0$, and of course $f''(c)+\varepsilon>0$. Since the limit exists, there is a $\delta>0$ such that $|x-c|<\delta$ implies $|f''(x)-f''(c)|<\varepsilon$. This can be rewritten as
$$f''(c)-\varepsilon<f''(x)<f''(c)+\varepsilon$$
And since I chose $\varepsilon$ small enough that both bounds are positive, $f''(x)>0$ everywhere on the interval $(c-\delta , c+\delta)$, so $f'$ is strictly increasing there. Since $f'(c)=0$, it follows that $f'(x)<0$ on $(c-\delta, c)$ and $f'(x)>0$ on $(c, c+\delta)$, so $f$ has a local minimum at $c$.
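To make the $\varepsilon$-$\delta$ step concrete, here is a small numerical sketch with an example of my own choosing (not from the textbook): $f(x)=x^2+x^3$, $c=0$, so $f''(x)=2+6x$ and $f''(0)=2>0$. Taking $\varepsilon=1$ gives $f''(0)-\varepsilon=1>0$, and $\delta=1/6$ works because $|f''(x)-f''(0)|=6|x|<1$ whenever $|x|<1/6$:

```python
# Numerical check of the epsilon-delta argument.
# Assumed example (mine, not the textbook's): f(x) = x^2 + x^3, c = 0,
# so f''(x) = 2 + 6x and f''(0) = 2 > 0.

def f2(x):
    """Second derivative of f(x) = x**2 + x**3."""
    return 2 + 6 * x

c = 0.0
eps = 1.0        # chosen so that f''(c) - eps = 1 > 0
delta = eps / 6  # |f''(x) - f''(c)| = 6|x| < eps whenever |x| < delta

# Sample points strictly inside (c - delta, c + delta)
# and verify the two-sided bound and positivity.
n = 1000
points = [c - delta + 2 * delta * (k + 0.5) / n for k in range(n)]
assert all(f2(c) - eps < f2(x) < f2(c) + eps for x in points)
assert all(f2(x) > 0 for x in points), "f'' stays positive on the interval"
print("minimum sampled f'' on (c-delta, c+delta): %.4f" % min(f2(x) for x in points))
```

The assertions pass, confirming that on this interval $f''$ is trapped between $f''(c)-\varepsilon$ and $f''(c)+\varepsilon$, and in particular is strictly positive, exactly as the argument above claims.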
So who is right: the textbook, which requires $f''$ to be continuous near (around) $c$, or my idea that it is enough for $f''$ to be continuous at $c$ itself? A similar argument applies when $f''(c) = a < 0$.