Let the function $f \in C^2(\mathbb{R}^n;\mathbb{R})$ have a local maximum at the point $a \in \mathbb{R}^n$.
How can one prove the following with Taylor's theorem:
Claim: $y^tH_f(a)y \leq 0$ for all $y \in \mathbb{R}^n$, i.e. the Hessian matrix of $f$ at the point $a$ is negative semidefinite (necessary second-order condition).
I only know the analogous statement for minima: if a function $f \in C^2(\mathbb{R}^n;\mathbb{R})$ has a local minimum at $a \in \mathbb{R}^n$, then $H_f(a)$ is positive semidefinite.
I tried proving it using the table, but I don't know how to show $y^tH_f(a)y \leq 0$ with Taylor's theorem.

Abridged proof. Taylor's theorem asserts that $f(x + h) = f(x) + f'(x) \cdot h + \dfrac{1}{2} f''(x) \cdot (h, h) + o(\|h\|^2)$. Suppose $f''(x)$ is negative definite, meaning $f''(x) \cdot (h, h) < 0$ for all $h \neq 0$. Fix a radius $\delta > 0$; by compactness of the sphere $\{\|h\| = \delta\}$, the maximum $\beta$ of $f''(x) \cdot (h, h)$ over that sphere is attained, and $\beta < 0$. By homogeneity, $f''(x) \cdot (h, h) \leq \dfrac{\beta}{\delta^2} \|h\|^2$ for all $h$. By definition of the little-$o$ notation, there is a $\delta_1 \in (0, \delta]$ such that $|o(\|h\|^2)| \leq \dfrac{-\beta}{4\delta^2} \|h\|^2$ for all $\|h\| \leq \delta_1$. Putting everything together (recall $f'(x) = 0$ at a critical point $x$), for all $0 < \|h\| \leq \delta_1$: $f(x+h) \leq f(x) + \dfrac{\beta}{2\delta^2} \|h\|^2 + \dfrac{-\beta}{4\delta^2} \|h\|^2 = f(x) + \dfrac{\beta}{4\delta^2} \|h\|^2 < f(x).$ So $x$ is a strict local maximum; note that this establishes the sufficient condition (negative definite $\Rightarrow$ strict local maximum), the converse direction of the semidefiniteness claim. Q.E.D.
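For the necessary direction the question actually asks about, a standard sketch restricts Taylor's theorem to a single direction $y$ at a time (assuming, as holds at any interior local maximum of a $C^1$ function, that $f'(a) = 0$); the following display is one way to lay that argument out:

```latex
% Fix y in R^n and expand f along the line t |-> a + t y (Taylor, one variable):
\[
f(a + t y) \;=\; f(a) \;+\; \frac{t^2}{2}\, y^t H_f(a)\, y \;+\; o(t^2),
\qquad t \to 0,
\]
% where the first-order term vanishes because f'(a) = 0 at the local maximum.
% Since a is a local maximum, f(a + t y) - f(a) <= 0 for all small t, so
\[
\frac{t^2}{2}\, y^t H_f(a)\, y + o(t^2) \;\le\; 0
\quad\Longrightarrow\quad
y^t H_f(a)\, y \;\le\; -\,\frac{o(t^2)}{t^2/2}.
\]
% Letting t -> 0, the right-hand side tends to 0, hence
\[
y^t H_f(a)\, y \;\le\; 0 \qquad \text{for every } y \in \mathbb{R}^n .
\]
```

Dividing by $t^2/2 > 0$ before taking the limit is what eliminates the $o(t^2)$ remainder; since $y$ was arbitrary, this gives negative semidefiniteness of $H_f(a)$.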