The answer to this question says that a Maclaurin polynomial of a function $f$ is the unique $N^\text{th}$-order polynomial that minimizes the following functional:
$$ L[g] = \lim_{x\rightarrow 0}\left(\frac{f(x)-g(x)}{x^N}\right)^2, $$ and that the minimum value attained is $0$, $$ \min_{g} L[g] = 0 $$
I'm wondering if anyone has a reference for a proof of this fact in a forward sense, i.e., starting from the above, showing that $g$ must equal the formula for the Maclaurin series of $f$, without directly invoking some form of Taylor's theorem to expand $f$. Basically, assuming we knew nothing about Taylor/Maclaurin series or polynomials, can we derive them from the above minimization problem?
Possible path
We know $g$ is an $N^\text{th}$ order polynomial: $$ g(x) = \sum_{n=0}^N a_n x^n = \vec{X}\cdot\vec{a}, $$ where $$ \vec{X} = \left[1\;\; x\;\; x^2\;\;\ldots\;\;x^N\right]^\text{T}, $$ and $$ \vec{a} = \left[a_0\;\; a_1\;\; a_2\;\;\ldots\;\;a_N\right]^\text{T}. $$ Now I know I'll need to know the sensitivities of that functional to the parameters: $$ \nabla_\vec{a} L[g] = \left[ \frac{\partial L[g]}{\partial a_0}\;\; \frac{\partial L[g]}{\partial a_1}\;\; \frac{\partial L[g]}{\partial a_2}\;\;\ldots\;\; \frac{\partial L[g]}{\partial a_N} \right]^\text{T} $$
And, through application of the chain rule, I'll also need $$ \nabla_\vec{a} g = \vec{X}. $$ Setting each component of the gradient of the functional w.r.t. $\vec{a}$ to zero should give me a way to solve for $g$; I just don't see a clear path forward that results in the formula for a Taylor polynomial. I hope someone can illuminate the way.
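As an aside, a quick symbolic experiment suggests why the gradient approach is delicate: $L[g]$ is $+\infty$ for almost every choice of $\vec{a}$ (any mismatch in a lower-order coefficient makes the limit diverge), so $L$ is not differentiable in the coefficients in any ordinary sense. A minimal sketch using sympy, where $f = \cos x$ and $N = 2$ are arbitrary example choices:

```python
import sympy as sp

x = sp.symbols('x')
N = 2
f = sp.cos(x)  # example function (an arbitrary choice for illustration)

def L(coeffs):
    """Evaluate L[g] for g(x) = sum_n a_n x^n as a limit at 0."""
    g = sum(c * x**n for n, c in enumerate(coeffs))
    return sp.limit(((f - g) / x**N)**2, x, 0)

L([1, 0, sp.Rational(-1, 2)])  # Maclaurin coefficients of cos: L is finite (0)
L([1, 1, sp.Rational(-1, 2)])  # wrong a_1: L diverges to oo
```

So the feasible set (where $L$ is even finite) is a thin slice of coefficient space, which is why setting $\nabla_\vec{a} L = 0$ directly doesn't lead anywhere obvious.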
Suppose $f(0) \ne g(0)$. Then as $x \to 0$ the numerator tends to the nonzero value $f(0) - g(0)$ while the denominator tends to zero, so the limit diverges. Therefore, $L[g]$ can only be finite if $f(0) = g(0)$.
With $f(0) = g(0)$, the fraction inside the square is of the indeterminate form $0/0$, so we may apply L'Hôpital's rule: $$L[g] = \lim_{x\to0}\left(\frac{f'(x)-g'(x)}{Nx^{N-1}}\right)^2.$$ By the same logic, for $L[g]$ to be finite we must have $f'(0) = g'(0)$.
Repeat $N$ times (assuming $f$ is $N$ times differentiable with $f^{(N)}$ continuous at $0$), obtaining the conditions $f(0) = g(0)$, $f'(0) = g'(0)$, $f''(0) = g''(0)$, ..., $f^{(N-1)}(0) = g^{(N-1)}(0)$, until finally $$\begin{align} L[g] &= \lim_{x\to0}\left(\frac{f^{(N)}(x)-g^{(N)}(x)}{N!}\right)^2 \\ &= \frac1{(N!)^2}\bigl(f^{(N)}(0) - g^{(N)}(0)\bigr)^2, \end{align}$$ which is clearly minimized, with value $0$, at $f^{(N)}(0) = g^{(N)}(0)$. Therefore, $f$ and $g$ must have the same $0$th to $N$th derivatives at $0$; since $g^{(n)}(0) = n!\,a_n$, this forces $a_n = f^{(n)}(0)/n!$, which is exactly the Maclaurin formula.
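As a sanity check on the argument, here is a small sympy sketch (with $f = e^x$ and $N = 3$ as arbitrary example choices): $L[g] = 0$ at the Maclaurin polynomial, $L[g]$ diverges when a lower-order coefficient is wrong, and shifting only the top coefficient $a_N$ by $\delta$ shifts $g^{(N)}(0)$ by $N!\,\delta$, leaving the finite residual $(N!\,\delta / N!)^2 = \delta^2$:

```python
import sympy as sp

x = sp.symbols('x')
f = sp.exp(x)  # example function (an arbitrary choice for illustration)
N = 3

# Maclaurin polynomial of order N, built from derivatives at 0
g = sum(f.diff(x, n).subs(x, 0) / sp.factorial(n) * x**n for n in range(N + 1))

L = lambda h: sp.limit(((f - h) / x**N)**2, x, 0)

L(g)                              # 0: the minimum is attained
L(g + sp.Rational(1, 10))         # oo: a wrong a_0 makes L diverge
L(g + sp.Rational(1, 10) * x**N)  # 1/100: a_N off by 1/10 leaves residual (1/10)^2
```

The last line matches the $\frac1{(N!)^2}\bigl(f^{(N)}(0) - g^{(N)}(0)\bigr)^2$ formula above, since perturbing $a_N$ by $\tfrac1{10}$ perturbs $g^{(N)}(0)$ by $\tfrac{N!}{10}$.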