This problem might be trivial, but when solving it with the calculus of variations it turns out not to be so obvious.
Suppose we have the fixed boundary conditions $f(a) = f(b) = 0$ and we want to find the shortest distance between two points, so we choose $f$ to minimize the functional $$ I(f) = \int_a^b \sqrt{1+f'(x)^2}\, dx. $$ We solved this in class and got $f'' = 0$, which gives a straight line. However, could we have instead solved the problem $$ \min J(f) = \int_a^b f'(x)^2\, dx, $$ since its integrand is a monotonic transformation of the original integrand? I know this works in ordinary calculus, but I'm not sure about it when dealing with functionals. It gives me the same answer ($f'' = 0$), but is this just a coincidence? The second problem involves significantly less work. Thanks!
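(For what it's worth, the claim that both functionals give the same Euler–Lagrange equation can be checked symbolically; here is a quick SymPy sketch, where the names `F1`, `F2`, `line` are just my choices:)

```python
import sympy as sp
from sympy.calculus.euler import euler_equations

x = sp.symbols('x')
y = sp.Function('y')

# The two integrands: arclength, and the "monotone transform" y'^2
F1 = sp.sqrt(1 + y(x).diff(x)**2)
F2 = y(x).diff(x)**2

# Euler-Lagrange equation of each functional, returned as Eq(expr, 0)
el1 = euler_equations(F1, y(x), x)[0]
el2 = euler_equations(F2, y(x), x)[0]

# Any straight line y = a*x + b satisfies both equations
a, b = sp.symbols('a b')
line = a*x + b
r1 = sp.simplify(el1.lhs.subs(y(x), line).doit())
r2 = sp.simplify(el2.lhs.subs(y(x), line).doit())
print(r1, r2)  # 0 0
```

Both residuals vanish, consistent with both problems having $f'' = 0$ (straight lines) as extremals.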
This is an extended comment, not an answer, just to share my thoughts.
You mention that monotone transformations of functions yield equivalent optimization problems. For ordinary functions this is true: taking $g$ to be monotone increasing, so that $g'(t)>0$ for all $t$ in its domain, we have $$\frac{d}{dx}g(f(x))=g'(f(x))f'(x)=0\iff f'(x)=0$$ since $g'$ is never zero.
But I simply cannot find a way to extend this argument to the case of variational problems. If we have a solution to the transformed problem with $g(f(x,y,y'))$ (in your case $g(t)=t^2-1$, which is increasing on $t\ge 0$, and the integrand $\sqrt{1+y'^2}$ takes values $t\ge 1$), then we know that $$\frac{\partial }{\partial y}g\big(f(x,y,y')\big)=\frac{d}{dx}\frac{\partial }{\partial y'}g\big(f(x,y,y')\big),$$ which expands to $$g'(f)\frac{\partial f}{\partial y}=g''(f)\frac{df}{dx}\frac{\partial f}{\partial y'}+g'(f)\frac{d}{dx}\frac{\partial f}{\partial y'}.$$ We want this to imply that $$\frac{\partial f}{\partial y}=\frac{d}{dx}\frac{\partial f}{\partial y'},$$ so that we can conclude (under suitable conditions) that we have optimized the original problem. But I can find no way to realize that implication given only that $g'(t)>0$.
Clearly the proof is done if we have $g''(t)\equiv 0$, that is, if $g$ is a linear function. Unfortunately yours is quadratic.
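Indeed, the implication can genuinely fail when $g''\neq 0$ and the integrand depends on $y$. Here is a small SymPy check with an integrand of my own choosing, $F=y^2+y'^2$ (purely illustrative, not from the original problem): $y=e^x$ solves the Euler–Lagrange equation of $\int F\,dx$ but not that of $\int F^2\,dx$, even though $g(t)=t^2$ is increasing on the (nonnegative) range of $F$.

```python
import sympy as sp
from sympy.calculus.euler import euler_equations

x = sp.symbols('x')
y = sp.Function('y')

# A hypothetical integrand with explicit y-dependence
F = y(x)**2 + y(x).diff(x)**2   # Euler-Lagrange equation: y'' = y
G = F**2                        # transformed by g(t) = t^2, increasing for t >= 0

el_F = euler_equations(F, y(x), x)[0]
el_G = euler_equations(G, y(x), x)[0]

# y = exp(x) satisfies y'' = y ...
r_F = sp.simplify(el_F.lhs.subs(y(x), sp.exp(x)).doit())
# ... but not the transformed Euler-Lagrange equation
r_G = sp.simplify(el_G.lhs.subs(y(x), sp.exp(x)).doit())
print(r_F)  # 0
print(r_G)  # nonzero (proportional to exp(3*x))
```

So the agreement in your arclength example really does depend on its special structure (no explicit $y$-dependence), not on monotonicity of $g$ alone.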