Consider the functions $f, g : \mathbb{R}^2 \rightarrow \mathbb{R}$ defined by
\begin{align*} f(x, y) & = x^2 - 7y^2 - 1 \\ g(x, y) & = x^2 + (y + 24)^2 \end{align*}
I have to determine whether a maximum or a minimum of $g(x,y)$ exists subject to the constraint $f(x,y) = 0$, using Lagrange multipliers.
My solution:
We have the Lagrange function
$$ \mathcal{L} = x^2 + (y + 24)^2 + \lambda(x^2 - 7y^2 - 1) $$
Thus, the first order conditions:
\begin{align} \frac{\partial \mathcal{L}}{\partial y} & = 2 (y + 24) - 14 \lambda y = 0 \\ \frac{\partial \mathcal{L}}{\partial x} & = 2x + 2 \lambda x = 0 \\ \frac{\partial \mathcal{L}}{\partial \lambda} & = x^2 - 7y^2 - 1 = 0 \end{align}
From $2x + 2 \lambda x = 0 \Leftrightarrow 2x(1 + \lambda) = 0$ we get $x = 0$ or $\lambda = -1$; since the constraint forces $x^2 = 7y^2 + 1 \geq 1$, we must have $x \neq 0$, so $\lambda = -1$. Substituting this into $2(y + 24) - 14 \lambda y = 0$ gives $16y + 48 = 0$, i.e. $y = -3$.
Thus $x^2 - 7(-3)^2 - 1 = 0 \Leftrightarrow x^2 = 64 \Leftrightarrow x = -8 \lor x = 8$.
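As a quick sanity check (my own, not part of the derivation above), the candidate points $(x, y, \lambda) = (\pm 8, -3, -1)$ can be plugged back into all three first-order conditions:

```python
def foc(x, y, lam):
    """Gradient of L = g + lam * f at a point (dL/dx, dL/dy, dL/dlam)."""
    dL_dx = 2 * x + 2 * lam * x            # from g_x + lam * f_x
    dL_dy = 2 * (y + 24) - 14 * lam * y    # from g_y + lam * f_y
    dL_dlam = x**2 - 7 * y**2 - 1          # the constraint f(x, y) = 0
    return (dL_dx, dL_dy, dL_dlam)

for x in (-8, 8):
    assert foc(x, -3, -1) == (0, 0, 0)
```

Both candidate points satisfy all three conditions exactly.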
Since $\lambda = -1 \neq 0$, the constraint $f(x,y) = 0$ binds at the optimum, so we can solve it for $x$ and substitute into $g(x,y)$, leaving a function of one variable. From $f(x,y) = 0$ we get $x = \sqrt{7y^2 + 1} \lor x = -\sqrt{7y^2 + 1}$; substituting $x^2 = 7y^2 + 1$ into $g(x,y)$ gives $g(y) = 8y^2 + 48y + 577$. The derivative is $g'(y) = 16y + 48$, and checking values of $y$ below and above $-3$ gives $g'(-4) = -16$ and $g'(-2) = 16$, which means we have found a minimum. However, this only checks the sign of $g'(y)$ around the value $y = -3$.
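The reduced one-variable problem can also be confirmed numerically (again, just a check of my own arithmetic):

```python
def g_reduced(y):
    # g(x, y) with x^2 = 7y^2 + 1 substituted in:
    # (7y^2 + 1) + (y + 24)^2 = 8y^2 + 48y + 577
    return 8 * y**2 + 48 * y + 577

def g_reduced_prime(y):
    return 16 * y + 48

assert g_reduced_prime(-3) == 0    # stationary point at y = -3
assert g_reduced_prime(-4) == -16  # decreasing to the left of -3
assert g_reduced_prime(-2) == 16   # increasing to the right of -3
assert g_reduced(-3) == 505        # minimum value, attained at (x, y) = (+/-8, -3)
```

Note also that $g(y) = 8y^2 + 48y + 577 \to \infty$ as $y \to \pm\infty$, so $g$ is unbounded above on the constraint set and only the minimum exists.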
To me it would have made more sense to check the sign of the derivative at the values $x = -8 \lor x = 8$, but is this approach OK?
Thanks in advance.
The process by which you obtain your optimal points seems appropriate. Also, for numerical calculations such as these, Wolfram has a nice widget that does optimization via Lagrange multipliers here:
https://www.wolframalpha.com/widgets/gallery/view.jsp?id=1451afdfe5a25b2a316377c1cd488883
Of course, work the problem out first for yourself and then compare afterwards!
As for checking whether these points are maxima or minima, I would suggest reviewing first- and second-order optimality conditions (i.e., what does the Hessian look like for $g$?). One good (advanced) reference is Foundations of Optimization by Osman Güler (pp. 35-40). However, the following Wikipedia page also suffices:
https://en.wikipedia.org/wiki/Second_partial_derivative_test
Note that in the Wikipedia page, for the case of "Functions of two variables", the formulas used there 'work', but the reasons why they work require a bit of thought (see the section below it called "Functions of many variables" that generalizes your case). I would encourage you to look at the more general conditions if you have the time (this has to do with notions of $\textit{definiteness}$ in the Hessian for $g$).
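To make the second-order condition concrete for this problem, here is a small sketch (my own, under the usual bordered-Hessian convention with the constraint border in the first row and column; sign conventions vary between texts): for two variables and one constraint, a negative determinant of the bordered Hessian at a stationary point indicates a constrained local minimum.

```python
def bordered_hessian_det(x, y, lam):
    # Gradient of the constraint f = x^2 - 7y^2 - 1
    fx, fy = 2 * x, -14 * y
    # Second derivatives of L = g + lam * f
    Lxx = 2 + 2 * lam
    Lyy = 2 - 14 * lam
    Lxy = 0.0
    # Determinant of [[0, fx, fy], [fx, Lxx, Lxy], [fy, Lxy, Lyy]],
    # expanded along the first row by hand
    return -fx * (fx * Lyy - Lxy * fy) + fy * (fx * Lxy - Lxx * fy)

# At the stationary points (x, y, lam) = (+/-8, -3, -1) the determinant
# is negative, consistent with a constrained local minimum.
assert bordered_hessian_det(8, -3, -1) < 0
assert bordered_hessian_det(-8, -3, -1) < 0
```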