For example, let's consider these two constrained optimization problems, a maximization and a minimization:
(1) $$\max \ \ g(x,y)=xy$$ $$ \text{s.t. } x^2+y^2=1$$
(2) $$\min \ \ g(x,y)=xy$$ $$ \text{s.t. } x^2+y^2=1$$
Solution:
Using the method of Lagrange multipliers:
For the maximization problem (1),
$$L= g(x,y) +\lambda (x^2+y^2-1)$$
For the minimization problem (2),
$$L= g(x,y) -\lambda (x^2+y^2-1)$$
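As a sanity check, the stationarity conditions can be solved symbolically. The sketch below uses SymPy (the variable names are my own); note that flipping the sign in front of $\lambda$ only negates the value of $\lambda$ at each solution, so either Lagrangian yields the same four critical points:

```python
import sympy as sp

x, y, lam = sp.symbols('x y lam', real=True)
g = x * y
h = x**2 + y**2 - 1          # constraint h = 0
L = g + lam * h               # the "+" convention; "-" only negates lam

# Stationarity in x and y, together with the constraint itself
eqs = [sp.diff(L, x), sp.diff(L, y), h]
sols = sp.solve(eqs, (x, y, lam), dict=True)

# The four points (x, y) = (±1/√2, ±1/√2), where g takes the values ±1/2
for s in sols:
    print(s, 'g =', sp.simplify(g.subs(s)))
```

The largest value of $g$ among these critical points, $1/2$, is the constrained maximum, and the smallest, $-1/2$, is the constrained minimum, since the circle is compact.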
What is the difference between the solutions of the max and min problems?
Are the Lagrangian functions I wrote above correct for the max and min problems?
Please inform me on this subject; I am very confused :(
Following Daniel Robert Nicoud's answer, I tried to solve the question.
I took second derivatives (the Hessian matrix); please explain the meaning of "less than zero" and "greater than zero" here.



Actually, the Lagrange multiplier method is only used to find critical points, so you can use the same sign in both problems: flipping the sign of the constraint term merely negates $\lambda$ and leaves the critical points unchanged. The distinction between minima, maxima and saddle points is then made afterwards by a second-order test, via the (bordered) Hessian matrix, or, on a compact constraint set like the circle, simply by comparing the values of $g$ at the critical points.
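To make the second-order test concrete, here is a sketch in SymPy. It assumes the standard bordered-Hessian convention for $n = 2$ variables and one constraint, with the constraint gradient forming the border: a positive determinant at a critical point indicates a constrained local maximum, a negative determinant a constrained local minimum.

```python
import sympy as sp

x, y, lam = sp.symbols('x y lam', real=True)
g = x * y
h = x**2 + y**2 - 1          # constraint h = 0
L = g + lam * h               # the same Lagrangian serves both problems

sols = sp.solve([sp.diff(L, x), sp.diff(L, y), h], (x, y, lam), dict=True)

# Bordered Hessian: gradient of h borders the Hessian of L in (x, y)
H = sp.Matrix([
    [0,             sp.diff(h, x),    sp.diff(h, y)],
    [sp.diff(h, x), sp.diff(L, x, 2), sp.diff(L, x, y)],
    [sp.diff(h, y), sp.diff(L, x, y), sp.diff(L, y, 2)],
])

for s in sols:
    d = sp.simplify(H.subs(s).det())
    kind = 'local max' if d > 0 else 'local min'   # valid for n=2, one constraint
    print(s, 'g =', sp.simplify(g.subs(s)), '->', kind)
```

Here the points with $x = y$ (where $g = 1/2$) come out as constrained maxima and the points with $x = -y$ (where $g = -1/2$) as constrained minima, matching the direct comparison of $g$-values.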