Proof of Fermat's Theorem for Extrema

I'm trying to follow a proof of Fermat's theorem for extrema -- that if a function $f$ has a local minimum at $(a,b)$ and its first partial derivatives exist there, then they're equal to zero. The proof in the textbook I'm using doesn't quite make sense to me (or at least its very first step doesn't).

It first defines a function $g(x) = f(x,b)$. I don't quite understand this step: it seems we would want to work with a function of both $x$ and $y$, and since $(a,b)$ is a fixed point, $a$ and $b$ are effectively constants. It also seems strange that we're defining a single-variable function $g$ in terms of a two-variable function $f$.

If we accept this step for the moment, the proof argues that if $f$ has a max or min at $(a,b)$, then $g$ has a max or min at $a$. Since the first step is still quite confusing to me, I don't quite see how this follows, though it seems that the authors simply took $x = a$. From there, they deduce that $g$ having a max or min at $a$ implies $g'(a) = 0$, that $g'(a) = f_x(a,b)$, and thus that $f_x(a,b) = 0$. They then use the same argument to show that $f_y(a,b) = 0$.

I think my main confusion is in the first step, but I also can't quite see how a max or min at $(a,b)$ translates into a max or min at $a$.
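For what it's worth, I tried checking the claim numerically with a toy example of my own choosing (the function $f(x,y) = x^2 + y^2$, which has a minimum at $(0,0)$, is not from the textbook), and the numbers do line up:

```python
# Toy example (my own choice, not from the textbook):
# f(x, y) = x**2 + y**2 has a local minimum at (a, b) = (0, 0).

def f(x, y):
    return x**2 + y**2

a, b = 0.0, 0.0
h = 1e-6

# Restrict f to the line y = b: this is the single-variable function g.
def g(x):
    return f(x, b)

# Central finite differences approximating g'(a) and f_x(a, b).
g_prime_a = (g(a + h) - g(a - h)) / (2 * h)
f_x_ab = (f(a + h, b) - f(a - h, b)) / (2 * h)

print(g_prime_a)  # approximately 0
print(f_x_ab)     # approximately 0, and identical to the estimate of g'(a)
```

The two difference quotients are computed from exactly the same function values, which at least makes the claimed identity $g'(a) = f_x(a,b)$ plausible to me numerically.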

Thanks in advance for any insights on this.

On BEST ANSWER

The proof is correct. We are assuming that $f$ has a local minimum at $(a,b)$: when $(x,y)$ is close to $(a,b)$, $f(x,y)\geqslant f(a,b)$. There's nothing wrong with defining $g(x)=f(x,b)$, right?! The function $g$ is a function of the single variable $x$. Now, if $x$ is close to $a$, then $(x,b)$ is close to $(a,b)$, and therefore $g(x)=f(x,b)\geqslant f(a,b)=g(a)$. So the single-variable function $g$ has a local minimum at $a$. Since $g$ is differentiable at $a$, its derivative there is $0$. But, by the definition of the partial derivative, $g'(a)=f_x(a,b)$.
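One way to see the last identity is to unwind both definitions side by side; every term below comes straight from the definitions of $g$ and of the partial derivative:

$$g'(a) \;=\; \lim_{h\to 0}\frac{g(a+h)-g(a)}{h} \;=\; \lim_{h\to 0}\frac{f(a+h,\,b)-f(a,b)}{h} \;=\; f_x(a,b).$$

As a concrete check (my own example), take $f(x,y)=x^2+y^2$, which has a minimum at $(0,0)$: then $g(x)=f(x,0)=x^2$, which indeed has a minimum at $x=0$, and $g'(0)=0=f_x(0,0)$.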