How exactly do I prove that I have found the maximum of the function?


I am currently trying to maximize an objective function $f(a,b,c,d,e)$ over the variable $b$ only.

By taking the derivative of $f$ with respect to $b$ and setting it to zero, I can solve for $b$ in terms of the other four variables: $b=g(a,c,d,e)$.

Graphically, if I substitute some values of $a,c,d,e$ into $f$ and use $g(a,c,d,e)$ from above to get $b$, I can see that $f$ reaches its maximum value at that specific $b$. If I change the value of $b$, I can see that $f$ decreases.

But then the question is: how do I prove that $b=g(a,c,d,e)$ is the point that maximizes $f$? More specifically, I am looking for the global maximum inside the interval $[0,1]$.

PS: the constraint on $b$ is that $b$ lies inside the interval $[0,1]$.


Update:

I find that there is exactly one critical point in $b$ inside $[0,1]$.

I am not sure whether single-variable calculus applies here. But if it does: since $b$ lies inside the interval $[0,1]$, and there is only one critical point (where the partial derivative with respect to $b$ is zero) inside this interval, then as long as $f$ doesn't diverge, the global maximum is attained either on the boundary or at the critical point.

Do you guys think this is right?


Update2:

Some suggested using the second-derivative test. But the question is, I don't have values for the other four variables, so how do I know whether the second derivative is greater or less than zero?

There are 3 answers below.

Accepted answer:

Collect the accessory parameters $a$, $c$, $d$, $e$ into a parameter point ${\bf p}:=(a,c,d,e)$. For given ${\bf p}$ we then have to study the function $$f_{{\bf p}}:\quad [0,1]\to{\mathbb R},\qquad b\mapsto f(a,b,c,d,e)$$of the single variable $b$. If this function is continuous on $[0,1]$ and differentiable in the interior of this interval it assumes a global maximum on $[0,1]$. This maximum is found as follows: Compute the zeros of the derivative $f_{{\bf p}}'$ in $\ ]0,1[\ $. In most cases you will obtain a finite (maybe empty) set $\{x_1,x_2,\ldots, x_r\}\subset \ ]0,1[\ $. Then build the candidate list $$C_{\bf p}:=\{0,1,x_1,\ldots, x_r\}\ ,$$ and you can be sure that $$M({\bf p}):=\max_{x\in [0,1]} f_{{\bf p}}(x)=\max_{x\in C_{\bf p}}f_{{\bf p}}(x)\ ,$$ where on the right hand side you have to compare only finitely many values.
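The candidate-list procedure above is easy to mechanize. Here is a minimal numerical sketch (the function and helper names are my own, and the toy objective stands in for the asker's unknown $f_{\bf p}$): locate sign changes of the derivative on a fine grid, refine each by bisection, and compare the resulting interior critical points with the two endpoints.

```python
import numpy as np

def maximize_on_unit_interval(f_p, df_p, n_grid=1000):
    """Maximize a smooth function f_p on [0, 1] by comparing the
    endpoints with the interior zeros of its derivative df_p.
    Interior critical points are located by bisection on sign
    changes of df_p over a fine grid."""
    xs = np.linspace(0.0, 1.0, n_grid)
    ds = df_p(xs)
    candidates = [0.0, 1.0]
    for i in range(n_grid - 1):
        if ds[i] == 0.0:
            candidates.append(xs[i])
        elif ds[i] * ds[i + 1] < 0:
            # Derivative changes sign: refine the root by bisection.
            lo, hi = xs[i], xs[i + 1]
            for _ in range(60):
                mid = 0.5 * (lo + hi)
                if df_p(lo) * df_p(mid) <= 0:
                    hi = mid
                else:
                    lo = mid
            candidates.append(0.5 * (lo + hi))
    # Compare the finitely many candidate values, as in the answer.
    best = max(candidates, key=f_p)
    return best, f_p(best)
```

For example, with the toy objective $f_{\bf p}(b) = -(b-0.3)^2$ the routine returns the interior critical point $b=0.3$; for a monotone objective it returns an endpoint.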

During this analysis the parameter point ${\bf p}$ was fixed. Now the list $C_{\bf p}$ will depend on ${\bf p}$, and so will the individual member of the list giving rise to the maximal value of $f$. It may very well happen that for some ${\bf p}$ the $\max$ is taken at the left endpoint of $[0,1]$, for other ${\bf p}$'s in the interior, and for still others at the right endpoint.

This phenomenon is already present in the following simple example: Let $\sigma$ be the segment connecting the points $(-1,0)$ and $(1,0)$ in the plane. For a given point ${\bf p}:=(u,v)$ let $d({\bf p})$ be the distance from ${\bf p}$ to the nearest point of $\sigma$. (Draw a figure!)
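A minimal numerical sketch of this distance example (the function names are my own): the nearest point of $\sigma$ to ${\bf p}=(u,v)$ is found by clamping $u$ to $[-1,1]$, so the minimizing point switches between the interior of the segment and an endpoint as ${\bf p}$ moves, just as described.

```python
import math

def nearest_point_on_segment(u, v):
    """Nearest point of the segment sigma from (-1, 0) to (1, 0)
    to the point p = (u, v): clamp u to [-1, 1], keep y = 0."""
    x = max(-1.0, min(1.0, u))
    return (x, 0.0)

def d(u, v):
    """Distance from p = (u, v) to the nearest point of sigma."""
    x, y = nearest_point_on_segment(u, v)
    return math.hypot(u - x, v - y)
```

For $|u|\le 1$ the nearest point lies in the interior (directly below or above ${\bf p}$); for $|u|>1$ it jumps to the endpoint $(\pm 1,0)$.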

A concluding remark: No second derivatives had to be calculated.

Answer:

If your function $f(a,b,c,d,e)$ is continuous and differentiable and has only the single point $b=g(a,c,d,e)$ inside $[0,1]$ where the first derivative with respect to $b$ is zero, then one of the points $b=g(a,c,d,e)$, $b=0$, or $b=1$ has to be your global maximum.

We consider three cases: $b=g(a,c,d,e)$ is a maximum, a minimum, or a saddle point.

Case 1: $b=g(a,c,d,e)$ is a maximum. As there is only one extreme point in $[0,1]$, $b=g(a,c,d,e)$ is the global maximum; it is clear that $b=0$ and $b=1$ cannot be maxima.

Case 2: $b=g(a,c,d,e)$ is a minimum. Then both $b=0$ and $b=1$ are local maxima, and one of them is the global maximum.

Case 3: $b=g(a,c,d,e)$ is a saddle. Then either $b=0$ or $b=1$ is the global maximum.

Answer:

Checking whether the second-order partial derivative of $f(a,b,c,d,e)$ with respect to $b$ is greater or less than zero at the critical point $b=b_0$ (obtained by taking the derivative of $f$ with respect to $b$ and setting it to zero) tells you whether the function has a local minimum or maximum there for that combination of $a,c,d,e$.
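This second-derivative check can be sketched with a computer algebra system; the objective below is a toy stand-in of my own, not the asker's actual $f$:

```python
import sympy as sp

a, b, c, d, e = sp.symbols('a b c d e', real=True)

# Toy objective standing in for the unknown f(a, b, c, d, e).
f = a*b - c*b**2 + d + e

b0 = sp.solve(sp.diff(f, b), b)[0]   # critical point in b: a/(2*c)
second = sp.diff(f, b, 2)            # second partial w.r.t. b: -2*c
```

Here the sign of the second derivative depends on the remaining parameters ($-2c < 0$ exactly when $c > 0$), which is precisely the asker's difficulty: without values for the other four variables, the test only yields a condition on them, not an unconditional answer.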

Since you said you are maximizing over only the one variable $b$, the single-variable approach is good enough.

The problem of a 'saddle point' doesn't arise here, since we are not looking for the maximum or minimum of the function over all five variables at once; the case where $f$ looks like a maximum along one variable and a minimum along another cannot occur in a single-variable problem.

Of course, if $f$ is greater at a boundary point ($b=0$ or $b=1$) than it is at $b=b_0$, the global maximum is at whichever of those boundary points gives the larger value.