Finding global extrema of $a\left(\frac{1}{2}-b\frac{\sin(cx)}{x} \right) - b(1-\cos(cx))$, for $x\geq 0$


I have the following function: \begin{equation} f(x) = a\left(\frac{1}{2}-b\frac{\sin(cx)}{x} \right) - b(1-\cos(cx)), \quad x\geq0, \end{equation} where $a$, $b$ and $c$ are strictly positive constants.

When I plug this into WolframAlpha to solve for the min/max of $f(x)$ for $x>0$, it reports that no global min/max is found. However, in all of the figures that it provides, the global min and max appear to lie within the first "cycle" - see figure. The numerics also confirm this, but I am struggling to work out how to prove that this is the case (if it is).

Any hint would be much appreciated. Here is the link to the numerical approximations of those peaks.

Plot of $f(x)$ with $a=0.3$, $b=0.5/4$, $c=2$.
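As a quick numerical sanity check (the grid bounds and step below are my own choices, with the parameter values taken from the plot), a brute-force scan does place both the global max and the global min on the grid inside the first cycle of length $2\pi/c$:

```python
import math

# Parameters from the plotted example: a = 0.3, b = 0.5/4, c = 2.
a, b, c = 0.3, 0.5 / 4, 2.0

def f(x):
    return a * (0.5 - b * math.sin(c * x) / x) - b * (1.0 - math.cos(c * x))

period = 2 * math.pi / c  # length of one "cycle" of the trig terms

# Scan (0, 50) with step 1e-3; start just above 0 to avoid the removable
# singularity at x = 0.
xs = [i * 1e-3 for i in range(1, 50_000)]
x_max = max(xs, key=f)
x_min = min(xs, key=f)

print(f"grid max: f({x_max:.3f}) = {f(x_max):.5f}")
print(f"grid min: f({x_min:.3f}) = {f(x_min):.5f}")
print("both inside first cycle:", x_max < period and x_min < period)
```

This is only evidence on a finite grid, of course, not a proof.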


Plugging in $a = b = c = 1$ gives the function $$-0.5-\frac{\sin(x)}{x}+\cos(x).$$ Simplifying its derivative and setting it equal to zero yields $$(1-x^2)\sin(x)-x\cos(x)=0,$$ which, although I cannot solve it by hand, Wolfram Alpha can: the first positive solutions corresponding to local maxima are $x = 6.11676$ and $x = 12.48594$. Plugging these values back into the original function yields $y = 0.513$ and $y = 0.503$, proving the conjecture false. Why, you may ask? I don't have a rigorous proof, but since the $\sin(x)$ factor is divided by $x$, its average contribution decreases as $x$ grows, so the values at the maxima cannot be constant. The only possibility would be if the sine term were zero at each maximum, so I do believe you can construct a function that follows your hypothesis, but clearly not all functions do.
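The two roots and function values quoted above can be reproduced numerically; here is a sketch using plain bisection (the bracketing intervals are my own choices, picked from where the derivative's numerator changes sign):

```python
import math

def f1(x):
    # f with a = b = c = 1: -1/2 - sin(x)/x + cos(x)
    return -0.5 - math.sin(x) / x + math.cos(x)

def g(x):
    # Numerator of f1'(x) after clearing x^2: (1 - x^2) sin(x) - x cos(x)
    return (1 - x**2) * math.sin(x) - x * math.cos(x)

def bisect(lo, hi, tol=1e-10):
    # Plain bisection; assumes g changes sign on [lo, hi].
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(lo) * g(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

x1 = bisect(6.0, 6.2)    # first local max, should be ~6.11676
x2 = bisect(12.4, 12.6)  # second local max, should be ~12.48594
print(x1, f1(x1))
print(x2, f1(x2))
```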

Whoops, I misread the question a bit. Simplifying the function and dropping the additive constant leads to $$b\left(-a\frac{\sin(cx)}{x}+\cos(cx)\right),$$ where we can effectively ignore $b$. As $x \to \infty$, the maxima occur near $x=\frac{2\pi}{c}k$. How much larger is a maximum of the whole function than the corresponding maximum of $\cos(cx)$? The slope of $-a\frac{\sin(cx)}{x}$ near $x=\frac{2\pi}{c}k$ works out to $\frac{-abc^2}{2\pi k}$, so its magnitude decreases as $x$ increases. Since the magnitude is larger near zero, the maximum, which occurs where the slopes of $-a\frac{\sin(cx)}{x}$ and $\cos(cx)$ cancel, is larger when $x$ is closer to zero. This is not a very rigorous proof, but it is an argument that does establish decreasing maxima for large enough $x$.
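The claim that the successive maxima decrease can be checked numerically; a minimal sketch with $a=b=c=1$ (the window width and grid step are my own choices), locating the local maximum near each $x = 2\pi k$ by grid search:

```python
import math

def g(x):
    # The simplified function with a = b = c = 1: -sin(x)/x + cos(x)
    return -math.sin(x) / x + math.cos(x)

# Grid-search the local maximum in a window of width 2 around x = 2*pi*k.
maxima = []
for k in range(1, 6):
    center = 2 * math.pi * k
    xs = [center - 1 + i * 1e-4 for i in range(20_001)]
    maxima.append(max(g(x) for x in xs))

print(maxima)
print("strictly decreasing:", all(m1 > m2 for m1, m2 in zip(maxima, maxima[1:])))
```

Each maximum sits slightly above $1$ (the maximum of $\cos(x)$ alone) and the excess shrinks as $k$ grows, consistent with the slope argument above.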