Understanding why GeoGebra fails to show the correct range of a function over certain domains


I need help understanding why GeoGebra has trouble showing the correct range of a function over certain domains $[a,b]$ (controlled by the "$a$" and "$b$" sliders). Below are several images from this GeoGebra lesson.

For the domain $[-5,-1.9]$, the plot does not show the actual range, but for $[-5,-1.8]$ everything seems fine. The same issue occurs for $[-5,-0.9]$ versus $[-5,-0.8]$, and repeatedly after that. I gather it is somehow connected with the second derivative of the function and with "interval arithmetic", but how?
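For what it's worth, here is a minimal sketch (my own illustration, not GeoGebra's actual code) of why naive interval arithmetic tends to *overestimate* a range: when the same variable appears more than once in an expression, each occurrence is treated as varying independently, which is known as the dependency problem.

```python
# Naive interval arithmetic on f(x) = x^2 - x over x in [0, 1].
# This is an illustration of the dependency problem, not GeoGebra's code.

def interval_mul(a, b):
    """Product of intervals [a0, a1] * [b0, b1]."""
    products = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(products), max(products))

def interval_sub(a, b):
    """Difference of intervals [a0, a1] - [b0, b1]."""
    return (a[0] - b[1], a[1] - b[0])

x = (0.0, 1.0)
x_sq = interval_mul(x, x)       # (0.0, 1.0)
naive = interval_sub(x_sq, x)   # (-1.0, 1.0): the two occurrences of x
                                # are treated as independent variables

# The true range of x^2 - x on [0, 1] is [-0.25, 0], so the naive
# interval (-1.0, 1.0) is a strict overestimate.
print(naive)
```

Splitting the domain into smaller subintervals shrinks this overestimate, which could be one reason a plotter's behavior changes abruptly as the endpoints of $[a,b]$ move.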

If anybody has an idea or an approach that could help me find the answer, please help ;-)


Here's a related Desmos project: https://www.desmos.com/calculator/aappxay2yc

There is 1 best solution below.

The surest way to discover why this example plots the wrong range in some cases is first to ask the person who made the page to explain the method they used to compute the range. Then, if you still cannot spot the error, include that explanation in the question so that someone else might figure it out.

But I will make a guess. By trying different functions, you can see that the upper and lower ends of the plotted "range" are always a local maximum and a local minimum of the function. I suspect these values are found by some optimization algorithm that locates a local extremum, not necessarily a global one, and that is sensitive to the choice of starting point. As you change the domain, the starting point changes, and (if it changes enough) so does the local extremum the algorithm finds.

When the chosen local minimum and maximum coincide with the global minimum and maximum, the algorithm appears correct; in other cases it gives wrong answers.

If you want to know exactly why the algorithm chooses wrong answers when it does, I suggest you try to find out what algorithm was used.