Could anyone give me a general overview of the algorithm computers use to draw graphs?
I'm guessing it is done either by plotting many, many points and connecting them, or by doing a general study of the function with its derivative. However, just taking a derivative is not enough to know the exact behavior of the function on an interval, since you would have to evaluate the derivative at every point of the interval, which brings us back to the first method...
So how is it done?
There are many different plotting libraries, and they all work differently.
But in general, a graph requires data, usually in the form of x-axis and y-axis data.
Even if you pass a symbolic function (e.g. plot(x^2, x=0..10)), typically the computer discretizes your domain, computes a finite set of (x, y) points, and draws straight lines between them. So (x1, y1) connects to (x2, y2); (x2, y2) connects to (x3, y3); etc.

Sometimes, the plotting library is smart enough to know the display window size and the monitor resolution, and it chooses points that are close enough together that you cannot tell the curve is really a series of straight lines.
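The sample-and-connect idea above can be sketched in a few lines of Python (the function names here are just illustrative, not any particular library's API):

```python
# A minimal sketch of sample-and-connect plotting: discretize the domain,
# evaluate the function at each sample, and form the straight segments a
# plotting library would draw between consecutive points.

def sample(f, x_min, x_max, n):
    """Evaluate f at n evenly spaced points on [x_min, x_max]."""
    step = (x_max - x_min) / (n - 1)
    return [(x_min + i * step, f(x_min + i * step)) for i in range(n)]

points = sample(lambda x: x ** 2, 0.0, 10.0, 5)
# Each consecutive pair of points becomes one straight line segment:
segments = list(zip(points, points[1:]))
```

With more samples (a real library might use hundreds, chosen from the window size), the segments become too short to distinguish from a smooth curve.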
However, some libraries know that we like smooth things, so they run some sort of spline interpolation through the data to make the curve look smooth.
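There are many spline schemes and every library picks its own; as one illustrative choice, here is Catmull-Rom interpolation, which draws a smooth cubic between two samples using their two neighbors as well:

```python
# One illustrative smoothing scheme (libraries differ): Catmull-Rom
# interpolation. It passes a cubic through samples p1 and p2, using the
# neighboring samples p0 and p3 to pick matching tangents.

def catmull_rom(p0, p1, p2, p3, t):
    """Interpolated value between p1 (t=0) and p2 (t=1)."""
    return 0.5 * (2 * p1
                  + (p2 - p0) * t
                  + (2 * p0 - 5 * p1 + 4 * p2 - p3) * t ** 2
                  + (3 * p1 - p0 - 3 * p2 + p3) * t ** 3)

ys = [0.0, 1.0, 4.0, 9.0]       # samples of y = x^2 at x = 0, 1, 2, 3
mid = catmull_rom(*ys, 0.5)     # smooth estimate halfway between x=1 and x=2
```

Evaluating this at many values of t between each pair of samples yields extra in-between points, which are then connected with short straight segments as before.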
Things get a little more interesting with scalable vector graphics (SVG). In SVG, the function and/or the data can be converted to a parameterized, vector-valued path. The file stores the geometry itself rather than pixels, and the renderer traces the path as it sweeps through its parameterization, rasterizing it at whatever resolution the display requires.
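To make the SVG idea concrete, here is a sketch (a hypothetical helper, not a library function) that turns sampled points into an SVG path string using the standard M (move-to) and L (line-to) commands:

```python
# Build the "d" attribute of an SVG <path>: move to the first point,
# then draw line-to commands through the rest. The viewer, not this
# code, decides how many pixels to spend rendering it.

def svg_polyline_path(points):
    head = "M {:g} {:g}".format(*points[0])
    tail = " ".join("L {:g} {:g}".format(x, y) for x, y in points[1:])
    return head + " " + tail

path = svg_polyline_path([(0, 0), (1, 1), (2, 4)])
# path == "M 0 0 L 1 1 L 2 4"
```

SVG also has curve commands (C, Q) that store cubic and quadratic Béziers directly, so a library can emit genuinely smooth geometry instead of many tiny line segments.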
Back in the day, computers could only display text on a fixed-format display. How did they plot functions? By putting ASCII characters like -, |, \, and / in the appropriate locations! If you dig up some engineering papers from the 60s and 70s, you might even find some of these plots.
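You can still do this today; here is a toy sketch of the character-grid approach, placing one * per column at the scaled height of the function:

```python
# A crude fixed-grid "terminal" plot: sample one x per column, scale y
# into the available rows, and mark each sample with a '*' character.

def ascii_plot(f, x_min, x_max, cols=40, rows=10):
    xs = [x_min + i * (x_max - x_min) / (cols - 1) for i in range(cols)]
    ys = [f(x) for x in xs]
    lo, hi = min(ys), max(ys)
    grid = [[" "] * cols for _ in range(rows)]
    for col, y in enumerate(ys):
        # Map y into a row index; row 0 is the top of the display.
        row = round((y - lo) / (hi - lo) * (rows - 1)) if hi > lo else 0
        grid[rows - 1 - row][col] = "*"
    return "\n".join("".join(r) for r in grid)

print(ascii_plot(lambda x: x * x, -2, 2))
```

Running this prints a recognizable parabola out of asterisks, which is essentially what those old line-printer plots did, just with fancier character choices.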
In the end, there are hundreds of plotting libraries, and all of them have different ways of representing the data.