This question pertains to programming but is really a math question.
I am building an application that draws a line graph, similar to a stock chart. The problem is that I am not starting with a known set of numbers: the values arrive one at a time as altitude readings. So when I start the graph I have no idea what the max and min of my data set will be, because the max might not come along for some time.

Like I mentioned, these values are added to the graph as each elevation reading is received, and I don't know them until they come in from my device (an iPhone).
Is there a formula for laying out a line graph over unknown values? How can I know what x and y coordinates to use for each new value received? And how can I calculate the min and max of the graph?
The usual approach is to choose some default scale for your graph and rescale if the data goes out of range. Maybe you start with 0–1 mi on x and the current altitude (rounded) ±200 ft on y. Then when somebody goes past 1 mi, change the horizontal scale to 0–2 mi, which requires updating all the points.
Sometimes people just cover the range of the current data. So if somebody starts out on very flat ground, y could span only ±15 feet or so. The bad news is that the graph then jumps around a lot. Otherwise you can start with the assumption that "everybody" will change in elevation by at least 100 feet and set that as a minimum scale, increasing it as required. x and y work the same way in this regard. – Ross Millikan 1 hour ago
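The approach in that comment can be sketched as a small running-scale tracker. This is a minimal illustration, not an iOS implementation: the class name `AxisScale`, the 100 ft minimum span, and the pixel-mapping helper are all assumptions chosen to mirror the answer's suggestion (track the running min/max, enforce a minimum span so a flat start doesn't jump around, and redraw when the bounds grow).

```python
class AxisScale:
    """Tracks the running min/max of incoming values and derives axis bounds.

    A hypothetical sketch: enforces a minimum span (e.g. 100 ft, per the
    answer's suggestion) so the graph is stable even on flat ground.
    """

    def __init__(self, min_span):
        self.min_span = min_span
        self.lo = None  # running minimum of the data seen so far
        self.hi = None  # running maximum of the data seen so far

    def add(self, value):
        # Update the running min/max with each new altitude reading.
        if self.lo is None:
            self.lo = self.hi = value
        else:
            self.lo = min(self.lo, value)
            self.hi = max(self.hi, value)

    def bounds(self):
        # Pad the data range out to at least min_span, centered on the data.
        center = (self.lo + self.hi) / 2.0
        span = max(self.hi - self.lo, self.min_span)
        return (center - span / 2.0, center + span / 2.0)

    def to_pixel(self, value, height_px):
        # Map a data value to a pixel y-coordinate (0 at the top of the view).
        lo, hi = self.bounds()
        return height_px * (1.0 - (value - lo) / (hi - lo))


# Example: two readings on nearly flat ground still get a 100 ft window.
scale = AxisScale(min_span=100.0)
scale.add(500.0)
scale.add(520.0)
print(scale.bounds())               # (460.0, 560.0): 100 ft centered on 510
print(scale.to_pixel(510.0, 100))   # 50.0: midpoint of a 100 px tall view
```

On each new reading you would call `add()`, check whether `bounds()` changed, and if so remap every stored point with `to_pixel()` before redrawing; the same idea applies to the x axis with distance instead of altitude.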