I have a set of points I've generated programmatically. For each of about 200 X values spread randomly over the range $[0;400)$ there's a Y value. Here's what the resulting graph looks like (generated in Excel):
In fact, this graph shows how distance traveled depends on time, with constant acceleration, starting speed equal to zero, and a speed cap present. The graph begins as a parabola, and at some point, where the speed cap is reached, becomes a line. Measurements were taken graphically from a video game, so the graph is a bit wonky.
I've tried using Excel trendlines to approximate a parabola over different ranges; on $[0;200]$ the trendline seems accurate, and on $[0;400]$ it's clearly out of place, but I can't think of a way to determine where it stops being accurate. Is there a way (mathematically and/or with Excel) to calculate with any precision where the graph turns from a parabola into a line?
Edit: I have a set of roughly 200 points: $\{(x,y): x\in[0;400)\}$. These points belong to a graph of the form $y=ax^2+bx+c$ on $x\in[0;t)$ (parabola) and $y=mx+k$ on $x\in[t; 400)$ (straight line). I need to calculate the most likely value of $t$. The issue is complicated by the fact that the points aren't strictly on the graph, but only approximate it (see image above).
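One standard way to estimate $t$ from noisy points is a breakpoint grid search: for each candidate $t$, fit a parabola to the points with $x < t$ and a line to the points with $x \ge t$, and pick the $t$ minimizing the total squared error. Here is a minimal sketch on synthetic data; the values `a_true = 0.01`, `t_true = 250`, and the noise level are made-up assumptions standing in for the real measurements.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data mimicking the description: parabola up to the true
# breakpoint, line afterwards, plus noise (all constants are assumptions).
a_true, t_true = 0.01, 250.0
x = np.sort(rng.uniform(0, 400, 200))
y = np.where(x < t_true,
             0.5 * a_true * x**2,
             a_true * t_true * x - 0.5 * a_true * t_true**2)
y += rng.normal(0, 2.0, x.size)

def sse_for_breakpoint(t, x, y):
    """Total squared error of a parabola fit on x < t plus a line fit on x >= t."""
    err = 0.0
    for mask, deg in ((x < t, 2), (x >= t, 1)):
        coeffs = np.polyfit(x[mask], y[mask], deg)
        err += np.sum((np.polyval(coeffs, x[mask]) - y[mask]) ** 2)
    return err

# Grid search over candidate breakpoints, keeping enough points on each side
# for the fits to be well-posed.
candidates = np.linspace(30, 370, 341)
best_t = min(candidates, key=lambda t: sse_for_breakpoint(t, x, y))
print(best_t)  # should land near t_true = 250
```

The same idea works in Excel: compute the two trendline residual sums for a handful of candidate split points and pick the split with the smallest total.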
Let the acceleration be $a$ m/s$^2$. The speed will then be $at$ m/s, where $t$ is time in seconds. The max speed, $q$, is reached at time $t = q/a$. This means the graph turns from a parabola into a line at time $t = q/a$.
In fact, before $t = q/a$, the graph will be $y=at^2/2$, with $y$ being distance. Afterwards, it will be $y = qt - q^2/(2a)$.
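As a sanity check, the two formulas agree at the breakpoint $t = q/a$: the parabola gives $a(q/a)^2/2 = q^2/(2a)$ and the line gives $q\cdot(q/a) - q^2/(2a) = q^2/(2a)$, so the curve is continuous there. A quick numeric check, with $a$ and $q$ as arbitrary example values:

```python
# Verify the parabola and the line meet at the breakpoint t = q/a.
# a and q are arbitrary example values, not measured from the game.
a, q = 2.0, 10.0
t = q / a                      # breakpoint time: 5.0 s
parabola = a * t**2 / 2        # distance at the breakpoint from the parabola
line = q * t - q**2 / (2 * a)  # distance at the breakpoint from the line
print(parabola, line)          # both 25.0
```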