Given the following data set of $x$ and $y$:
|    x    |    y     |
|---------|----------|
| 153,000 | 0.058848 |
| 332,641 | 0.36352  |
| 506,629 | 0.53     |
If $x$ is the number of database records and $y$ is the time taken in seconds to retrieve them, how do I estimate the time taken for 1,000,000 records, given the rate at which the retrieval time grows in the data set above?
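One simple approach, assuming the growth is roughly linear in the record count (the per-record rate between the given points is of the same order of magnitude, so a straight-line fit is a reasonable first sketch), is to fit a least-squares line $y = mx + b$ to the three points and extrapolate to $x = 1{,}000{,}000$:

```python
import numpy as np

# Data from the question: x = number of records, y = retrieval time (s)
x = np.array([153_000, 332_641, 506_629], dtype=float)
y = np.array([0.058848, 0.36352, 0.53])

# Assumption: time grows roughly linearly with record count.
# Fit y = m*x + b by least squares, then extrapolate.
m, b = np.polyfit(x, y, 1)

estimate = m * 1_000_000 + b
print(f"estimated time for 1,000,000 records: {estimate:.3f} s")
```

This gives an estimate a little over one second. Note that extrapolating well beyond the observed range is risky: if the database's retrieval cost is actually sub-linear (e.g. index lookups) or super-linear, a linear fit will over- or under-estimate, so it is worth checking the fit against more data points if they are available.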