Rate of Percentage Charge


So I was wondering how long it took my iPad to go from $60\%$ to $100\%$ since I got a new USB port. I found it took $216$ minutes, or $3.6$ hours. My goal was to figure out the time it takes to charge $1\%$ (in minutes), assuming, of course, that the rate of charge is constant. I also tried backtracking the math to figure out how long it should take to go from $0\%$ to $100\%$ (based on the fact that, at the rate $\frac{0.40}{216}$, a full charge would take $\frac{216}{0.40} = 540$ minutes), but I realized I can't trust those results if I can't figure out a consistent rate per minute!
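Here's the measured data as a quick Python sketch (the variable names are just mine, for illustration):

```python
# Measured: charge went from 60% to 100% (a 0.40 fraction) in 216 minutes.
charge_gained = 1.00 - 0.60      # fraction of battery gained
elapsed_minutes = 216

rate_per_minute = charge_gained / elapsed_minutes   # ~0.00185 fraction per minute
minutes_for_full = elapsed_minutes / charge_gained  # ~540 minutes for 0% -> 100%

print(rate_per_minute)
print(minutes_for_full)
```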

So I figured $$\frac{0.40}{216}\propto \frac{0.01}{9}$$

I decided to represent this algebraically, where the percentage of charge $P$ is a function of the time $t$ in minutes.

Which should mean

$$P(t)=\frac{0.01}{9}t$$

So I tested this function with the actual data from above: $$P(216)=\frac{0.01}{9}(216) \implies P(216)=0.24$$
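The same test as a quick Python sketch (the function name mirrors the $P$ above):

```python
def P(t):
    # My proposed linear model: charge fraction after t minutes.
    return (0.01 / 9) * t

print(P(216))  # ~0.24, not the 0.40 I actually measured
```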

Clearly, $24\%\ne40\%$

I figured no calculus would be needed because I'm assuming the rate of charge is constant (i.e., linear). What am I missing here? Why is my function not working?