I assume this is a very simple question, but I can't really wrap my head around it. The question is:
If 100 dollars is deposited at time t = 0 into an account earning 10% interest and $20 is withdrawn at t = 1 and 2, then how much can be withdrawn at t = 3?
I tried getting the answer by doing this: $$((100(1.1) - 20)(1.1)-20) = 79$$
But the solutions state:
$$100(1.10)^3 - 20(1.10)^2 - 20(1.10) = 86.90$$
I have no idea how this is correct.
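For what it's worth, here is my attempt written out in Python (just a sketch of the arithmetic above):

```python
# My period-by-period attempt, stopping right after the t=2 withdrawal:
balance = 100 * 1.1 - 20      # t=1: grow 10%, then withdraw 20 -> 90
balance = balance * 1.1 - 20  # t=2: grow 10%, then withdraw 20 -> 79
print(round(balance, 2))      # 79.0
```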
p.s. This is my first post here. If there was something I did wrong, please let me know.
Just go period by period.
$t=0$: you put $\fbox {100}$ in the bank.
$t=1$: the $100$ has grown to $110$. You take out $20$, leaving you with $\fbox {90}$ in the bank.
$t=2$: the $90$ has grown to $99$. You take out $20$, leaving you with $\fbox {79}$ in the bank.
$t=3$: the $79$ has grown to $79\times (1+.1)=\fbox {86.9}$.

So your computation wasn't wrong; your $79$ is the balance at $t=2$, and the solution's formula simply carries it through one more period of interest. Indeed, multiplying your expression by $1.1$ and expanding gives exactly $100(1.10)^3 - 20(1.10)^2 - 20(1.10)$.
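The walkthrough above can be checked numerically; this sketch runs the period-by-period simulation and compares it with the solution's closed-form expression:

```python
# Period-by-period simulation: grow 10%, withdraw 20 at t=1 and t=2.
balance = 100.0
for t in (1, 2):
    balance = balance * 1.1 - 20   # grow, then withdraw
balance *= 1.1                     # t=3: one final period of growth, nothing withdrawn yet

# The solution's closed-form expression for the same quantity.
closed_form = 100 * 1.1**3 - 20 * 1.1**2 - 20 * 1.1

print(round(balance, 2))       # 86.9
print(round(closed_form, 2))   # 86.9
```

Both agree on $86.90$, which is the amount available to withdraw at $t=3$.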