I have trouble solving the following problem algebraically.
Smith lends $\$1000$ to Jones at time $t=0$. Jones is supposed to repay Smith by paying $\$100$ at times $t=1$ and $t=2$, and $\$1000$ at time $t=3$. However, at time $t=1.5$ Smith sells Brown the right to receive the remaining payments, for $\$1000$. Let $j$ be the interest rate earned per unit time by Smith and $k$ be that earned by Brown. Which is larger, $j$ or $k$?
Intuitively, it seems clear to me that $j>k$, because Smith gets his $\$1000$ back earlier than Brown does, and each of them also receives an extra $\$100$.
However, I cannot prove this inequality for some reason.
The following is what I tried.
The total amount Smith is going to have at $t=4$, $S$, can be calculated as
$$S=-1000(1+j)^4+100(1+j)^3 +1000(1+j)^{2.5}$$
Likewise, the total amount Brown is going to have at $t=4$, $B$, can be calculated as
$$B=-1000(1+k)^{2.5}+100(1+k)^2+1000(1+k)$$
I don't know the relationship between $S$ and $B$ besides the fact that $S>B$. Can someone help me out?
I have a feeling that I am not doing this problem correctly.
For me the best way of thinking about problems of this kind is to notice that the unknown rate $j$ is the interest rate of an account with the same cash flow. Imagine Smith deposits $\$1000$ into an account earning rate $j$ per annum, withdraws $\$100$ after one year, and after 1.5 years the account holds exactly the $\$1000$ he receives from Brown. We get
$(1000(1+j) - 100)(1+j)^{0.5}=1000$, and this can easily be solved for $j$.
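Brown's rate $k$ can be set up the same way: he pays $\$1000$ at $t=1.5$, then receives $\$100$ half a year later and $\$1000$ at $t=3$, giving $(1000(1+k)^{0.5}-100)(1+k)=1000$. Neither equation has a tidy closed form, so here is a quick numerical check by bisection (a sketch; the function names are mine, not part of the problem):

```python
def bisect(f, lo, hi, tol=1e-12):
    """Root of f on [lo, hi], assuming f changes sign on that interval."""
    flo = f(lo)
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if (f(mid) < 0) == (flo < 0):
            lo, flo = mid, f(mid)
        else:
            hi = mid
    return (lo + hi) / 2.0

def smith(j):
    # Smith: pays 1000 at t=0, receives 100 at t=1 and 1000 at t=1.5.
    # His rate j makes the present value of the cash flow zero.
    return -1000 + 100 / (1 + j) + 1000 / (1 + j) ** 1.5

def brown(k):
    # Brown: pays 1000 at t=1.5, receives 100 and 1000 at 0.5 and 1.5
    # years after the purchase, respectively.
    return -1000 + 100 / (1 + k) ** 0.5 + 1000 / (1 + k) ** 1.5

j = bisect(smith, 0.0, 1.0)
k = bisect(brown, 0.0, 1.0)
print(f"j = {j:.4%}, k = {k:.4%}")
```

Printing both rates lets you compare them directly and check whether the intuition about which is larger actually holds.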