I have a question regarding an annuity loan calculation, and I would like to check whether the following hypothesis is correct:
Consider an annuity loan $L_{1}$ with a principal of $T_{1}$, an interest rate of $i_{1} = 5.5\%/12$ per month, and a duration of $D_{1}=1$ year, i.e. $n = 12$ monthly payments. The annuity would be $J_{1} = \frac{i_{1}}{1-(1+i_{1})^{-n}} \cdot T_{1} \approx 85.84$ (the numerical value corresponds to $T_{1} = 1000$).
Suppose the borrower cannot pay the full annuity $J_{1}$, but can pay at most $Lim < J_{1}$ per month. The lender is then willing to provide an additional loan $L_{2}$ to cover the difference, with a principal of $J_{1}-Lim$, the same interest rate $i_{1}$, and the same duration $D_{1}$. In the second month a new loan $L_{3}$ is created to cover that month's shortfall, and so on. The same rule applies recursively to the cover loans themselves: whenever the total payment due in a month exceeds $Lim$, the excess is financed by yet another new loan. This continues until all loans are repaid.
Hypothesis: I hypothesize that this recursive loan coverage process (first case) is equivalent to the single loan $L_{1}$ with annuity $J = Lim$ and a correspondingly longer duration (second case), and therefore gives the lender the same return, i.e. the total amount of interest paid in the first case is the same as in the second case.
Since this is a toy example in my head, I have not been able to prove this hypothesis yet. Can someone give me insight into how to handle the recursive loan coverage process? I can imagine the two cases are not exactly the same, in which case I would like a formula (or at least a method) that quantifies the difference.
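In the absence of a closed form, one way to probe the hypothesis is to simulate both processes directly. The sketch below is my own construction (the example values $T_{1}=1000$, $Lim=80$ and the function names are my choices, not part of the problem statement): it tracks every cover loan individually in the first case and the aggregate balance in the second, and it assumes $Lim$ exceeds one month's interest on $T_{1}$ so that the debt actually shrinks.

```python
def annuity(principal, i, n):
    """Fixed monthly payment: J = i / (1 - (1+i)^-n) * T."""
    return i / (1 - (1 + i) ** (-n)) * principal

def simulate_recursive(T1, i, n, lim):
    """Case 1: every monthly shortfall (total due minus lim) is financed by
    a new n-month annuity loan at the same rate. Returns (total cash paid
    out of pocket by the borrower, months until all loans are cleared)."""
    loans = [[T1, annuity(T1, i, n)]]    # each entry: [balance, fixed payment]
    paid, month = 0.0, 0
    while loans:
        month += 1
        due = 0.0
        for loan in loans:
            loan[0] *= 1 + i             # accrue one month of interest
            pay = min(loan[1], loan[0])  # final payment may be partial
            loan[0] -= pay
            due += pay
        loans = [l for l in loans if l[0] > 1e-9]
        shortfall = max(due - lim, 0.0)
        paid += due - shortfall          # borrower pays at most lim
        if shortfall > 1e-9:             # lender finances the remainder
            loans.append([shortfall, annuity(shortfall, i, n)])
    return paid, month

def simulate_single(T1, i, lim):
    """Case 2: one loan, same rate, fixed payment lim, open-ended duration."""
    bal, paid, month = T1, 0.0, 0
    while bal > 1e-9:
        month += 1
        bal *= 1 + i
        pay = min(lim, bal)
        bal -= pay
        paid += pay
    return paid, month

i1 = 0.055 / 12
tot_r, m_r = simulate_recursive(1000.0, i1, 12, 80.0)
tot_s, m_s = simulate_single(1000.0, i1, 80.0)
print(f"recursive coverage: interest {tot_r - 1000:.2f} paid over {m_r} months")
print(f"single long loan:   interest {tot_s - 1000:.2f} paid over {m_s} months")
```

In my quick runs the totals come out close but not identical: while shortfalls occur the aggregate balance evolves exactly as in the single-loan case, but once the scheduled payments on the cover loans drop below $Lim$, the first case amortizes more slowly and accrues somewhat more interest over a longer horizon. I would still appreciate an analytical treatment.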