I have a time-series optimization problem. I am essentially trying to determine the weights to assign to various investments, based on the interest rate each investment pays at the time the investment is made, with the goal of maximizing the total interest earned.
Below is a simplification of the problem:
\begin{array}{ll} \text{maximize} & \sum_{i=0}^{m}\sum_{j=0}^{n} RATE_{ij} \cdot INV_{ij} \\ \text{where} & i = \text{period in time} \\ & j = \text{type of investment} \\ & RATE = \text{interest rate as a percentage} \\ & INV = \text{amount invested} \end{array}
The constraints are somewhat complicated but not especially material to the question. The key point is that the results of prior periods determine the funds available to invest in the current period.
This problem is quite easy to solve as a linear program (LP) if you assume the interest rates are known (i.e., you make an interest rate projection).
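For concreteness, here is a minimal sketch of the deterministic version using `scipy.optimize.linprog`, with two periods, two investments, made-up rates, and a toy reinvestment constraint (all numbers are placeholders, not my real data):

```python
import numpy as np
from scipy.optimize import linprog

# Toy data: rates[i, j] = rate in period i for investment j (placeholder values).
rates = np.array([[0.05, 0.03],
                  [0.04, 0.06]])
budget = 100.0  # funds available at period 0

# Decision variables, flattened: x = [INV_00, INV_01, INV_10, INV_11].
# linprog minimizes, so negate the rates to maximize total interest.
c = -rates.flatten()

# Constraint 1: period-0 investments cannot exceed the initial budget.
# Constraint 2: period-1 investments cannot exceed period-0 principal + interest.
A_ub = [[1.0, 1.0, 0.0, 0.0],
        [-(1 + rates[0, 0]), -(1 + rates[0, 1]), 1.0, 1.0]]
b_ub = [budget, 0.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None))
total_interest = -res.fun
print(res.x, total_interest)  # puts everything in the best rate each period
```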
Still, predicting interest rates is inherently difficult. I have a gut feeling that there must be a way to feed multiple interest rate scenarios into a program and find a path (or group of similar paths) that is optimal across the range of rate scenarios considered.
Thus, you would have a path that was resilient regardless of the interest rate outcomes.
Hoping someone could point me in the right direction in terms of learning. Is this still an LP problem? Or does it call for some other algebraic approach? Or should I be considering machine/deep learning techniques?