I've been trying to solve an optimization problem, but so far I haven't been able to find a decent solution. The setup is the following: I need sufficient conditions under which an event happens with a certain probability.
- We have two (strictly positive) variables to play with, $m$ and $t$, and two parameters, fixed but otherwise unknown, $p$ and $\delta$. I need to find the minimum $m$ which both makes the event happen and guarantees the given probability $p$: $\Pr\{E(\delta)\}\geq p$.
- In order for the event $E$ to happen, we need $m\geq m_1(t,\delta)\triangleq\alpha(\delta)+\beta(\delta)t+\gamma(\delta)t^2$; the Greek letters are non-negative functions of $\delta$, so for fixed $\delta$ they act as fixed non-negative parameters.
- In order to have an acceptable probability, we need $t\geq t_1(p) $
With this in mind, the first approach is to solve the following:
minimize $\quad m$
subject to:
$m\geq m_1(t,\delta)\rightarrow$ for the event E to happen.
$t\geq t_1(p) \quad \,\,\,\,\rightarrow$ for the necessary probability.
This problem is solved by setting $(m,t)=(m_1(t_1),t_1)$, since $m_1$ is non-decreasing in $t$ (the coefficients are non-negative). A graphical interpretation can be seen here: Graphical Solution (look at point 'D').
My main problem is that the minimum, $m_1(t_1)$, is not low enough for what I want (a bit vague, I know :( ). Even worse, if I disregard the probability restriction and just 'hope' to be lucky, the lowest $m$ I could get would be $\alpha$ (setting $t=0$), and this value is still too high.
I know I'm changing the rules a little, but the only way out is to allow $t$ to be a function of $m$ (weird, but I'm allowed to do that). This way the intersection with $t=t_1$ in the image can move freely. Following that line, I would hope that if $dm/dt<0$ under some conditions, I could find an $m\ll\alpha$ which fulfills both the event and probability restrictions.
This new problem is stated as follows:
$\underset{\{m,t(m)\}}{\text{minimize}}\quad m$
subject to:
$m\geq m_1(t(m),\delta)\rightarrow$ for the event E to happen.
$t(m)\geq t_1(p) \quad \,\,\rightarrow$ for the necessary probability.
$t(m)\geq 0$ for all $m\geq m_{\text{argmin}}$
As I am looking for sufficient conditions, I expect the solution to depend on the Greek letters. I don't know whether the problem is convex (I suspect it is not), but the KKT conditions should be enough to extract the conditions (and hopefully the optimum $(t_{\text{argmin}},m_{\text{argmin}})$).
- Should I keep going with writing the Lagrangian and solving the KKT conditions? If so, where do the Lagrange multipliers live? Are they scalars or functions? Should I try something like a Fréchet derivative for $t(m)$? In all those cases, I think at most I would get information about $t_{\text{argmin}}(m_{\text{argmin}})$ and $dt_{\text{argmin}}(m_{\text{argmin}})/dm$, but not the whole function $t_{\text{argmin}}(m)$. Should I even care about it?
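For what it's worth, for the original (finite-dimensional) problem the Lagrangian and KKT conditions are straightforward, with scalar multipliers; the sketch below uses the standard sign convention and is not yet the functional version with $t(m)$:

```latex
% Lagrangian with scalar multipliers \lambda_1, \lambda_2 \ge 0:
\mathcal{L}(m,t,\lambda_1,\lambda_2)
  = m - \lambda_1\bigl(m - \alpha - \beta t - \gamma t^2\bigr)
      - \lambda_2\,(t - t_1)

% Stationarity:
\frac{\partial\mathcal{L}}{\partial m} = 1 - \lambda_1 = 0
  \quad\Rightarrow\quad \lambda_1 = 1,
\qquad
\frac{\partial\mathcal{L}}{\partial t} = \lambda_1(\beta + 2\gamma t) - \lambda_2 = 0

% Complementary slackness:
\lambda_1\bigl(m - m_1(t,\delta)\bigr) = 0,
\qquad
\lambda_2\,(t - t_1) = 0
```

Since $\lambda_1=1>0$, the event constraint is active ($m=m_1(t)$), and $\lambda_2=\beta+2\gamma t>0$ whenever $\beta$ or $\gamma$ is positive, so $t=t_1$: the KKT conditions recover point 'D'.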
- Is there another way to find such a function $t(m)$?
The solution to this exact problem is written here. The question remains, however, of how to look for functions in such a scenario. The existence of such a function $t(m)$ can be thought of as adding an extra constraint.
So, we had an original problem, with some constraints, and a new problem, with one more constraint. This must lead to one of three scenarios: the new constraint is inactive and the optimum is unchanged, the new constraint is active and the optimum is worse, or the new constraint makes the problem infeasible.
Summary: Considering a new function between the variables is just an additional restriction, so there is no way the optimum can be better.