I would like to find a polynomial $P(x)=\sum_{d=1}^D P_d x^d$ of degree $D$ whose derivative is greater than or equal to a given pdf $f(x)$ on $[0,1-\epsilon]$, for any $\epsilon>0$. Note that we can calculate the CDF $F(x)$, and that $F(0)=0$ and $F(1)=1$. Furthermore, both $f(x)$ and $F(x)$ are monotonically increasing in $x$ on $[0,1]$ and are convex. Also, $\lim_{x\to 1} f(x)=\infty$. The optimization problem is given below:
$\text{minimize}~~ \sum_{d=1}^D d\, P_d$
s.t.
$i)~ P'(x)=\sum_{d=1}^D d\, P_d x^{d-1}\ge f(x)$ for $0\le x\le 1-\epsilon$
$ii)~ \sum_{d=1}^D P_d =1$
$iii)~ P_d\ge 0$.
Is there a general theorem that explains this optimization problem? Is there a way to prove that for any $0<\epsilon<1$ there exists a $D$ for which the above problem is feasible? I can solve the problem numerically using linear programming, but I need to mathematically prove the feasibility of this optimization. I would really appreciate it if you could help me.
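For concreteness, here is a minimal sketch of the LP I solve numerically, with `scipy.optimize.linprog` and the derivative constraint enforced on a finite grid of collocation points. The specific pdf $f(x)=\frac{1}{2\sqrt{1-x}}$ (CDF $F(x)=1-\sqrt{1-x}$) and the values of `D`, `eps`, and the grid size are illustrative assumptions, not part of the question:

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative pdf satisfying the stated assumptions: increasing and convex
# on [0,1), f(x) -> infinity as x -> 1, CDF F(x) = 1 - sqrt(1-x) with
# F(0) = 0 and F(1) = 1. (An assumed example, not from the question.)
f = lambda x: 0.5 / np.sqrt(1.0 - x)

D = 30                                   # assumed degree (large enough here)
eps = 0.1                                # assumed epsilon
grid = np.linspace(0.0, 1.0 - eps, 400)  # collocation points for constraint (i)

d = np.arange(1, D + 1)
c = d.astype(float)                      # objective: minimize sum_d d * P_d

# Constraint (i): sum_d d * P_d * x^(d-1) >= f(x) at each grid point,
# written in linprog's form  -A @ P <= -f(grid).
A_ub = -(d[None, :] * grid[:, None] ** (d - 1)[None, :])
b_ub = -f(grid)

# Constraint (ii): sum_d P_d = 1
A_eq = np.ones((1, D))
b_eq = np.array([1.0])

# Constraint (iii): P_d >= 0 via the bounds argument
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * D)

print(res.status)  # status 0 means an optimal (hence feasible) point exists
```

Of course, this only checks feasibility at the grid points for one particular $f$, $D$, and $\epsilon$; it does not replace the proof I am asking for.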