I have a function $f(x)$ whose maximum appears to be analytically intractable and numerically difficult to find. I have simple expressions for upper and lower bounds on this function ($f_{LB}(x) \leq f(x) \leq f_{UB}(x)$) whose optima I can find numerically. Let the maximizer of the lower bound be denoted $x^*_{LB}$, the maximizer of the upper bound $x^*_{UB}$, and the maximizer of the original function $x^*$. Based on simulations, it appears that $x^*_{LB} \leq x^* \leq x^*_{UB}$. How would one go about proving this? If general techniques are not applicable, are there specific instances of people solving similar problems?
More specifically, the function $f(x)$ involves a sum of a sequence of random variables $\{Y_i\}$:
$$f(x) = \mathbb{P}\left(\sum_{i=1}^x Y_i > x\right) \frac{x}{x+c}$$
where $c$ is some constant and the sequence of RVs is positive with non-increasing means: $\mathbb{E}[Y_j] \geq \mathbb{E}[Y_k]$ for all $j\leq k$. Define $S_x=\sum_{i=1}^x Y_i$. The probability in this expression can be bounded using simple concentration inequalities:
$$\mathbb{P}\left(S_x > x\right) \leq \frac{\mathbb{E}[S_x]}{x} \quad \text{(Markov)}$$
$$\mathbb{P}\left(S_x > x\right) \geq \frac{(1-x/\mathbb{E}[S_x])^2\, \mathbb{E}[S_x]^2}{\text{Var}[S_x]+(1-x/\mathbb{E}[S_x])^2\, \mathbb{E}[S_x]^2} \quad \text{(Paley–Zygmund, valid for } x < \mathbb{E}[S_x]\text{)}$$
Then
$$f_{UB}(x) = \frac{\mathbb{E}[S_x]}{x+c} $$
and
$$f_{LB}(x) = \frac{x(1-x/\mathbb{E}[S_x])^2 \mathbb{E}[S_x]^2}{(x+c)(\text{Var}[S_x]+(1-x/\mathbb{E}[S_x])^2 \mathbb{E}[S_x]^2)}$$
Below is a plot which illustrates the presumed relationship. I've run this simulation many times with many parameter settings and have yet to come across an instance in which $x^*_{LB} \leq x^* \leq x^*_{UB}$ failed, though clearly this is not a proof.
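For reference, here is a minimal sketch of the kind of simulation described, for one hypothetical concrete instance: independent exponential $Y_i$ with means $\mu_i = 2/(1+0.05\,i)$ and $c=5$. The distribution and all parameters are assumptions chosen purely for illustration, since the question does not fix them.

```python
import numpy as np

# Hypothetical instance (an assumption, not fixed by the question):
# independent Y_i ~ Exponential with decreasing means mu_i = 2/(1+0.05 i).
rng = np.random.default_rng(0)
c = 5.0
n_max = 60
xs = np.arange(1, n_max + 1)
mu = 2.0 / (1 + 0.05 * xs)          # E[Y_i], monotonically decreasing
ES = np.cumsum(mu)                  # E[S_x]
VS = np.cumsum(mu ** 2)             # Var[S_x] (exponential: var = mean^2)

# Monte Carlo estimate of f(x) = P(S_x > x) * x / (x + c)
n_sims = 20_000
Y = rng.exponential(mu, size=(n_sims, n_max))  # one sample path per row
S = np.cumsum(Y, axis=1)                       # S_x along each path
f_mc = (S > xs).mean(axis=0) * xs / (xs + c)

# Closed-form bounds (Markov and Paley-Zygmund). PZ only applies when
# x < E[S_x], so theta is clipped at 0, making the bound trivially 0 there.
f_ub = ES / (xs + c)
theta = np.clip(1.0 - xs / ES, 0.0, None)
f_lb = xs * theta**2 * ES**2 / ((xs + c) * (VS + theta**2 * ES**2))

x_lb = xs[np.argmax(f_lb)]
x_star = xs[np.argmax(f_mc)]   # Monte Carlo proxy for x*
x_ub = xs[np.argmax(f_ub)]
print(x_lb, x_star, x_ub)
```

Swapping in other distributions or parameters only requires changing `mu` and the sampling line; the three printed values are $x^*_{LB}$, a Monte Carlo proxy for $x^*$, and $x^*_{UB}$, so the script can be used to probe whether the ordering holds for a given parameter choice.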

A sufficient condition would be $f'_{LB}(x) \le f'(x) \le f'_{UB}(x)$, together with $f'_{LB}(x) > 0$ for $x < x^*_{LB}$ and $f'_{UB}(x) < 0$ for $x > x^*_{UB}$.
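To see why these conditions would suffice (assuming $f$ is differentiable on the region of interest), the argument is a two-line monotonicity check:
$$f'(x) \geq f'_{LB}(x) > 0 \text{ for } x < x^*_{LB} \implies f \text{ strictly increasing on } (0, x^*_{LB}) \implies x^* \geq x^*_{LB}$$
$$f'(x) \leq f'_{UB}(x) < 0 \text{ for } x > x^*_{UB} \implies f \text{ strictly decreasing on } (x^*_{UB}, \infty) \implies x^* \leq x^*_{UB}$$
Note that $f_{LB}(x) \leq f(x) \leq f_{UB}(x)$ alone does not imply the derivative ordering $f'_{LB}(x) \le f'(x) \le f'_{UB}(x)$, so that ordering would have to be established separately for these particular bounds.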