$$A_{\lambda}(f) = B(f) + \lambda C(f)$$ where $\lambda \in (0,\infty)$, and $A_{\lambda}$, $B$, $C$ are nonnegative, quadratic, convex functionals.
Let $f_{\lambda}$ be the minimizer of $A_{\lambda}(f)$ over a convex set $S$. I then need to maximize $D(f_{\lambda})$ over $\lambda \in (0,\infty)$, where $D$ is again a nonnegative, quadratic, convex functional.
I want to eliminate $\lambda$ and combine the two stages into a single problem; otherwise I cannot solve it.
So I thought of combining the two problems into the single problem of minimizing $$J(f) = \frac{B(f)+C(f)}{D(f)}$$ over the set $S$. (I have set $\lambda = 1$, since it no longer seems necessary; I am not sure this is correct.)
Is this a correct approach? If it is, one pitfall is that $J(f)$ is not convex. Is there a way out, other than combining the two problems in this way?
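To see the non-convexity concretely, here is a hypothetical one-dimensional instance (my own choice of $B$, $C$, $D$, not from the problem) where the ratio of convex quadratics fails the midpoint convexity test:

```python
# Numerical check that J(f) = (B(f) + C(f)) / D(f) need not be convex even
# when B, C, D are all nonnegative convex quadratics.  Hypothetical 1-D
# instance: B(f) = (f - 1)^2, C(f) = f^2, D(f) = (f + 1)^2.

def J(f):
    B = (f - 1) ** 2
    C = f ** 2
    D = (f + 1) ** 2
    return (B + C) / D

# Midpoint-convexity test: a convex J must satisfy
#   J((a + b) / 2) <= (J(a) + J(b)) / 2  for all a, b.
a, b = 3.0, 7.0
mid = J((a + b) / 2)     # J(5) = 41/36 ≈ 1.1389
avg = (J(a) + J(b)) / 2  # (13/16 + 85/64)/2 ≈ 1.0703
print(mid > avg)         # True -> midpoint convexity fails, J is not convex
```
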
The minimizer of $J(f)$ (if you can find it) might not be a minimizer of $B(f) + \lambda C(f)$ for any $\lambda$, so minimizing $J(f)$ will not necessarily solve the original problem.
I'd suggest you try finding an analytic expression for the $f$ that minimizes $A_\lambda(f)$, as a function of $\lambda$. Given that $B$ and $C$ are quadratic and $S$ is convex, you might find a simple expression (without knowing more about $B$, $C$ and $S$ I can't propose one at the moment). So, find the form of $g(\lambda)$ such that:
$g(\lambda) = \text{argmin}_f\quad B(f) + \lambda C(f)\quad \text{s.t.} \quad f \in S$
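For instance, ignoring the constraint $S$ for a moment and writing the quadratic forms (my own hypothetical parametrization) as $B(f) = \tfrac{1}{2}f^\top Q_B f - b^\top f$ and $C(f) = \tfrac{1}{2}f^\top Q_C f - c^\top f$ with $Q_B, Q_C \succeq 0$, setting the gradient of $B(f) + \lambda C(f)$ to zero gives the closed form

$$g(\lambda) = (Q_B + \lambda Q_C)^{-1}\,(b + \lambda c),$$

assuming $Q_B + \lambda Q_C$ is invertible. With an active constraint set $S$ the expression changes, but it may still be explicit in $\lambda$ piecewise.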
Once you have that expression, try maximizing $D$, not as a function of a general $f$ but as a function of $g(\lambda)$:
$\text{max}_\lambda\quad D(g(\lambda)) \quad \text{s.t.} \quad \lambda > 0$
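A minimal sketch of this two-stage approach, under assumed concrete forms (all hypothetical, since the question does not specify $B$, $C$, $D$, or $S$): take $S = \mathbb{R}^n$, $B(f) = \|Mf - y\|^2$, $C(f) = \|f\|^2$, $D(f) = \|Pf\|^2$, so that the inner minimizer has the ridge-regression closed form $g(\lambda) = (M^\top M + \lambda I)^{-1} M^\top y$, and the outer problem reduces to a 1-D search:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical problem data (the question does not specify B, C, D, or S).
rng = np.random.default_rng(0)
n = 5
M = rng.standard_normal((8, n))
y = rng.standard_normal(8)
P = rng.standard_normal((3, n))

def g(lam):
    """argmin_f ||M f - y||^2 + lam * ||f||^2  (unconstrained closed form)."""
    return np.linalg.solve(M.T @ M + lam * np.eye(n), M.T @ y)

def D(f):
    return np.dot(P @ f, P @ f)

# Outer problem: maximize D(g(lambda)) over lambda > 0.  A bounded 1-D
# search in log-space enforces positivity of lambda.
res = minimize_scalar(lambda t: -D(g(np.exp(t))),
                      bounds=(-10, 10), method="bounded")
lam_star = np.exp(res.x)
print(lam_star, D(g(lam_star)))
```

The log-space substitution $\lambda = e^t$ is just one convenient way to keep $\lambda$ strictly positive; any 1-D method works once $D(g(\lambda))$ is available as a scalar function.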
Feel free to share more details if you have them about the specific problem you are trying to solve!