In the context of research on $f$-divergences, I am interested in the following problem.
Suppose we have a convex function $f:(0,+\infty)\rightarrow (-\infty,+\infty]$, and let $f(0)=\lim_{t\to 0^+} f(t)$ and $0 f(1/0) = \lim_{t\to 0^+} tf(1/t)$. It can be proved that the function $t\mapsto t f(1/t)$ is also convex, and hence neither $f(0)$ nor $0f(1/0)$ can be $-\infty$. Define the auxiliary function $g:[0,+\infty)^2 \rightarrow (-\infty, +\infty]$ by, for any $(a,b)\in [0,+\infty)^2$, \begin{align*} g(a,b)= \begin{cases} 0&\text{if } a=b=0\newline b f(0)&\text{if } a=0<b\newline a\, 0f(1/0)&\text{if } b=0<a\newline b f(a/b)&\text{otherwise.} \end{cases} \end{align*} This function is essentially $b f(a/b)$, extended by its limiting values when $a$ or $b$ is $0$.
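For concreteness, here is a minimal sketch of $g$ for one particular choice of $f$, namely $f(t)=t\log t$ (the generator of the KL divergence), for which $f(0)=0$ and $0f(1/0)=+\infty$; the function names are just for illustration:

```python
import math

def f(t):
    """Concrete example: f(t) = t*log(t), convex on (0, +inf)."""
    return t * math.log(t)

F0 = 0.0        # f(0)    = lim_{t->0+} t*log(t)   = 0 for this f
ZF = math.inf   # 0f(1/0) = lim_{t->0+} t*f(1/t) = lim log(1/t) = +inf

def g(a, b):
    """b*f(a/b) extended by its limiting values when a or b is 0."""
    if a == 0 and b == 0:
        return 0.0
    if a == 0:
        return b * F0      # case a = 0 < b
    if b == 0:
        return a * ZF      # case b = 0 < a
    return b * f(a / b)
```

The case order matters only at $(0,0)$, which must be tested first so that the convention $g(0,0)=0$ overrides the other limits.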
Then for a measure $\mu$ and two non-negative $\mu$-integrable functions $u$ and $v$, \begin{align*} \int g(u(x),v(x))\, d\mu(x) \geq g\left( \int u\, d\mu, \int v\, d\mu \right). \end{align*} This statement can be proved using Jensen's inequality when $\mu(\lbrace u=0 \lor v=0\rbrace) = 0$, because then $u/v$ is well defined $\mu$-almost everywhere. Similarly, if $\int u\, d\mu=0$ (respectively $\int v\, d\mu = 0$), then equality is attained, since $u=0$ (respectively $v=0$) $\mu$-almost everywhere and only one case of the definition of $g$ ($a=0$ or $b=0$) is involved.
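As a quick numerical sanity check (not a proof), one can take $\mu$ to be the counting measure on finitely many points and $f(t)=t\log t$, in which case the inequality reduces to the log-sum inequality; the setup below is just for illustration:

```python
import math
import random

def g(a, b):
    """g for f(t) = t*log(t): f(0) = 0 and 0f(1/0) = +inf."""
    if a == 0 and b == 0:
        return 0.0
    if a == 0:
        return 0.0                    # b * f(0)
    if b == 0:
        return math.inf               # a * 0f(1/0)
    return a * math.log(a / b)        # b * f(a/b) simplifies to this

random.seed(0)
u = [random.random() for _ in range(100)]
v = [random.random() for _ in range(100)]

lhs = sum(g(ui, vi) for ui, vi in zip(u, v))  # "integral" of g(u, v)
rhs = g(sum(u), sum(v))                        # g of the "integrals"
```

Running this with any seed, `lhs >= rhs` should hold, matching the claimed inequality for this discrete $\mu$.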
In order to prove that inequality, my attempt is the following. [The following is the inequality I fail to prove in a satisfying way; this makes me doubt the correctness of my proof.] For any $(a_0,b_0)\in (0,+\infty)^2$ ($0$ excluded), there exists $q\in \mathbb R$ such that for any $(a,b)\in [0,+\infty)^2$ ($0$ included), \begin{align*} b_0 g(a,b) \geq q(a b_0-a_0 b) + b g(a_0,b_0). \end{align*} This comes from taking a sub-gradient $q$ of $f$ at the point $a_0/b_0$ and multiplying both sides of the sub-gradient inequality by $b_0 b$.
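Here is a quick numerical check of this pointwise inequality for the concrete choice $f(t)=t\log t$, where the sub-gradient at $a_0/b_0$ is $q=\log(a_0/b_0)+1$; the grid deliberately includes points with $a=0$ or $b=0$:

```python
import math
import itertools

def g(a, b):
    """g for f(t) = t*log(t): f(0) = 0 and 0f(1/0) = +inf."""
    if a == 0 and b == 0:
        return 0.0
    if a == 0:
        return 0.0                    # b * f(0)
    if b == 0:
        return math.inf               # a * 0f(1/0)
    return a * math.log(a / b)        # b * f(a/b)

a0, b0 = 1.5, 0.7
q = math.log(a0 / b0) + 1.0           # f'(a0/b0) for f(t) = t*log(t)

# Check b0*g(a,b) >= q*(a*b0 - a0*b) + b*g(a0,b0) on a grid with zeros.
checks = []
for a, b in itertools.product([0.0, 0.3, 1.0, 2.5], repeat=2):
    lhs = b0 * g(a, b)                # +inf when b = 0 < a, which is fine
    rhs = q * (a * b0 - a0 * b) + b * g(a0, b0)
    checks.append(lhs >= rhs - 1e-12)
```

For this $f$ the boundary cases go through because `math.inf` dominates any finite right-hand side; of course this does not settle the general question asked below.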
If this is true, then taking $a=u(x)$, $b=v(x)$, $a_0=\int u\, d\mu$ and $b_0=\int v\, d\mu$ (both assumed positive; the zero cases were handled above), and integrating against $\mu$, we get \begin{align*} &&b_0 g(u(x),v(x)) &\geq q\left(u(x) b_0 - v(x) a_0\right) + v(x) g(a_0,b_0)\newline &\Rightarrow&b_0 \int g(u(x),v(x))\, d\mu &\geq q(a_0b_0-a_0b_0) + b_0 g(a_0,b_0)\newline &\Rightarrow&\int g(u(x),v(x))\, d\mu &\geq g(a_0,b_0).\\ \end{align*}
I believe this would be a very nice proof and would also give good control on when equality holds (a point I am highly interested in); however, I don't know how to handle the limit cases when $a$ or $b$ tends to $0$.
The statement is actually easy to prove in the case where $f(a_0/b_0)<\infty$. In that case, if $q$ is a sub-derivative of $f$ at $a_0/b_0$, then for any $x\in(0,\infty)$ \begin{align*} b_0 f(x)\geq q(b_0 x-a_0) + g(a_0, b_0). \end{align*} If $x=a/b$ with $a\neq 0$ and $b\neq 0$, then multiplying by $b$ gives the wanted inequality. If $a=b=0$, then both sides of the inequality are $0$, so it is satisfied. Letting $x$ tend to $0$, we find that $b_0 f(0)\geq -q a_0 + g(a_0,b_0)$, which, after multiplying both sides by $b$, is the inequality for the case $a=0$. Finally, if we multiply the inequality $b_0 f(x)\geq q(b_0 x-a_0) + g(a_0, b_0)$ by $\frac{1}{x}$ and let $x$ tend to $+\infty$, we get $\frac{b_0 f(x)}{x}\geq q(b_0-a_0/x)+\frac{1}{x} g(a_0, b_0) \to q b_0$, hence $b_0\, 0f(1/0)\geq q b_0$, which, after multiplying both sides by $a$, is the wanted inequality when $b=0$.
Now the problem is when $f(a_0/b_0)=\infty$. I think this approach cannot work then, but the fact that $f$ is convex and equal to $\infty$ at $a_0/b_0$ means that it is also equal to $\infty$ on one whole side of $a_0/b_0$ (its effective domain being an interval), and so it should be the case that the inequality $\int g(u(x),v(x))\, d\mu(x) \geq g\left( \int u\, d\mu, \int v\, d\mu \right)$ has $+\infty$ on both sides. I don't know how to prove that, though.
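Not a proof, but here is a sanity check of this "both sides are $+\infty$" intuition in one concrete case: take $f(t)=-\log t$ for $t\le 1$ and $f(t)=+\infty$ for $t>1$ (convex, with $f(0)=0f(1/0)=+\infty$), and a three-point counting measure with $\int u > \int v$; the specific numbers are just for illustration:

```python
import math

def f(t):
    """Convex f that is +inf to the right of 1."""
    return -math.log(t) if t <= 1 else math.inf

def g(a, b):
    if a == 0 and b == 0:
        return 0.0
    if a == 0 or b == 0:
        return math.inf   # both f(0) and 0f(1/0) are +inf for this f
    return b * f(a / b)

# Discrete mu: counting measure on 3 points, with sum(u) > sum(v),
# so g(sum u, sum v) = +inf; the pointwise sum is also +inf because
# u > v on at least one atom.
u = [2.0, 1.0, 0.5]
v = [1.0, 1.0, 1.0]
lhs = sum(g(ui, vi) for ui, vi in zip(u, v))
rhs = g(sum(u), sum(v))
```

Here both `lhs` and `rhs` come out as `math.inf`, consistent with the conjecture, though this obviously says nothing about the general case.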