I am trying to understand partial minimization of a jointly convex function in one variable.
The theorem I am having difficulty understanding is stated as follows.
Let $f: R^n \times R^m \to (-\infty,+\infty]$ be a jointly convex function. Then the function $g(x) = \inf_{y\in R^m} f(x,y)$, $x \in R^n$, is convex.
Could you please help me to understand the theorem through,
- What is the intuition behind a jointly convex function? I would appreciate it if you could explain it using $f(x,y) = x^2 + y^2$.
- What is the intuition behind $\inf_{y\in R^m} f(x,y)$? My understanding is that it is the greatest lower bound of the values of $f$ when we vary $y$ with $x$ fixed. Is that correct?
Appreciate your insight.
The function $f(x,y) = x^2 + y^2$ is jointly convex because its Hessian, $\begin{pmatrix}2 & 0\\0 & 2\end{pmatrix}$, is positive definite. Geometrically, joint convexity means $f$ is convex along every line segment in $R^2$, not merely along the coordinate directions.
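To make this concrete, here is a small numerical sketch of my own (not part of the original argument) that checks the Hessian's eigenvalues and the midpoint inequality along an arbitrary segment in $R^2$:

```python
import numpy as np

# Hessian of f(x, y) = x^2 + y^2 is the constant matrix diag(2, 2).
H = np.array([[2.0, 0.0],
              [0.0, 2.0]])
eigvals = np.linalg.eigvalsh(H)
# Both eigenvalues equal 2 > 0, so H is positive definite.

# Joint convexity means f is convex along ANY segment in R^2,
# not only along the coordinate axes; check the midpoint inequality.
f = lambda p: p[0] ** 2 + p[1] ** 2
rng = np.random.default_rng(0)
p, q = rng.normal(size=2), rng.normal(size=2)
mid = 0.5 * (p + q)
assert f(mid) <= 0.5 * f(p) + 0.5 * f(q)
```

The midpoint check on one random segment is of course only an illustration, not a proof; the positive definite Hessian is what establishes convexity everywhere.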
In contrast, the function $g(x,y)=x\cdot y$ is only separately convex, i.e., $x\mapsto x\cdot y$ and $y\mapsto x\cdot y$ are convex (even linear), no matter what the fixed parameter is. On the other hand, $g$ is not jointly convex: its Hessian is $\begin{pmatrix}0 & 1\\1 & 0\end{pmatrix}$, with eigenvalues $\pm 1$, so it is not positive semidefinite.
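A quick sketch (again my own illustration) showing the indefinite Hessian and a direction along which $x\cdot y$ behaves concavely:

```python
import numpy as np

# Hessian of g(x, y) = x*y is [[0, 1], [1, 0]], which is indefinite:
H = np.array([[0.0, 1.0],
              [1.0, 0.0]])
eigvals = np.linalg.eigvalsh(H)  # one negative, one positive eigenvalue

# Along the direction (1, -1), g(t, -t) = -t^2 is concave, so joint
# convexity fails even though each partial map is linear:
g = lambda x, y: x * y
p, q = (1.0, -1.0), (-1.0, 1.0)
# Midpoint value g(0, 0) = 0 exceeds the average (g(p) + g(q)) / 2 = -1,
# violating the convexity inequality for the pair p, q.
assert g(0.0, 0.0) > 0.5 * (g(*p) + g(*q))
```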
Yes, that is correct: $\inf_{y\in R^m} f(x,y)$ is the greatest lower bound of the values $f(x,y)$ as $y$ ranges over $R^m$ with $x$ fixed (and it need not be attained by any particular $y$).
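For the example $f(x,y)=x^2+y^2$, the partial minimization in the theorem can be sketched numerically; the grid search below is my own approximation of the infimum (which here is attained at $y=0$, giving $g(x)=x^2$, a convex function, as the theorem predicts):

```python
import numpy as np

f = lambda x, y: x**2 + y**2

# Approximate g(x) = inf_y f(x, y) by minimizing over a fine grid of y
# values (a numerical sketch; the true infimum is attained at y = 0).
ys = np.linspace(-5.0, 5.0, 10001)

def g(x):
    return np.min(f(x, ys))

# g(x) = x^2, which is convex:
for x in (-2.0, 0.0, 1.5):
    assert abs(g(x) - x**2) < 1e-9

# Midpoint convexity check for g on a sample pair of points:
a, b = -2.0, 3.0
assert g(0.5 * (a + b)) <= 0.5 * (g(a) + g(b)) + 1e-9
```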