Global maximum of Lagrangian vs global maximum of unconstrained function


Let's assume that $x$ is a matrix of parameters of the quadratic function $y=f(x)$, where $y$ is a scalar. Let's also assume that $x^{*}$ is such that $y^{*}=f(x^{*})$ is the global minimum of $f(x)$ (which exists, as $f$ is convex). Now consider the constrained minimization problem $$\min_{x} \quad f(x) \quad \text{s.t.} \quad cond_1, \dots, cond_n,$$ where we minimize the same function subject to a set of $n$ quadratic equality conditions, via the Lagrangian $$L(x,\lambda_{1},\dots,\lambda_{n}) = f(x) - \sum_{i}\lambda_{i}\,cond_{i}.$$
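To make the setup concrete, here is a small toy instance (my own example, with an assumed convex quadratic $f$ and a single quadratic constraint, not part of the original problem) where the stationary points of the Lagrangian are found symbolically:

```python
import sympy as sp

# Assumed toy instance: a convex quadratic f with a single
# quadratic equality constraint cond = 0.
x1, x2, lam = sp.symbols('x1 x2 lam', real=True)
f = x1**2 + 2*x2**2          # convex quadratic objective
cond = x1**2 + x2**2 - 1     # quadratic equality constraint
L = f - lam * cond           # Lagrangian L(x, lambda) = f - lambda*cond

# Stationary points: all partial derivatives of L equated to 0.
stationarity = [sp.diff(L, v) for v in (x1, x2, lam)]
sols = sp.solve(stationarity, [x1, x2, lam], dict=True)

# Several stationary points exist; the constrained minimum is the
# one with the smallest objective value f(x).
values = [f.subs(s) for s in sols]
```

In this instance there are four stationary points, $(\pm 1, 0)$ with $\lambda = 1$ and $(0, \pm 1)$ with $\lambda = 2$, which illustrates how the constrained problem can have multiple solutions.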

Let's assume this constrained minimization problem has multiple solutions (i.e. we find several stationary points). We want to test whether one of these solutions, denoted $x^{sol}$, together with its multipliers $\lambda^{sol}_{1},\dots,\lambda^{sol}_{n}$ (found, as usual, by solving the system obtained by equating to $0$ the partial derivatives of $L$ with respect to $x$ and $\lambda_{1},\dots,\lambda_{n}$), corresponds to a global minimum of the Lagrangian at $(x^{sol}, \lambda^{sol}_{1},\dots,\lambda^{sol}_{n})$.

My question is: in order to prove that $L(x^{sol}, \lambda^{sol}_{1},\dots,\lambda^{sol}_{n})$ is a global minimum of $L$, is it sufficient to prove that $x^{sol}=x^{*}$, so that $f(x^{sol})=f(x^{*})$ is the global minimum of the unconstrained function?

In other words, given a global minimum $x^{*}$ of $f(x)$: if $x^{*}$ satisfies the $n$ conditions $cond_1, \dots, cond_n$, is it automatically a global minimum of the Lagrangian $L(x,\lambda_{1},\dots,\lambda_{n})$, without any need for a further formal proof?
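One way to probe this question concretely (a sketch on an assumed toy problem, not a general proof): since $f$ and the $cond_i$ are quadratic, $L(x,\lambda)$ is quadratic in $x$ for fixed $\lambda$, so a stationary point $x^{sol}$ is a global minimum of $L(\cdot,\lambda^{sol})$ exactly when the Hessian of $L$ in $x$ at $\lambda^{sol}$ is positive semidefinite. That condition can be checked symbolically:

```python
import sympy as sp

# Assumed toy problem (for illustration only): convex quadratic f,
# one quadratic equality constraint.
x1, x2, lam = sp.symbols('x1 x2 lam', real=True)
f = x1**2 + 2*x2**2
cond = x1**2 + x2**2 - 1
L = f - lam * cond

# Candidate solution of the constrained problem: x_sol = (1, 0),
# lam_sol = 1. It satisfies all stationarity conditions of L:
grad = [sp.diff(L, v).subs({x1: 1, x2: 0, lam: 1}) for v in (x1, x2, lam)]

# For fixed lam = lam_sol, L is quadratic in x, so a stationary
# point is a global minimum iff the Hessian of L in x is PSD.
H = sp.hessian(L.subs(lam, 1), (x1, x2))
psd = all(ev >= 0 for ev in H.eigenvals())
```

Here the Hessian is only positive semidefinite (one zero eigenvalue), so $x^{sol}$ is a non-strict global minimizer of $L(\cdot,\lambda^{sol})$ in this example; with an indefinite Hessian the same stationary point would instead be a saddle point of the Lagrangian.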