Minimizing a quadratic function with constraints on some variables


Consider a problem with a strictly convex quadratic objective in which some of the variables are unconstrained.

minimize: $x_1^TP_{11}x_1 + 2x_1^TP_{12}x_2 + x_2^TP_{22}x_2$

subject to: $f_i(x_1) \leq0, i = 1, \dots , m.$

Here we can analytically minimize over $x_2$:

$\inf_{x_2} \ (x_1^TP_{11}x_1 + 2x_1^TP_{12}x_2 + x_2^TP_{22}x_2) = x_1^T(P_{11} - P_{12}P_{22}^{-1}P_{12}^T)x_1\,.$

Therefore the original problem is equivalent to:

minimize: $x_1^T(P_{11} - P_{12}P_{22}^{-1}P_{12}^T)x_1$

subject to: $f_i(x_1) \leq0, i = 1, \dots , m.$

This is an example from my book, but I don't understand it. Specifically, I don't see how $\inf_{x_2} \ (x_1^TP_{11}x_1 + 2x_1^TP_{12}x_2 + x_2^TP_{22}x_2) = x_1^T(P_{11} - P_{12}P_{22}^{-1}P_{12}^T)x_1$ works.

Best Answer

If the vector variable is $x_2$ and we have a constant column vector $w,$ then $w^T x_2$ is a scalar function; as a row its gradient is $w^T,$ as a column its gradient is $w.$ The relevant term above is $2 x_1^T P_{12} x_2,$ so $w^T = 2 x_1^T P_{12},$ and hence the (column) gradient with respect to $x_2$ is $w = 2 P_{12}^T x_1.$ No symmetry assumption on $P_{12}$ is needed here; in general $P_{12}$ is not even square.
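One way to see the rule for the linear term is to differentiate componentwise:
$$ \frac{\partial}{\partial (x_2)_j} \sum_i w_i (x_2)_i = w_j \quad \text{for each } j, \qquad \text{i.e.} \quad \nabla_{x_2} \left( w^T x_2 \right) = w. $$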

The gradient of $x_2^T P_{22} x_2$ is $2 P_{22} x_2$ as a column. We need to assume $P_{22}$ is symmetric positive definite to be sure the infimum over $x_2$ is finite and attained at a minimizer.
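For the quadratic term, the same componentwise computation gives the standard identity
$$ \nabla_{x} \left( x^T A x \right) = (A + A^T) x, $$
which reduces to $2 A x$ when $A$ is symmetric; this is where the symmetry of $P_{22}$ is used.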

As a function of $x_2,$ the gradient (as a column) of your quadratic is $$ 2 P_{12}^T x_1 + 2 P_{22} x_2 \; ; $$ set this to the zero vector. If $P_{22}$ is symmetric positive definite it has an inverse, so $$ x_2^\star = -P_{22}^{-1} P_{12}^T x_1\,. $$ Plugging this back into the objective gives $$ x_1^T P_{11} x_1 - 2\, x_1^T P_{12} P_{22}^{-1} P_{12}^T x_1 + x_1^T P_{12} P_{22}^{-1} P_{12}^T x_1 = x_1^T \left( P_{11} - P_{12} P_{22}^{-1} P_{12}^T \right) x_1, $$ which is exactly the formula in the book (the matrix in parentheses is the Schur complement of $P_{22}$ in $P$).
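As a quick numerical sanity check, here is a minimal sketch in numpy (the block sizes, seed, and variable names are arbitrary choices, not from the book) that compares the Schur-complement value against the objective evaluated at the analytic minimizer and at random points:

```python
import numpy as np

rng = np.random.default_rng(0)

n1, n2 = 3, 4  # arbitrary block sizes
# Build a random symmetric positive definite P and partition it into blocks.
A = rng.standard_normal((n1 + n2, n1 + n2))
P = A @ A.T + (n1 + n2) * np.eye(n1 + n2)
P11, P12, P22 = P[:n1, :n1], P[:n1, n1:], P[n1:, n1:]

x1 = rng.standard_normal(n1)

# Analytic minimizer over x2 and the Schur-complement value of the objective.
x2_star = -np.linalg.solve(P22, P12.T @ x1)
val_analytic = x1 @ (P11 - P12 @ np.linalg.solve(P22, P12.T)) @ x1

# The original objective as a function of x2 (x1 held fixed).
def obj(x2):
    return x1 @ P11 @ x1 + 2 * x1 @ P12 @ x2 + x2 @ P22 @ x2

# The value at the stationary point matches the Schur-complement formula ...
assert np.isclose(obj(x2_star), val_analytic)
# ... and is no larger than the objective at random choices of x2.
assert all(obj(x2_star) <= obj(rng.standard_normal(n2)) for _ in range(100))
print("Schur-complement formula verified:", val_analytic)
```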