A convex optimization problem


Assume that $R(k_1,k_2)$ is a $3 \times 3$ positive definite matrix for $k_1 = 0,1$, $k_2 = 0,1$; $L(k_1,k_2)$ is a $3 \times 1$ vector for $k_1 = 0,1$, $k_2 = 0,1$; $z$ is a scalar; and $\alpha_i(k) \in (0,1)$ for $i=1,2$, $k=0,1$. We want to find $x$ and functions $f_1$ and $f_2$ that solve the following optimization problem:

\begin{align} \min_{x, f_1, f_2} \sum_{y_1 \in \{0,1\}} \sum_{y_2 \in \{0,1\}} \alpha_1(y_1) \alpha_2(y_2) \Big( \begin{bmatrix} x &f_1(y_1) & f_2(y_2) \end{bmatrix} R(y_1,y_2) \begin{bmatrix} x \\f_1(y_1) \\ f_2(y_2) \end{bmatrix} + \begin{bmatrix} x &f_1(y_1) & f_2(y_2) \end{bmatrix} L(y_1,y_2) z \Big). \end{align}

The minimizing $x$, $f_1$, and $f_2$ will depend on the parameter $z$. Any ideas for solving this problem?

The function we are optimizing over is the sum of four terms, each of which is strictly convex in the three variables it actually involves (since each $R(y_1,y_2)$ is positive definite and each $\alpha_i(k) > 0$). If we define $s_1^0 = f_1(0)$, $s_1^1 = f_1(1)$, $s_2^0 = f_2(0)$, and $s_2^1 = f_2(1)$, then we can rewrite the problem as an optimization over $x, s_1^0, s_1^1, s_2^0, s_2^1$ as follows, and the objective should be a strictly convex function of these five variables. Right?

\begin{align} \min_{x,s_1^0,s_1^1, s_2^0, s_2^1} \Bigg[ &\alpha_1(0) \alpha_2(0) \Big( \begin{bmatrix} x &s_1^0 & s_2^0 \end{bmatrix} R(0,0) \begin{bmatrix} x \\s_1^0 \\ s_2^0 \end{bmatrix} + \begin{bmatrix} x &s_1^0 & s_2^0 \end{bmatrix} L(0,0) z \Big) \\ + &\alpha_1(0) \alpha_2(1) \Big( \begin{bmatrix} x &s_1^0 & s_2^1 \end{bmatrix} R(0,1) \begin{bmatrix} x \\s_1^0 \\ s_2^1 \end{bmatrix} + \begin{bmatrix} x &s_1^0 & s_2^1 \end{bmatrix} L(0,1) z \Big) \\ + &\alpha_1(1) \alpha_2(0) \Big( \begin{bmatrix} x &s_1^1 & s_2^0 \end{bmatrix} R(1,0) \begin{bmatrix} x \\s_1^1 \\ s_2^0 \end{bmatrix} + \begin{bmatrix} x &s_1^1 & s_2^0 \end{bmatrix} L(1,0) z \Big) \\ + &\alpha_1(1) \alpha_2(1) \Big( \begin{bmatrix} x &s_1^1 & s_2^1 \end{bmatrix} R(1,1) \begin{bmatrix} x \\s_1^1 \\ s_2^1 \end{bmatrix} + \begin{bmatrix} x &s_1^1 & s_2^1 \end{bmatrix} L(1,1) z \Big) \Bigg] \end{align}
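One way to check the joint strict convexity directly: evaluated at an increment $(\Delta x, \Delta s_1^0, \Delta s_1^1, \Delta s_2^0, \Delta s_2^1)$, the quadratic part of the objective is

\begin{align} \sum_{y_1 \in \{0,1\}} \sum_{y_2 \in \{0,1\}} \alpha_1(y_1) \alpha_2(y_2) \begin{bmatrix} \Delta x & \Delta s_1^{y_1} & \Delta s_2^{y_2} \end{bmatrix} R(y_1,y_2) \begin{bmatrix} \Delta x \\ \Delta s_1^{y_1} \\ \Delta s_2^{y_2} \end{bmatrix} \ge 0, \end{align}

with equality only when $(\Delta x, \Delta s_1^{y_1}, \Delta s_2^{y_2}) = 0$ for every pair $(y_1,y_2)$, i.e. only when all five increments vanish. So the sum is strictly convex in the five stacked variables, even though each individual term, viewed as a function of all five variables, is only convex.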

This optimization problem can be written compactly as

\begin{align} \min_{x,s_1^0,s_1^1, s_2^0, s_2^1} \Big( \begin{bmatrix} x &s_1^0 &s_1^1 &s_2^0 &s_2^1 \end{bmatrix} Q \begin{bmatrix} x \\ s_1^0 \\ s_1^1 \\ s_2^0 \\ s_2^1 \end{bmatrix} + \begin{bmatrix} x &s_1^0 &s_1^1 &s_2^0 &s_2^1 \end{bmatrix} P z \Big) \end{align} where $Q$ is a $5 \times 5$ matrix and $P$ is a $5 \times 1$ vector. Since the objective of the previous optimization problem is a strictly convex function of $x,s_1^0,s_1^1, s_2^0, s_2^1$, this one should be too. Right? Does this mean that $Q$ is positive definite and therefore invertible?
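For concreteness, $Q$ and $P$ can be assembled as follows. Let $s = \begin{bmatrix} x & s_1^0 & s_1^1 & s_2^0 & s_2^1 \end{bmatrix}^T$ and let $E(y_1,y_2)$ be the $3 \times 5$ selection matrix satisfying $E(y_1,y_2)\, s = \begin{bmatrix} x & s_1^{y_1} & s_2^{y_2} \end{bmatrix}^T$. Then

\begin{align} Q = \sum_{y_1,y_2 \in \{0,1\}} \alpha_1(y_1) \alpha_2(y_2)\, E(y_1,y_2)^T R(y_1,y_2) E(y_1,y_2), \qquad P = \sum_{y_1,y_2 \in \{0,1\}} \alpha_1(y_1) \alpha_2(y_2)\, E(y_1,y_2)^T L(y_1,y_2). \end{align}

Assuming each $R(y_1,y_2)$ is symmetric (as is usual when positive definiteness is asserted), $Q$ is symmetric, and by the strict-convexity argument above it is positive definite and hence invertible. Setting the gradient $2Qs + Pz$ to zero then gives the unique minimizer $s^* = -\tfrac{1}{2} Q^{-1} P z$, which is linear in the parameter $z$.

Below is a minimal numeric sketch of this construction. The random $R$, $L$, $\alpha$, and $z$ values are placeholders, not problem data; the sketch just builds $Q$ and $P$, confirms $Q$ is positive definite, and recovers the minimizer.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder data: R[(y1, y2)] is 3x3 symmetric PD, L[(y1, y2)] is length-3,
# alpha_i(k) in (0, 1), z a scalar.
def random_pd(n):
    A = rng.standard_normal((n, n))
    return A @ A.T + n * np.eye(n)  # symmetric positive definite by construction

R = {(y1, y2): random_pd(3) for y1 in (0, 1) for y2 in (0, 1)}
L = {(y1, y2): rng.standard_normal(3) for y1 in (0, 1) for y2 in (0, 1)}
alpha = {1: {0: 0.3, 1: 0.7}, 2: {0: 0.6, 1: 0.4}}
z = 2.5

# E(y1, y2) picks (x, s1^{y1}, s2^{y2}) out of s = (x, s1^0, s1^1, s2^0, s2^1).
def E(y1, y2):
    M = np.zeros((3, 5))
    M[0, 0] = 1.0        # x is component 0
    M[1, 1 + y1] = 1.0   # s1^{y1} is component 1 or 2
    M[2, 3 + y2] = 1.0   # s2^{y2} is component 3 or 4
    return M

pairs = [(y1, y2) for y1 in (0, 1) for y2 in (0, 1)]
Q = sum(alpha[1][y1] * alpha[2][y2] * E(y1, y2).T @ R[(y1, y2)] @ E(y1, y2)
        for y1, y2 in pairs)
P = sum(alpha[1][y1] * alpha[2][y2] * E(y1, y2).T @ L[(y1, y2)]
        for y1, y2 in pairs)

print("Q positive definite:", bool(np.all(np.linalg.eigvalsh(Q) > 0)))

# Unique minimizer of s^T Q s + s^T P z: solve the stationarity condition 2 Q s + P z = 0.
s_star = -0.5 * np.linalg.solve(Q, P) * z
print("minimizer (x, s1^0, s1^1, s2^0, s2^1):", s_star)

# Sanity check: the objective at s_star is below the objective at a nearby point.
def objective(s):
    return sum(alpha[1][y1] * alpha[2][y2] *
               (s @ E(y1, y2).T @ R[(y1, y2)] @ E(y1, y2) @ s
                + s @ E(y1, y2).T @ L[(y1, y2)] * z)
               for y1, y2 in pairs)

assert objective(s_star) <= objective(s_star + 1e-3 * rng.standard_normal(5))
```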