Constraint optimization problem with an unusual constraint


I am struggling to solve the following minimization problem:

$F=||X-(A+B)C^T||_F^2$

s.t. $A⊙B=0$

where ⊙ denotes the Hadamard (element-wise) product. My approach is as follows:

First, form the Lagrangian of $F$:

$L=||X-(A+B)C^T||_F^2 + Tr(\Lambda(A⊙B)^T)$

where $\Lambda$ is the matrix of Lagrange multipliers.

Next, compute the partial derivatives with respect to $A$, $B$, $C$, and $\Lambda$:

$\frac{dL}{dC}=-2X^TA - 2X^TB + 2CA^TA + 2CB^TB + 2CA^TB + 2CB^TA$

$\frac{dL}{dA}=-2XC + 2AC^TC + 2BC^TC + \Lambda⊙ B$

$\frac{dL}{dB}=-2XC + 2BC^TC + 2AC^TC + \Lambda⊙ A$

$\frac{dL}{d\Lambda}=A⊙B$
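As a sanity check, the derivative of $L$ with respect to $A$ can be verified against central finite differences. The shapes and random data below are arbitrary assumptions, purely for illustration:

```python
import numpy as np

# Arbitrary small shapes and random data (assumptions, not from the problem).
rng = np.random.default_rng(0)
m, n, k = 5, 4, 3
X = rng.standard_normal((m, n))
A = rng.standard_normal((m, k))
B = rng.standard_normal((m, k))
C = rng.standard_normal((n, k))
Lam = rng.standard_normal((m, k))  # Lagrange multipliers

def L(A, B, C, Lam):
    # L = ||X - (A+B)C^T||_F^2 + Tr(Lam (A ⊙ B)^T)
    R = X - (A + B) @ C.T
    return np.sum(R * R) + np.sum(Lam * (A * B))

# dL/dA as derived above
gA = -2 * X @ C + 2 * (A + B) @ (C.T @ C) + Lam * B

# Central finite-difference approximation of dL/dA
eps = 1e-6
num = np.zeros_like(A)
for i in range(m):
    for j in range(k):
        E = np.zeros_like(A)
        E[i, j] = eps
        num[i, j] = (L(A + E, B, C, Lam) - L(A - E, B, C, Lam)) / (2 * eps)

print(np.max(np.abs(num - gA)))  # should be close to zero
```

The same check applies symmetrically to $dL/dB$ and $dL/dC$.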

Finally, I use gradient descent on these derivatives to solve the optimization problem.
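A minimal NumPy sketch of such a scheme is shown below. The shapes, random data, step size, and the choice to update the multipliers by gradient *ascent* on $\Lambda$ (the usual primal-dual convention) are all assumptions for illustration, not part of the original problem:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, k = 5, 4, 3
X = rng.standard_normal((m, n))
A = rng.standard_normal((m, k))
B = rng.standard_normal((m, k))
C = rng.standard_normal((n, k))
Lam = np.zeros((m, k))  # Lagrange multipliers
eta = 1e-3              # step size (assumed)

for it in range(100):
    CtC = C.T @ C
    gA = -2 * X @ C + 2 * (A + B) @ CtC + Lam * B
    gB = -2 * X @ C + 2 * (A + B) @ CtC + Lam * A
    gC = -2 * X.T @ (A + B) + 2 * C @ ((A + B).T @ (A + B))
    # Descent on the primal variables ...
    A, B, C = A - eta * gA, B - eta * gB, C - eta * gC
    # ... and ascent on the multipliers (dL/dLam = A ⊙ B)
    Lam = Lam + eta * (A * B)

print(np.linalg.norm(A * B))  # residual constraint violation after 100 steps
```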

However, the problem is that the values of $A$, $B$, and $\Lambda$ become very large after a few iterations. All entries of $\Lambda(A⊙B)^T$ in $L$ become non-positive, because one of the corresponding elements of $A$, $B$, or $\Lambda$ is negative. In other words, $Tr(\Lambda(A⊙B)^T)$ keeps decreasing while $||X-(A+B)C^T||_F^2$ keeps increasing, in such a way that $L$ still decreases.

I really appreciate any help.