Matrix calculus for dot product


I'm trying to maximize the objective function below over the vector $X$.

$S \cdot X - \|BX - B_0\|_2 - \|X\|_2 $

Here $S \cdot X$ is the dot product between the vector $S$ (of known values) and the vector $X$, and $B$ is a matrix.

So obviously the $\|BX - B_0\|_2 + \|X\|_2$ part can be differentiated. Can the $S \cdot X$ part also be differentiated, so that I can hopefully get a closed-form solution (which would make my life easier than using an optimizer)? Or is there a way to reformulate the problem?

[EDIT] Thanks for all the comments; I get it now. But if I want to enforce some sparsity and make the function

$S \cdot X - \|BX - B_0\|_2 - \|X\|_1$, does it have a closed-form solution?

Best answer:

Let's use a naming convention where uppercase letters represent matrices, lowercase letters represent vectors, and Greek letters are scalars.

The elementwise sign function $\,g = {\rm sign}(x)\,$ can be used to expand the ${\tt1}$-norm.

The constituent functions and their derivatives are
$$\eqalign{
\alpha &= s^Tx \qquad&\partial_x\alpha = s \\
\beta &= \|Bx-b_0\|_2^2 \qquad&\partial_x\beta = 2B^T(Bx-b_0) \\
\lambda &= \|x\|_{1} = g^Tx \qquad&\partial_x\lambda = g \\
}$$
Put all of the pieces together to create the full function and its derivative.
$$\eqalign{
\phi &= \alpha - \beta - \lambda \\
\partial_x\phi &= \partial_x\alpha - \partial_x\beta - \partial_x\lambda \\
&= s - 2B^TBx + 2B^Tb_0 - g \\
}$$
Set the derivative to zero and solve for $x$.
$$\eqalign{
B^TBx &= B^Tb_0 +\tfrac 12(s-g) \\
x &= (B^TB)^{-1}\left(B^Tb_0 +\tfrac 12(s-g)\right) \\
}$$
There are two potential problems with this closed-form solution:
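As a quick sanity check, here is a NumPy sketch (with made-up data, and the sign vector $g$ held fixed) that solves the normal equations above and verifies that the gradient $s - 2B^T(Bx - b_0) - g$ vanishes at the computed $x$:

```python
import numpy as np

# Made-up problem data for illustration only.
rng = np.random.default_rng(0)
B = rng.standard_normal((5, 3))   # tall matrix, so B^T B is invertible a.s.
b0 = rng.standard_normal(5)
s = rng.standard_normal(3)
g = np.array([1.0, -1.0, 1.0])    # an assumed, fixed sign pattern for x

# Closed-form stationary point: B^T B x = B^T b0 + (s - g)/2
x = np.linalg.solve(B.T @ B, B.T @ b0 + 0.5 * (s - g))

# Gradient of phi = s^T x - ||Bx - b0||_2^2 - g^T x; should be ~0 at x
grad = s - 2 * B.T @ (B @ x - b0) - g
print(np.allclose(grad, 0))  # True
```

Note that this only confirms stationarity for a *given* $g$; in the actual problem $g = \mathrm{sign}(x)$ depends on the solution itself.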

  • $\|Bx-b_0\|_2\,$ was changed to $\,\|Bx-b_0\|_2^2$
  • $\,\partial_x\lambda\,$ does not exist when any component of $x$ equals zero
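Because $g = \mathrm{sign}(x)$ depends on the unknown $x$, one heuristic (a sketch only, with made-up data; this simple fixed-point iteration is not guaranteed to converge, and proximal methods such as ISTA are the standard tool for $1$-norm terms) is to alternate between solving the linear system and updating the sign pattern:

```python
import numpy as np

# Made-up problem data for illustration only.
rng = np.random.default_rng(1)
B = rng.standard_normal((6, 3))
b0 = rng.standard_normal(6)
s = rng.standard_normal(3)

M = B.T @ B
rhs0 = B.T @ b0
g = np.zeros(3)  # start from the smooth-only solution (g = 0)
for _ in range(50):
    # Solve B^T B x = B^T b0 + (s - g)/2 for the current sign pattern
    x = np.linalg.solve(M, rhs0 + 0.5 * (s - g))
    g_new = np.sign(x)
    if np.array_equal(g_new, g):  # sign pattern self-consistent: stop
        break
    g = g_new

print(x.shape)  # (3,)
```

If the loop exits via the `break`, the returned $x$ satisfies the stationarity condition with $g = \mathrm{sign}(x)$; otherwise a proper solver should be used.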