Matrix Differentiation of Least Squares Form with Hadamard Product


Suppose that $A, B, X \in \mathbb{R}^{n\times n}$ and $X = xx^{T}$, where $x \in \mathbb{R}^{n \times 1}$. Let $\circ$ denote the Hadamard (entrywise) product and $\| \cdot\|_{F}$ the Frobenius norm of a matrix.

I want to find a vector $x$ that minimizes the cost function $$\begin{align} J(X) & = \left\| A - B \circ X\right\|^{2}_{F} + \lambda \left\| X \right\|^{2}_{F}\\ & = \left\| A - B \circ (xx^{T})\right\|^{2}_{F} + \lambda \left\| xx^{T} \right\|^{2}_{F}\\ & = \sum_{i=1}^{n} \sum_{j=1}^{n}\left(a_{ij} - b_{ij} x_i x_j \right)^2 + \lambda \sum_{i=1}^{n} \sum_{j=1}^{n} (x_i x_j)^2. \end{align}$$
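As a quick numerical sanity check (a sketch using NumPy; the matrix size and the value of $\lambda$ are arbitrary placeholders), the Frobenius-norm form and the entrywise double-sum form of $J$ agree:

```python
import numpy as np

rng = np.random.default_rng(0)
n, lam = 4, 0.1
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
x = rng.standard_normal(n)
X = np.outer(x, x)  # X = x x^T

# Frobenius-norm form of the cost
J_frob = np.linalg.norm(A - B * X, "fro")**2 + lam * np.linalg.norm(X, "fro")**2

# Entrywise double-sum form
J_sum = sum((A[i, j] - B[i, j] * x[i] * x[j])**2 + lam * (x[i] * x[j])**2
            for i in range(n) for j in range(n))

print(np.isclose(J_frob, J_sum))
```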

The next step, I think, is to differentiate $J$ with respect to each entry $x_k$, but I am not sure how to proceed: I want a single vector $x$, yet the problem is non-linear in $x$, and I do not know how to carry out the differentiation over the entries of the matrices. Thank you in advance.

Best Answer

The objective function is a function of the vector $x$, hence the gradient is taken with respect to $x$.

By separating the functions:

$$ J \left( x \right) = f \left( x \right) + \lambda g \left( x \right) $$

Then:

$$ \nabla f \left( x \right)_{k} = \sum_{i = k, j} -2 {b}_{ij} {x}_{j} \left( {a}_{ij} - {b}_{ij} {x}_{i} {x}_{j} \right) + \sum_{i, j = k} -2 {b}_{ij} {x}_{i} \left( {a}_{ij} - {b}_{ij} {x}_{i} {x}_{j} \right) $$
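The entrywise formula for $\nabla f$ can be checked against central finite differences (a sketch using NumPy; the matrix size and random data are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
x = rng.standard_normal(n)

def f(x):
    # f(x) = || A - B o (x x^T) ||_F^2
    return np.sum((A - B * np.outer(x, x))**2)

# Entrywise gradient formula
grad = np.zeros(n)
for k in range(n):
    for j in range(n):   # terms with i = k
        grad[k] += -2 * B[k, j] * x[j] * (A[k, j] - B[k, j] * x[k] * x[j])
    for i in range(n):   # terms with j = k
        grad[k] += -2 * B[i, k] * x[i] * (A[i, k] - B[i, k] * x[i] * x[k])

# Central finite differences for comparison
h = 1e-6
fd = np.zeros(n)
for k in range(n):
    e = np.zeros(n)
    e[k] = h
    fd[k] = (f(x + e) - f(x - e)) / (2 * h)

print(np.allclose(grad, fd, atol=1e-4))
```

Note that the diagonal term $(a_{kk} - b_{kk} x_k^2)^2$ is visited once by each sum, which together produce its full derivative $-4 b_{kk} x_k (a_{kk} - b_{kk} x_k^2)$.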

And

$$ \nabla g \left( x \right)_{k} = \sum_{i = k, j} 2 {x}_{j} \left( {x}_{i} {x}_{j} \right) + \sum_{i, j = k} 2 {x}_{i} \left( {x}_{i} {x}_{j} \right) $$

In total:

$$ \nabla J \left( x \right)_{k} = \nabla f \left( x \right)_{k} + \lambda \nabla g \left( x \right)_{k} $$

I'm pretty sure you can exploit the structure of the expressions above in order to vectorize the gradient calculation.
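One such vectorization (a sketch assuming NumPy): with the residual $R = A - B \circ (xx^{T})$, the sums collapse to $\nabla f(x) = -2\left(B \circ R + (B \circ R)^{T}\right)x$ and $\nabla g(x) = 4\|x\|^2 x$, which the snippet below checks against the entrywise formulas:

```python
import numpy as np

rng = np.random.default_rng(1)
n, lam = 5, 0.1
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
x = rng.standard_normal(n)

# Vectorized gradient: R = A - B o (x x^T)
R = A - B * np.outer(x, x)
grad_f = -2.0 * (B * R + (B * R).T) @ x   # ∇f(x)
grad_g = 4.0 * np.dot(x, x) * x           # ∇g(x) = 4 ||x||^2 x
grad_J = grad_f + lam * grad_g

# Entrywise sums, for comparison
grad_loop = np.zeros(n)
for k in range(n):
    for j in range(n):   # terms with i = k
        r = A[k, j] - B[k, j] * x[k] * x[j]
        grad_loop[k] += -2 * B[k, j] * x[j] * r + lam * 2 * x[j] * (x[k] * x[j])
    for i in range(n):   # terms with j = k
        r = A[i, k] - B[i, k] * x[i] * x[k]
        grad_loop[k] += -2 * B[i, k] * x[i] * r + lam * 2 * x[i] * (x[i] * x[k])

print(np.allclose(grad_J, grad_loop))
```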

You can find numerical validation in my StackExchange Mathematics Q2411200 GitHub Repository.

Update

I also used Gradient Descent to optimize the problem.
Here is the result:
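A minimal gradient-descent loop for this objective might look as follows (a sketch assuming NumPy; the step size, iteration count, and synthetic data are placeholders, not the values used for the plot):

```python
import numpy as np

rng = np.random.default_rng(2)
n, lam, step = 6, 0.1, 1e-3

# Synthetic data: build A from a known x_true so a good fit exists
x_true = rng.standard_normal(n)
B = rng.standard_normal((n, n))
A = B * np.outer(x_true, x_true)

def cost_and_grad(x):
    R = A - B * np.outer(x, x)
    J = np.sum(R**2) + lam * np.sum(np.outer(x, x)**2)
    grad = -2.0 * (B * R + (B * R).T) @ x + lam * 4.0 * np.dot(x, x) * x
    return J, grad

x = rng.standard_normal(n)
J0, _ = cost_and_grad(x)
for _ in range(2000):
    _, g = cost_and_grad(x)
    x -= step * g

print(cost_and_grad(x)[0] < J0)  # the cost decreased from its initial value
```

Since the problem is non-convex in $x$, a fixed-step descent like this only finds a local minimizer; the step size may need tuning (or a line search) for other data scales.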

[Figure: gradient-descent optimization result]