Is there any relation between the Hadamard product and the conventional matrix product? I'm trying to solve an optimization problem and I'm stuck here. Let $X, A, B, C$ be square matrices of the same dimensions, with $A, B, C$ known. How can I find $X$ if:
$$X\circ A + XB = C$$
I would appreciate any help.
Find the SVD of the first coefficient matrix $$A = USV^T = \sum_k \sigma_ku_kv_k^T$$ where $(u_k, v_k)$ are the $k^{th}$ columns of $(U,V)$ respectively, and $\sigma_k$ is the $k^{th}$ element on the diagonal of $S$.
From this decomposition generate two sets of diagonal matrices $$\eqalign{ E_k &= {\rm Diag}(u_k) \cr F_k &= {\rm Diag}(v_k) \cr }$$ The elementwise/Hadamard product can now be expressed as a sum $$A\circ X = \sum_k \sigma_kE_kXF_k$$ We are now in a position to vectorize the original equation $$\eqalign{ &IXB + \sum_k \sigma_kE_kXF_k = C \cr &\Big(B^T\otimes I + \sum_k \sigma_kF_k\otimes E_k\Big)\,{\rm vec}(X) = {\rm vec}(C) \cr &{\rm vec}(X) = \Big(B^T\otimes I + \sum_k \sigma_kF_k\otimes E_k\Big)^+\,{\rm vec}(C) \cr }$$ where $M^+$ indicates the pseudoinverse of $M$.
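The identity $A\circ X = \sum_k \sigma_kE_kXF_k$ is easy to verify numerically; here is a short Julia check (variable names are my own):

```julia
using LinearAlgebra

# Check the identity A∘X = Σ_k σ_k E_k X F_k, where
# E_k = Diag(u_k) and F_k = Diag(v_k) come from the SVD of A.
n = 4
A, X = randn(n, n), randn(n, n)
U, S, V = svd(A)
H = sum(S[k] * Diagonal(U[:, k]) * X * Diagonal(V[:, k]) for k in 1:n)
@assert H ≈ A .* X
```

Elementwise, $(E_kXF_k)_{ij} = u_{ki}\,X_{ij}\,v_{kj}$, so the sum weighted by $\sigma_k$ reproduces $A_{ij}X_{ij}$.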
De-vectorize this result, if you want a matrix solution.
Update
Some Julia code to test this solution method
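A sketch of such a test, following the vectorized formula above (function and variable names are my own):

```julia
using LinearAlgebra

# Build M = Bᵀ⊗I + Σ_k σ_k F_k⊗E_k and solve M·vec(X) = vec(C).
function hadamard_solve(A, B, C)
    n = size(A, 1)
    U, S, V = svd(A)
    M = kron(transpose(B), Matrix(I, n, n))
    for k in 1:n
        M += S[k] * kron(Diagonal(V[:, k]), Diagonal(U[:, k]))
    end
    reshape(pinv(M) * vec(C), n, n)   # de-vectorize
end

# Quick test: pick X, form C, then recover X.
n = 5
A, B, X = randn(n, n), randn(n, n), randn(n, n)
C = A .* X + X * B
@assert hadamard_solve(A, B, C) ≈ X
```

With random matrices the coefficient matrix is almost surely invertible, so the pseudoinverse recovers $X$ exactly; for singular systems it returns the least-squares/minimum-norm solution.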
Update 2
By using sparse versions of the operators [notably speye() and spdiagm()] in the Kronecker step, Julia can solve moderately large systems with this method.
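A sparse sketch of the same idea (in current Julia the old `speye()` is written `sparse(1.0I, n, n)` and `spdiagm` takes a `0 => v` pair; names here are my own):

```julia
using LinearAlgebra, SparseArrays

# Sparse variant: the F_k⊗E_k terms are diagonal, and Bᵀ⊗I is
# block-diagonal, so sparse Kronecker products save memory.
function hadamard_solve_sparse(A, B, C)
    n = size(A, 1)
    U, S, V = svd(A)
    M = kron(sparse(transpose(B)), sparse(1.0I, n, n))
    for k in 1:n
        M += S[k] * kron(spdiagm(0 => V[:, k]), spdiagm(0 => U[:, k]))
    end
    reshape(M \ vec(C), n, n)
end
```

The `\` solve on a sparse matrix uses a sparse LU factorization, which is what makes moderately large $n$ feasible.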
Update 3
There is a way to solve the problem without calculating large Kronecker products.
If you multiply a Hadamard product (from the left or right) by a vector from the standard basis, it distributes over the terms in the product like so $$\eqalign{ e_k^T(A\circ B) &= (e_k^TA)\circ(e_k^TB) \cr (A\circ B)e_k &= (Ae_k)\circ(Be_k) \cr }$$ We can use this to solve for the $X$ matrix, one row at a time $$\eqalign{ e_k^T(X\circ A) + e_k^T(XB) &= e_k^TC \cr (e_k^TX)\circ(e_k^TA) + (e_k^TX)B &= e_k^TC \cr x_k^T\Big({\rm Diag}(a_k^T)+B\Big) &= c_k^T \cr x_k^T &= c_k^T\Big({\rm Diag}(a_k^T)+B\Big)^+ \cr }$$ The Julia code is simpler as well.
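A sketch of that row-by-row solve, transposing the row equation $x_k^T({\rm Diag}(a_k^T)+B) = c_k^T$ into a column solve (names are my own):

```julia
using LinearAlgebra

# Solve for X one row at a time: x_kᵀ = c_kᵀ (Diag(a_kᵀ) + B)⁺,
# i.e. x_k = pinv(M)ᵀ c_k with M = Diag(A[k,:]) + B.
function hadamard_solve_rows(A, B, C)
    n = size(A, 1)
    X = similar(C)
    for k in 1:n
        M = Diagonal(A[k, :]) + B
        X[k, :] = transpose(pinv(M)) * C[k, :]
    end
    X
end

n = 5
A, B, X = randn(n, n), randn(n, n), randn(n, n)
C = A .* X + X * B
@assert hadamard_solve_rows(A, B, C) ≈ X
```

Each row costs one $n\times n$ pseudoinverse, so the whole solve is $O(n^4)$ instead of the $O(n^6)$ of the dense Kronecker system.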