I'm performing an optimization in which a first-order condition requires solving an equation in square, say $n\times n$, matrices of the following form:
$$ I \circ [\Lambda - R^{-1}\Lambda^{-1}S] = 0, $$
where $\circ$ denotes the elementwise product, i.e. I need to set the diagonal of the expression in brackets to zero. Assume $\Lambda$ is a diagonal matrix, $R$ is positive definite and $S$ is positive semi-definite (known to be of less than full rank).
Starting from $\Lambda^0 = I$, I tried the iteration $\Lambda^{k+1} = I \circ (R^{-1}[\Lambda^{k}]^{-1}S)$ for $k = 0, 1, 2, \dots$, but did not get convergence.
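For concreteness, the scheme I tried looks like this in NumPy; the matrices below are random placeholder data satisfying the stated assumptions, not my actual problem:

```python
import numpy as np

# Random placeholder data (assumptions: R positive definite,
# S positive semi-definite of rank n-1).
rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
R = A @ A.T + n * np.eye(n)
B = rng.standard_normal((n, n - 1))
S = B @ B.T

Rinv = np.linalg.inv(R)
Lam = np.eye(n)                              # Lambda^0 = I
for k in range(5):
    # Lambda^{k+1} = I o (R^{-1} [Lambda^k]^{-1} S): keep only the diagonal
    Lam_next = np.diag(np.diag(Rinv @ np.linalg.inv(Lam) @ S))
    print(k, np.linalg.norm(Lam_next - Lam))  # track the change between iterates
    Lam = Lam_next
```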
I wonder whether there is a closed-form solution, or a theorem about the convergence of such an iterative scheme that I could check.
I assume that the unknown is the matrix $\Lambda$. Write $\Lambda=\operatorname{diag}(1/a_1,\cdots,1/a_n)$ where $a_i\not= 0$, $R^{-1}=[r_{i,j}]$, $S=[s_{i,j}]$. The condition that the diagonal of $\Lambda - R^{-1}\Lambda^{-1}S$ vanish reads, componentwise, $1/a_i=\sum_j r_{i,j}a_js_{j,i}$ for each $i$, i.e. the matricial equation
$$[1/a_1,\cdots,1/a_n]^T=U[a_1,\cdots,a_n]^T,$$
where $U=[u_{i,j}]$ with $u_{i,j}=r_{i,j}s_{j,i}$; since $S$ is symmetric, $U=R^{-1}\circ S$, a symmetric positive semi-definite matrix. Generically there are $2^n$ solutions in $\mathbb{C}^n$; I have an example with $n=3$ which admits $8$ solutions in $\mathbb{R}^3$.
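As a sanity check of this reduction, one can verify numerically that the diagonal of $\Lambda - R^{-1}\Lambda^{-1}S$ equals $1/a_i - (Ua)_i$ componentwise; the data below are random and hypothetical:

```python
import numpy as np

# Hypothetical data: R positive definite, S positive semi-definite of rank n-1.
rng = np.random.default_rng(1)
n = 4
A = rng.standard_normal((n, n))
R = A @ A.T + n * np.eye(n)
B = rng.standard_normal((n, n - 1))
S = B @ B.T

Rinv = np.linalg.inv(R)
U = Rinv * S                          # elementwise: u_ij = r_ij s_ji (S symmetric)

a = rng.uniform(0.5, 2.0, n)          # arbitrary nonzero a_i = 1/lambda_i
Lam = np.diag(1.0 / a)

lhs = np.diag(Lam - Rinv @ np.linalg.inv(Lam) @ S)   # diagonal of the bracket
rhs = 1.0 / a - U @ a                                 # componentwise reduction
print(np.allclose(lhs, rhs))          # prints True
```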
Numerically: let $X_k=[x_{k,1},\cdots,x_{k,n}]^T$ and $Y_k=[1/x_{k,1},\cdots,1/x_{k,n}]^T$. Newton's method applied to $F(X)=UX-Y$ gives
$$X_{k+1}=X_k-{Z_k}^{-1}(UX_k-Y_k),\quad\text{where } Z_k=U+\operatorname{diag}({x_{k,1}}^{-2},\cdots,{x_{k,n}}^{-2})$$
is the Jacobian of $F$ at $X_k$.
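A minimal NumPy sketch of this Newton iteration, on random hypothetical data ($R$ positive definite, $S$ positive semi-definite of rank $n-1$); the backtracking that keeps the iterates positive is my own safeguard, not part of the scheme above:

```python
import numpy as np

# Hypothetical test data, not from the question:
# R positive definite, S positive semi-definite of rank n-1.
rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
R = A @ A.T + n * np.eye(n)
B = rng.standard_normal((n, n - 1))
S = B @ B.T

U = np.linalg.inv(R) * S              # elementwise: u_ij = r_ij s_ji (S symmetric)

def newton(x, tol=1e-12, maxit=200):
    """Newton's method for F(x) = U x - 1/x = 0 (1/x taken componentwise)."""
    for _ in range(maxit):
        F = U @ x - 1.0 / x
        if np.linalg.norm(F) < tol:
            break
        Z = U + np.diag(1.0 / x**2)   # Jacobian of F at x
        step = np.linalg.solve(Z, F)
        t = 1.0
        while np.any(x - t * step <= 0):
            t /= 2                     # halve the step to keep x > 0
        x = x - t * step
    return x

x = newton(np.ones(n))
Lam = np.diag(1.0 / x)                 # Lambda = diag(1/a_1, ..., 1/a_n)
# Residual of the original equation: diagonal of Lambda - R^{-1} Lambda^{-1} S
resid = np.diag(Lam - np.linalg.inv(R) @ np.diag(x) @ S)
print(np.max(np.abs(resid)))
```

Restricting to $x > 0$ picks out one of the $2^n$ sign patterns; other real solutions can be reached from other starting points.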