I'm looking more for hints than specific answers, although I would be extremely grateful if provided with one.
The problem I have is as follows: $$ -\Sigma (A+\Lambda_1)+I=0 $$
Here $A$ is a constant, symmetric, positive-definite matrix, and $\Lambda_1$ is a positive diagonal matrix with entries $(\Lambda_1)_{ii}=\exp\left(-\dfrac 1 2 \dfrac{\mu_i}{1+\Sigma_{ii}}\right)$, where each $\mu_i$ is a known constant.
If it helps, assume that I can linearise $\Lambda_1\approx A_2+A_3\Sigma_D$, where $A_2$ and $A_3$ are again constant (diagonal) matrices and $\Sigma_D$ is the diagonal part of $\Sigma$.
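(For concreteness, one way such a linearisation arises is a first-order Taylor expansion of each diagonal entry about a reference value $\sigma_i^{(0)}$; the reference point is my choice and not specified above. Writing $f_i(\sigma)=\exp\left(-\dfrac{\mu_i}{2(1+\sigma)}\right)$, so that $f_i'(\sigma)=\dfrac{\mu_i}{2(1+\sigma)^2}\,f_i(\sigma)$, one gets
$$
(\Lambda_1)_{ii} \approx \underbrace{f_i(\sigma_i^{(0)})-f_i'(\sigma_i^{(0)})\,\sigma_i^{(0)}}_{(A_2)_{ii}} \;+\; \underbrace{f_i'(\sigma_i^{(0)})}_{(A_3)_{ii}}\,\Sigma_{ii}.
$$
)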
My first inclination was to solve this iteratively: start with $\Sigma=A^{-1}$, plug its diagonal entries into $\Lambda_1$, update $\Sigma=(A+\Lambda_\text{new})^{-1}$, and keep iterating.
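To make the scheme concrete, here is a minimal sketch of that fixed-point iteration in Python/NumPy. The matrix $A$ and the vector $\mu$ below are made-up illustrative data, not from the actual problem, and the tolerance/iteration cap are arbitrary choices:

```python
import numpy as np

def solve_fixed_point(A, mu, tol=1e-10, max_iter=500):
    """Iterate Sigma <- (A + Lambda_1(Sigma))^{-1}, where Lambda_1 is
    diagonal with entries exp(-0.5 * mu_i / (1 + Sigma_ii))."""
    n = A.shape[0]
    Sigma = np.linalg.inv(A)  # initial guess: Lambda_1 = 0
    for k in range(max_iter):
        lam = np.exp(-0.5 * mu / (1.0 + np.diag(Sigma)))
        Sigma_new = np.linalg.inv(A + np.diag(lam))
        if np.max(np.abs(Sigma_new - Sigma)) < tol:
            return Sigma_new, k + 1
        Sigma = Sigma_new
    return Sigma, max_iter

# Illustrative example: a small random SPD matrix and arbitrary mu
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B @ B.T + 4 * np.eye(4)  # symmetric positive definite
mu = np.array([1.0, 2.0, 0.5, 3.0])

Sigma, iters = solve_fixed_point(A, mu)

# Check the residual of -Sigma (A + Lambda_1) + I = 0
lam = np.exp(-0.5 * mu / (1.0 + np.diag(Sigma)))
residual = np.max(np.abs(Sigma @ (A + np.diag(lam)) - np.eye(4)))
```

Note that since $A$ is positive definite and each $(\Lambda_1)_{ii}\in(0,1)$, the updated $\Sigma$ stays symmetric positive definite at every step, so the diagonal entries fed back into $\Lambda_1$ are always positive.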
Is there a better method to do this? Newton's method? Will my iterative method converge? Any other ideas altogether?