I will keep it brief. I have a minimization problem that I think might have an explicit solution. Ideally I would like to solve \begin{equation} \begin{bmatrix}1&a&a&\dots&a\\a&1&b&\dots&b\\a&b&1&\dots&c\\ \vdots&\vdots&\vdots&\ddots&\vdots\\ a&b&c&\dots&1\end{bmatrix}\Sigma=\mathbf{0} \end{equation} subject to \begin{equation} a,b,c,\dots>0, \end{equation} where $\mathbf{0}$ is the zero matrix and $\Sigma$ is a given symmetric, invertible matrix with positive diagonal entries and negative off-diagonal entries. (Since $\Sigma$ is invertible, exact equality is impossible, so the real goal is to make the product as close to zero as possible.)
Is there some way to get an explicit regression for $a,b,c,\dots$ that minimizes the product, say in the Frobenius norm?
My current method of 'solving' this is to solve for $a$, then solve for $b$, and so on. However, as one might expect, there is no exact solution to the problem (given the constraints), so I have left a few of the equations unsolved (about 3 out of 291). I then fed that solution into a derivative-free minimization procedure to tighten it up a bit, but it is incredibly slow.
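One observation that might help: each entry of the product is *linear* in $a,b,c,\dots$, since the left matrix can be written as $I + \sum_k x_k E_k$ for fixed 0/1 pattern matrices $E_k$. Minimizing the Frobenius norm of the product subject to the sign constraints is then a non-negative least-squares problem, which is solvable directly rather than coordinate by coordinate. A minimal sketch of that idea (my own formulation, not the asker's code; it enforces $x_k \ge 0$ rather than strictly $> 0$, using `scipy.optimize.nnls`):

```python
import numpy as np
from scipy.optimize import nnls

def pattern_matrices(n):
    """E_k for k = 0..n-2: E_k[i, j] = 1 when min(i, j) == k and i != j.

    E_0 marks the positions of a, E_1 those of b, and so on, matching
    the bordered structure of the matrix in the question.
    """
    mats = []
    for k in range(n - 1):
        E = np.zeros((n, n))
        E[k, k + 1:] = 1.0  # rest of row k
        E[k + 1:, k] = 1.0  # rest of column k
        mats.append(E)
    return mats

def fit_parameters(Sigma):
    """Minimize ||(I + sum_k x_k E_k) @ Sigma||_F over x >= 0.

    Writing the product as Sigma + sum_k x_k (E_k @ Sigma) and flattening,
    this is an ordinary non-negative least-squares problem in x.
    """
    n = Sigma.shape[0]
    Es = pattern_matrices(n)
    # Design matrix: one flattened E_k @ Sigma per parameter.
    A = np.column_stack([(E @ Sigma).ravel() for E in Es])
    b = -Sigma.ravel()        # constant term moved to the right-hand side
    x, residual = nnls(A, b)  # x >= 0 minimizing ||A x - b||_2
    return x, residual
```

If strict positivity matters, one could also use `scipy.optimize.lsq_linear` with small positive lower bounds; the linear-in-parameters structure is the main point.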
Thank you!