Least squares with non-negative eigenvalues


I am trying to use least squares to solve a problem of the form $$ u=-Kv $$ where $u$ and $v$ are $3$-dimensional vectors, and $K$ is a $3\times3$ matrix. I want to estimate $K$, given $u$ and $v$. I have multiple data for $u$ and $v$, set up with the hope that the different data points are sufficiently linearly independent that a unique solution exists.

The way I have proceeded is to set this up as a traditional least-squares problem $$ Ax=b, $$ where $b$ now stacks all the data I have for the $3$ elements of $u$, $A$ holds the corresponding data for $v$, and $K$ has been flattened into a vector $x$, where $x = (K_{11}, K_{12}, K_{13}, K_{21},\ldots, K_{33})$.
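To make the stacking concrete, here is a minimal sketch, assuming NumPy and assuming the sign in $u=-Kv$ is absorbed into $b$ (i.e. each sample contributes $K v_i = -u_i$, so $b$ stacks $-u_i$); the function name and data shapes are my own choices, not from the question:

```python
import numpy as np

def build_system(us, vs):
    """Stack each sample u_i = -K v_i into A x = b with x = vec(K), row-major.

    us, vs: arrays of shape (n, 3). Each sample contributes a 3x9 block
    kron(eye(3), v_i), so row j of the block picks out sum_k K[j,k] v_i[k].
    """
    A = np.vstack([np.kron(np.eye(3), v) for v in vs])  # shape (3n, 9)
    b = -np.asarray(us).ravel()                         # u = -K v  ->  K v = -u
    return A, b

# synthetic check: recover a known K from noiseless data
rng = np.random.default_rng(0)
K_true = rng.normal(size=(3, 3))
vs = rng.normal(size=(10, 3))
us = -vs @ K_true.T                     # u_i = -K v_i for each sample
A, b = build_system(us, vs)
x, *_ = np.linalg.lstsq(A, b, rcond=None)
K_est = x.reshape(3, 3)
```

With $n$ well-spread samples ($n\ge 3$ generically suffices here, since each sample contributes $3$ rows), the $3n\times 9$ system is overdetermined and `lstsq` returns the minimum-residual solution.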

Due to some physical reasons, I want to add an extra constraint. The constraint is that if I break the matrix $K$ into a symmetric and an antisymmetric part, $$ K=S+A, $$ then the eigenvalues of $S$ should be positive, or at least two of them should be. In particular, $S$ has $3$ eigenvalues, $(\lambda_1,\lambda_2,\lambda_3)$. In my problem $\lambda_1,\lambda_2\gg\lambda_3$, and I would like $\lambda_1,\lambda_2>0$.
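For reference, checking this condition on a fitted $K$ is straightforward: the symmetric part is $S=\tfrac12(K+K^\top)$, and since $S$ is symmetric its eigenvalues are real and can be taken with `eigvalsh`. A small sketch (function name mine):

```python
import numpy as np

def symmetric_part_eigs(K):
    """Eigenvalues of the symmetric part S = (K + K^T)/2, sorted descending."""
    S = 0.5 * (K + K.T)
    return np.sort(np.linalg.eigvalsh(S))[::-1]  # lambda_1 >= lambda_2 >= lambda_3

# example: the antisymmetric part drops out of S, so only the diagonal survives here
K = np.diag([2.0, 1.0, -3.0]) + np.array([[0., 1., 0.],
                                          [-1., 0., 0.],
                                          [0., 0., 0.]])
lam = symmetric_part_eigs(K)
ok = bool((lam[:2] > 0).all())   # the constraint: two largest eigenvalues positive
```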

The physical reason is that $\lambda_1,\lambda_2$ are representative of diffusivities, and negative diffusivities are usually unjustifiable on physical grounds.
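One way to fold such a requirement into the fit (this is a suggestion of mine, not part of the original setup) is to replace the plain least-squares objective with a penalized one: keep the residual $\|Ax-b\|^2$ and add a hinge penalty whenever $\lambda_1$ or $\lambda_2$ of the symmetric part goes negative. The `weight` knob and the warm start from the unconstrained solution are assumptions for the sketch:

```python
import numpy as np
from scipy.optimize import minimize

def fit_K_penalized(Amat, b, weight=1e3):
    """Least squares on Amat @ x = b plus a hinge penalty pushing the two
    largest eigenvalues of the symmetric part of K above zero.
    `weight` is an assumed tuning parameter, not from the original problem."""
    def cost(x):
        K = x.reshape(3, 3)
        lam = np.sort(np.linalg.eigvalsh(0.5 * (K + K.T)))[::-1]
        penalty = np.maximum(0.0, -lam[:2]).sum()     # hinge on lambda_1, lambda_2
        return np.sum((Amat @ x - b) ** 2) + weight * penalty
    x0, *_ = np.linalg.lstsq(Amat, b, rcond=None)     # warm start: unconstrained fit
    return minimize(cost, x0, method="BFGS").x.reshape(3, 3)

# synthetic check with a K whose symmetric part already has positive eigenvalues,
# so the penalized fit should agree with the unconstrained one
rng = np.random.default_rng(0)
K_true = np.diag([3.0, 2.0, 0.5]) + np.array([[0., 1., 0.],
                                              [-1., 0., 0.],
                                              [0., 0., 0.]])
vs = rng.normal(size=(12, 3))
us = -vs @ K_true.T
Amat = np.vstack([np.kron(np.eye(3), v) for v in vs])
b = -us.ravel()
K_fit = fit_K_penalized(Amat, b)
```

The hinge makes the objective non-smooth exactly at $\lambda=0$, so if the solution sits on that boundary a method tailored to non-smooth or semidefinite-constrained problems would be more robust than plain BFGS.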

In my current solution, where I apply no constraints, the eigenvalues end up positive almost everywhere (more than $80\%$ of the time). I suspect that the places where the values turn out negative are regions where the data in $u$ and $v$ (equivalently, in $A$ and $b$) are not sufficiently linearly independent (any suggestion of a good metric to check this would also be appreciated).
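On the metric question: a standard diagnostic for near linear dependence of the stacked data is the condition number of $A$, the ratio $\sigma_{\max}/\sigma_{\min}$ of its singular values; large values (or a tiny $\sigma_{\min}$) flag directions of $v$-space that the data barely constrain. A minimal sketch, assuming NumPy:

```python
import numpy as np

def conditioning_report(A):
    """Singular values of the stacked data matrix, plus the condition number
    sigma_max / sigma_min as an ill-conditioning diagnostic."""
    s = np.linalg.svd(A, compute_uv=False)   # singular values, descending
    return s, s[0] / s[-1]

# well-spread v samples -> modest condition number; duplicated or nearly
# parallel v samples would drive sigma_min toward zero and cond toward infinity
rng = np.random.default_rng(1)
vs = rng.normal(size=(10, 3))
A = np.vstack([np.kron(np.eye(3), v) for v in vs])
s, cond = conditioning_report(A)
```

Because each sample enters as `kron(eye(3), v_i)`, the conditioning of $A$ is inherited from the $n\times 3$ matrix of stacked $v$ samples, so checking either one gives the same answer.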