"Invert" a positive definite matrix in an inequality involving absolute value


$\newcommand\norm[1]{\lVert#1\rVert_\infty}$Suppose $A$ is a positive definite matrix, and $b$ is a strictly positive vector. (Actually, in our case, $b$ comprises square roots of the diagonal of $A$, i.e., $b_i=\sqrt{a_{ii}}$.)

We aim to describe the solutions of the system of inequalities $$ \operatorname{abs} Ax\leq b,\quad\text{i.e.,}\quad \Bigl\lvert \sum_j a_{ij} x_j\Bigr\rvert \leq b_i\; \forall i.$$

Obviously, this is equivalent to $-b\leq Ax\leq b$. However, I do not see how to apply $A^{-1}$ to such an inequality: multiplying by $A^{-1}$ need not preserve the componentwise order, since $A^{-1}$ may have negative entries. Is there any way to solve this system?
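For concreteness, the componentwise condition itself is cheap to test for any candidate $x$. A minimal sketch in Python/NumPy (the matrix `A` below is an arbitrary positive definite example of my own choosing, with $b_i=\sqrt{a_{ii}}$ as in the question):

```python
import numpy as np

def satisfies(A, b, x):
    """Check the componentwise condition |A x| <= b, i.e. -b <= A x <= b."""
    return bool(np.all(np.abs(A @ x) <= b))

# Example data: a 2x2 positive definite A with b_i = sqrt(a_ii).
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.sqrt(np.diag(A))          # b = (2, sqrt(3))

print(satisfies(A, b, np.array([0.0, 0.0])))   # x = 0 always satisfies it
print(satisfies(A, b, np.array([1.0, 0.0])))   # here A x = (4, 1), and 4 > 2
```

Note that $x=0$ is always a solution, so the solution set (a symmetric polytope, being the preimage of the box $[-b,b]$ under $A$) is never empty.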

Note that in the end we will search for solutions lying on a certain lattice, so even approaches that involve quite heavy computation are acceptable for us.


The best I have been able to obtain so far is the following necessary condition. We know that $\bigl\lvert \sum_j a_{ij} x_j\bigr\rvert \leq b_i\; \forall i$, hence also $\max_i \bigl\lvert \sum_j a_{ij} x_j\bigr\rvert \leq \max_i b_i$, i.e., $\norm{Ax}\leq\norm{b}$. From this we get $$ \norm{x} = \norm{A^{-1}Ax} \leq \norm{A^{-1}}\norm{Ax} \leq\norm{A^{-1}}\norm{b}. $$ This gives an upper bound on the coordinates of $x$; however, the bound is weak when $A$ is ill-conditioned, since $\norm{A^{-1}}$ is then large.
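Combined with the lattice restriction, this necessary condition already yields a finite search. A sketch under the assumption that the lattice is $\mathbb{Z}^n$ (the example matrix is mine, not from the question): compute $\norm{A^{-1}}\norm{b}$, where $\norm{A^{-1}}$ is the maximum absolute row sum of $A^{-1}$, then enumerate the integer points in the resulting box and keep those satisfying $\lvert Ax\rvert\leq b$:

```python
import numpy as np
from itertools import product

# Example data: a 2x2 positive definite A with b_i = sqrt(a_ii).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
b = np.sqrt(np.diag(A))

# ||A^{-1}||_inf is the maximum absolute row sum of A^{-1}.
A_inv = np.linalg.inv(A)
bound = np.max(np.sum(np.abs(A_inv), axis=1)) * np.max(b)

# Enumerate integer points with |x_i| <= bound and test |A x| <= b.
R = int(np.floor(bound))
solutions = [x for x in product(range(-R, R + 1), repeat=A.shape[0])
             if np.all(np.abs(A @ np.array(x, dtype=float)) <= b)]

print("norm bound:", bound)     # here ||A^{-1}||_inf = 1, so bound = sqrt(2)
print("integer solutions:", solutions)
```

For this example the survivors are $(-1,1)$, $(0,0)$, $(1,-1)$. The cost is $(2R+1)^n$ membership tests, which is exactly where ill-conditioning hurts: a large $\norm{A^{-1}}$ inflates $R$ and hence the enumeration box.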