Upper bound for a pair of positive semi-definite matrices


I have a pair of positive semi-definite matrices $\mathbf{A}_1$ and $\mathbf{A}_2$ and would like to compute a diagonal matrix $\mathbf{S}$ such that $\mathbf{x}^T \mathbf{S} \mathbf{x} \geq \mathbf{x}^T \mathbf{A}_1 \mathbf{x}$ and $\mathbf{x}^T \mathbf{S} \mathbf{x} \geq \mathbf{x}^T \mathbf{A}_2 \mathbf{x}$ for all $\mathbf{x}$. $\mathbf{A}_1$ and $\mathbf{A}_2$ are already close to being diagonal matrices themselves but not quite.

This problem can be visualized as a pair of ellipsoidal scalar fields that are close to being aligned with the coordinate axes and need to be bounded by an ellipsoidal scalar field that is aligned with the coordinate axes. Since the matrices are only positive semi-definite, the scalar fields could be planar or cylindrical as well.

Any help appreciated.

Edit #1: I need the smallest $\mathbf{S}$ that bounds the two matrices. Allow me to simplify the problem. It suffices to solve this for a single matrix. So what I'm looking for is the smallest diagonal matrix $\mathbf{D}$ that bounds a positive semi-definite matrix $\mathbf{A}$. Thus, $\mathbf{x}^T \mathbf{D} \mathbf{x} \geq \mathbf{x}^T \mathbf{A} \mathbf{x}$ for all $\mathbf{x}$. Then, for $\mathbf{D}_1$, $\mathbf{D}_2$, bounding resp. $\mathbf{A}_1$, $\mathbf{A}_2$, I can simply take the component-wise maximum of $\mathbf{D}_1$ and $\mathbf{D}_2$ as $\mathbf{S}$.
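The reduction in Edit #1 can be sketched in a few lines (Python for illustration; the helper name is mine, and diagonals are represented as plain lists). If $\mathbf{S} \geq \mathbf{D}_1$ and $\mathbf{S} \geq \mathbf{D}_2$ entrywise on the diagonal, then $\mathbf{S} - \mathbf{D}_k$ is diagonal with non-negative entries, hence positive semi-definite, so $\mathbf{S}$ bounds both $\mathbf{A}_1$ and $\mathbf{A}_2$:

```python
def combine_bounds(d1, d2):
    # S = component-wise maximum of the diagonals of D1 and D2.
    # S - D_k is diagonal with non-negative entries, hence PSD,
    # so S >= D_k >= A_k in the positive semi-definite order.
    return [max(a, b) for a, b in zip(d1, d2)]

s = combine_bounds([4.7, 3.6, 2.3], [2.0, 5.0, 2.3])
```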

Edit #2: I forgot to mention that I'm looking for a solution for $3 \times 3$ matrices. For $\mathbf{D}$ to be the smallest diagonal matrix that bounds $\mathbf{A}$, we would need $\mathbf{M} = \mathbf{D} - \mathbf{A}$ to be the smallest possible positive semi-definite matrix, i.e. to have characteristic polynomial $P(\lambda) = \lambda^3$, so that zero is its only eigenvalue. For a $3 \times 3$ matrix, $P(\lambda) = \lambda^3 - Tr(\mathbf{M})\,\lambda^2 + \tfrac{1}{2}\left(Tr(\mathbf{M})^2 - Tr(\mathbf{M}^2)\right)\lambda - \det(\mathbf{M})$, so we need $Tr(\mathbf{M}) = 0$, then $Tr(\mathbf{M}^2) = 0$ (using the first condition), and $\det(\mathbf{M}) = 0$. We have a linear, a quadratic, and a cubic equation in the three unknown diagonal entries of $\mathbf{D}$, so this should be solvable.

Edit #3: Actually, solving for characteristic polynomial $P(\lambda) = \lambda^3$ only has a solution if $\mathbf{A}$ is itself a diagonal matrix: since $\mathbf{M}$ is symmetric, $Tr(\mathbf{M}^2) = \sum_i \lambda_i^2 = 0$ forces $\mathbf{M} = 0$, i.e. $\mathbf{D} = \mathbf{A}$. That is not exactly what I'm shooting for.

Edit #4: If I'm able to do a spectral decomposition on $\mathbf{D} - \mathbf{A} = \mathbf{U} \mathbf{\Lambda} \mathbf{U}^T$, then I would simply be able to solve for a $\mathbf{D}$ that minimizes $\mathbf{\Lambda}$ s.t. $\mathbf{\Lambda} \geq 0$.

There are 2 best solutions below


It's a very special and simple case of semidefinite programming.

With the MATLAB toolbox YALMIP and a semidefinite programming solver (Mosek, SDPT3, SeDuMi, etc.), you would solve it as follows:

A1 = randn(2); A1 = A1*A1';          % random 2x2 positive semidefinite data
A2 = randn(2); A2 = A2*A2';
S = diag(sdpvar(2,1));               % diagonal decision variable
Model = [S >= A1, S >= A2];          % S - A1 and S - A2 must be PSD
objective = trace(S);                % minimize trace as the notion of "smallest"
optimize(Model,objective)
x = sdpvar(2,1);
clf
plot(x'*A1*x <= 1)                   % sublevel set (ellipse) of A1
hold on
plot(x'*A2*x <= 1,[],[],[],sdpsettings('plot.shade',.5))
plot(x'*value(S)*x <= 1,[],'yellow',[],sdpsettings('plot.shade',.5))

It is not unlikely that there is an explicit analytical solution through some eigenvalue tricks, and there are most likely good approximations or even explicit solutions in the book *Ellipsoidal Calculus for Estimation and Control* by Kurzhanski.


After some help from Andre Gaschler (thanks, Andre), I found a solution that yields a tight upper bound, which might even be the smallest bounding diagonal matrix. Either way, the upper bound is tight enough for my purpose.

Since the sum of two positive semidefinite matrices is itself positive semidefinite and bounds each of its terms, we can eliminate an off-diagonal element $a_{ij}$ by adding a matrix that is zero except on the $2 \times 2$ principal submatrix on rows/columns $i, j$, where it equals $|a_{ij}| \begin{pmatrix} 1 & -1 \\ -1 & 1 \end{pmatrix}$ if $a_{ij} > 0$, or $|a_{ij}| \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix}$ if $a_{ij} < 0$; both are positive semidefinite.

After eliminating all three off-diagonal elements in this way, we get a diagonal matrix $\mathbf{D} = \mathrm{diag}(d_i)$ with $d_i = |a_{i1}| + |a_{i2}| + |a_{i3}|$. Phrased differently, each diagonal element is the Manhattan ($\ell_1$) length of the corresponding row (or column). Note that $a_{ii} \geq 0$ already holds when $\mathbf{A}$ is positive semidefinite. Equivalently, $\mathbf{D} - \mathbf{A}$ has diagonal entries $\sum_{j \neq i} |a_{ij}|$ and off-diagonal entries $-a_{ij}$, so it is weakly diagonally dominant with non-negative diagonal and hence positive semidefinite by Gershgorin's circle theorem.
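A minimal sketch of this construction (Python for illustration; the function names and the sample matrix are mine):

```python
def manhattan_diagonal_bound(A):
    # d_i = |a_i1| + |a_i2| + |a_i3|: the Manhattan (L1) length of row i
    return [sum(abs(a) for a in row) for row in A]

def quad_form(A, x):
    # x^T A x for a full symmetric matrix A
    n = len(x)
    return sum(A[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

# A symmetric PSD 3x3 matrix that is close to diagonal
A = [[ 4.0, 0.5, -0.2],
     [ 0.5, 3.0,  0.1],
     [-0.2, 0.1,  2.0]]

d = manhattan_diagonal_bound(A)   # approximately [4.7, 3.6, 2.3]
D = [[d[i] if i == j else 0.0 for j in range(3)] for i in range(3)]

# Spot-check x^T D x >= x^T A x on a few vectors
for x in [(1, 0, 0), (1, 1, 1), (1, -2, 3), (-0.5, 0.7, -0.1)]:
    assert quad_form(D, x) >= quad_form(A, x) - 1e-12
```

Of course, the assertions are only a spot check; the actual guarantee comes from the diagonal-dominance argument above.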

$\mathbf{D}$ should be a tight upper bound of $\mathbf{A}$ especially when $\mathbf{A}$ is already close to being diagonal. I haven't got a proof for $\mathbf{D}$ being the smallest diagonal matrix that bounds $\mathbf{A}$. Perhaps, someone can provide a proof or a counter example? Thanks for your input.