Deriving bounds on a parameter for the following determinant inequality


I would like to derive the least restrictive bounds on the parameter $\gamma \in \mathbb{R}$ that satisfy the following inequality:

$$ \det \big(I + A(\gamma)\big) \leq 1, $$

where

$$A (\gamma) = (e e^T) \big( x x^T - (x^T x) \cdot I \big) + \gamma \cdot (x x^T) \big( e e^T - (e^T e) \cdot I \big)$$

with $I$ the identity matrix of appropriate size and $x, e \in \mathbb{R}^{n \times 1}$ column vectors; thus the rank-one matrices $e e^T$ and $x x^T$ sit outside the parentheses, and the scalars $x^T x$ and $e^T e$ multiply $I$ inside.

On BEST ANSWER

The bounds on $\gamma$ depend on the sign of $$\sigma = \|e\|^2 \|x\|^2 - \left( e^T x \right)^2 - 1.$$ Note that $\sigma + 1 = \|e\|^2 \|x\|^2 - (e^T x)^2 \ge 0$ by the Cauchy–Schwarz inequality.

If $\sigma > 0$, then $\gamma \le \frac{1}{\sigma }$.

If $\sigma < 0$ (and $e$, $x$ are not parallel), then $\gamma \ge \frac{1}{\sigma }$.

If $\sigma = 0$, then the determinant equals zero regardless of the value of $\gamma$, so the inequality holds for every $\gamma$. (Likewise, if $e$ and $x$ are parallel the determinant equals $1$ for every $\gamma$, so no restriction is needed.)
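These cutoffs can be obtained from a closed form for the determinant. The following sketch (not part of the original answer) uses Sylvester's determinant identity $\det(I_n + UW) = \det(I_2 + WU)$ for $U \in \mathbb{R}^{n \times 2}$, $W \in \mathbb{R}^{2 \times n}$. Writing $a = e^T x$, $p = x^T x$, $q = e^T e$ and expanding the products in the definition of $A$,

$$A = a\, e x^T - p\, e e^T + \gamma a\, x e^T - \gamma q\, x x^T = U W, \qquad U = \begin{bmatrix} e & x \end{bmatrix}, \quad W = \begin{bmatrix} -p & a \\ \gamma a & -\gamma q \end{bmatrix} \begin{bmatrix} e^T \\ x^T \end{bmatrix},$$

and multiplying $W$ against $U$ via the Gram matrix $\begin{bmatrix} q & a \\ a & p \end{bmatrix}$ gives a diagonal $2 \times 2$ matrix:

$$WU = \begin{bmatrix} a^2 - pq & 0 \\ 0 & \gamma \left( a^2 - pq \right) \end{bmatrix}, \qquad \det\big(I + A(\gamma)\big) = (1 - s)(1 - \gamma s), \quad s = pq - a^2 = \sigma + 1.$$

For $s > 0$, the condition $(1 - s)(1 - \gamma s) \le 1$ simplifies to $\gamma (s - 1) \le 1$, i.e. $\gamma \sigma \le 1$, which gives the cutoff at $\gamma = 1/\sigma$; for $s = 0$ the determinant is identically $1$.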

See here for a detailed derivation. I have verified this result numerically.
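A quick way to spot-check the bound is to evaluate $\det(I + A(\gamma))$ numerically for small $n$. The sketch below (my own check, not the answerer's code) takes $n = 2$, builds $A$ entrywise from its expanded form $A = a\,ex^T - p\,ee^T + \gamma a\,xe^T - \gamma q\,xx^T$, and uses the explicit $2 \times 2$ determinant formula; the test vectors $e = (1, 0)$, $x = (1, 2)$ give $\sigma = 3$:

```python
# Numerical spot-check of the bound for n = 2 (a sketch; the choice of
# test vectors e = (1, 0), x = (1, 2) is mine, giving sigma = 3 > 0,
# so the claimed constraint is gamma <= 1/sigma = 1/3).

def det_I_plus_A(e, x, g):
    """det(I + A(gamma)) for 2-vectors e, x, with A built entrywise."""
    e1, e2 = e
    x1, x2 = x
    a = e1 * x1 + e2 * x2  # e^T x
    p = x1 * x1 + x2 * x2  # x^T x
    q = e1 * e1 + e2 * e2  # e^T e
    # A = a*e*x^T - p*e*e^T + g*a*x*e^T - g*q*x*x^T, entry by entry
    A = [[a * e1 * x1 - p * e1 * e1 + g * a * x1 * e1 - g * q * x1 * x1,
          a * e1 * x2 - p * e1 * e2 + g * a * x1 * e2 - g * q * x1 * x2],
         [a * e2 * x1 - p * e2 * e1 + g * a * x2 * e1 - g * q * x2 * x1,
          a * e2 * x2 - p * e2 * e2 + g * a * x2 * e2 - g * q * x2 * x2]]
    # 2x2 determinant of I + A
    return (1 + A[0][0]) * (1 + A[1][1]) - A[0][1] * A[1][0]

e, x = (1.0, 0.0), (1.0, 2.0)
sigma = ((e[0]**2 + e[1]**2) * (x[0]**2 + x[1]**2)
         - (e[0] * x[0] + e[1] * x[1])**2 - 1)
print(sigma)                                # 3.0
print(det_I_plus_A(e, x, 1.0 / sigma))      # equality case: det ≈ 1
print(det_I_plus_A(e, x, 0.0) <= 1.0)       # gamma below the cutoff: True
print(det_I_plus_A(e, x, 1.0) <= 1.0)       # gamma above the cutoff: False
```

For these vectors the closed form gives $\det(I + A(\gamma)) = (1 - 4)(1 - 4\gamma) = 12\gamma - 3$, so the inequality holds exactly when $\gamma \le 1/3$, matching the printed checks.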