Nontrivial lower-bound for $\inf_{x \in \Delta_n} \|Gx\|$


Let $G$ be an $m \times n$ matrix of full rank $n$ (so in particular $n \le m$), and let $\Delta_n := \{x \in \mathbb R^n \mid x_1,\ldots,x_n \ge 0,\;\sum_{i=1}^n x_i = 1\}$ be the $(n-1)$-dimensional unit simplex. Define $s(G)$ by $$ s(G) := \inf_{x \in \Delta_n} \|Gx\|. $$

Question. Is there a nontrivial lower bound for $s(G)$ in terms of simpler quantities (say, in terms of a function of the singular values of $G$, etc.)?

For a trivial bound, note that $$ s(G) \ge \inf_{x \in \Delta_n} s_{\min}(G)\,\|x\| = s_{\min}(G) \cdot \inf_{x \in \Delta_n} \|x\| = s_{\min}(G)/\sqrt{n}, $$

where $s_{\min}(G) := \inf_{\|x\|=1} \|Gx\|$ is the least singular value of $G$.
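As a quick sanity check on the trivial bound, one can sample points of the simplex and verify that $\|Gx\|$ never drops below $s_{\min}(G)/\sqrt{n}$. This is a NumPy sketch; the dimensions, seed, sample size, and the use of a Dirichlet distribution to sample the simplex are all arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 6, 4
G = rng.standard_normal((m, n))  # full column rank almost surely

# trivial bound s_min(G) / sqrt(n)
s_min = np.linalg.svd(G, compute_uv=False)[-1]  # least singular value
trivial = s_min / np.sqrt(n)

# sample 1000 points of the simplex Delta_n via a Dirichlet distribution
X = rng.dirichlet(np.ones(n), size=1000)
vals = np.linalg.norm(X @ G.T, axis=1)  # ||G x|| for each sample x

print(vals.min() >= trivial)  # the trivial bound holds on every sample
```

On the simplex $\|x\| \ge 1/\sqrt{n}$, so the check must pass for every sample, not just with high probability.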


2 Answers

Best Answer

You can get a lower bound by omitting the nonnegativity constraint. The problem becomes: $$\min_{x \in \mathbb R^n} \{ x^T G^TG x : e^Tx = 1 \}.$$ Note that this is a convex optimization problem that satisfies the Slater condition, so the KKT conditions are necessary and sufficient. The Lagrangian is $L(x,y) = x^T G^TG x + y(e^Tx - 1)$, and setting its derivatives to $0$ results in the following KKT system: $$\begin{pmatrix}2G^TG & e \\ e^T & 0\end{pmatrix} \begin{pmatrix}x \\ y \end{pmatrix} = \begin{pmatrix} 0 \\ 1 \end{pmatrix}.$$ Solving for $x$ and computing $\|Gx\|$ gives a lower bound on $s(G)$.

Instead of solving via the Lagrangian, you can also eliminate the last component of $x$ by writing $x_n = 1-e^T\tilde{x}$ with $\tilde{x} := (x_1,\ldots,x_{n-1})$, which gives an unconstrained quadratic objective in $\tilde{x}$.
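A minimal NumPy sketch of the KKT route, assuming a randomly drawn full-column-rank $G$ (dimensions and seed are arbitrary): it solves the linear KKT system above and checks that $\|Gx\|$ at the solution matches the closed form $1/\sqrt{e^T(G^TG)^{-1}e}$ derived in the follow-up answer.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 6, 4
G = rng.standard_normal((m, n))  # full column rank almost surely
e = np.ones(n)

# KKT system for  min x^T G^T G x  s.t.  e^T x = 1
K = np.block([[2 * G.T @ G, e[:, None]],
              [e[None, :], np.zeros((1, 1))]])
rhs = np.concatenate([np.zeros(n), [1.0]])
sol = np.linalg.solve(K, rhs)
x = sol[:n]

bound = np.linalg.norm(G @ x)  # lower bound on s(G)

# closed form: 1 / sqrt(e^T (G^T G)^{-1} e)
closed = 1.0 / np.sqrt(e @ np.linalg.solve(G.T @ G, e))
print(np.isclose(bound, closed))
```

Solving one $(n+1)\times(n+1)$ linear system is all that is needed, since the problem is an equality-constrained convex quadratic.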


This is a follow-up on LinAlg's answer (accepted!).

Claim. The bound in the accepted answer is always at least as good as my trivial bound $s(G) \ge s_{\min}(G)/\sqrt{n}$.

Proof. As carefully observed in the accepted answer, we can drop the nonnegativity constraint and get a lower bound, namely $s(G)^2 \ge \min \{x^TG^TGx \mid e^Tx = 1\}$. Now, let $R \succ 0$ be a square root of $G^TG$; that is, $R$ is a positive-definite matrix such that $G^TG = R^2$ (it exists and is invertible because $G$ has full column rank, so $G^TG \succ 0$). Consider the change of variable $z = Rx$. We then get

$$ \begin{split} s(G) &\ge \min \{\sqrt{x^TG^TGx} \mid e^Tx = 1\} = \min\{\|z\| \mid (R^{-1}e)^Tz = 1\}\\ &= \text{dist}(0,\{z \in \mathbb R^n \mid (R^{-1}e)^Tz = 1\}) = \frac{1}{\|R^{-1}e\|} = \frac{1}{\sqrt{e^T(G^TG)^{-1}e}}, \end{split} $$

That is, $s(G) \ge \dfrac{1}{\sqrt{e^T(G^TG)^{-1}e}}$. To see that this is a better lower-bound, note that

$$ \frac{1}{\sqrt{e^T(G^TG)^{-1}e}} \ge \frac{1}{\sqrt{\lambda_{\max}((G^TG)^{-1})}\|e\|} = \frac{\sqrt{\lambda_\min(G^TG)}}{\|e\|} = \frac{s_\min(G)}{\sqrt{n}}. $$
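The inequality chain above can be checked numerically. This sketch draws many random full-column-rank matrices (the dimensions, seed, and trial count are arbitrary choices) and verifies that the improved bound $1/\sqrt{e^T(G^TG)^{-1}e}$ dominates the trivial bound $s_{\min}(G)/\sqrt{n}$ in every trial.

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 7, 3
ok = True
for _ in range(100):
    G = rng.standard_normal((m, n))  # full column rank almost surely
    e = np.ones(n)
    # improved bound: 1 / sqrt(e^T (G^T G)^{-1} e)
    improved = 1.0 / np.sqrt(e @ np.linalg.solve(G.T @ G, e))
    # trivial bound: s_min(G) / sqrt(n)
    trivial = np.linalg.svd(G, compute_uv=False)[-1] / np.sqrt(n)
    ok = ok and improved >= trivial - 1e-12  # small tolerance for round-off
print(ok)
```

Equality is attained, e.g., when $G^TG$ is a multiple of the identity, which is why "at least as good" rather than "strictly better" is the right reading of the claim.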