Minimizing the norm subject to a quadratic form


I am stuck with the following minimization problem

$$\min_{x \in \mathbb{R}^n} \underbrace{\|x\|_2^2}_{=: f (x)} \quad \text{subject to} \quad \underbrace{x'Qx - 1}_{=: h (x)} = 0$$

where $Q$ is a positive definite matrix.

I can prove that a global solution exists by the coerciveness of $f$. I then try Lagrange multipliers, but I get that $\lambda = -Q^{-1}$, which does not seem useful at all. Any suggestions or resources I could check?


Note: $x'$ is the transpose of $x$.

There are 3 answers below.

BEST ANSWER

One can assume that $Q$ is symmetric; if not, replace it with $\frac{1}{2} ( Q + Q ^T )$, which defines the same quadratic form.

The feasible set is the hyperellipsoid $x^T Q x = 1$.

Since $ \lambda_\text{min}({Q}) \, (x^T x) \le x^T Q x \le \lambda_\text{max}({Q}) \, (x^T x) $, every feasible $x$ satisfies

$ \dfrac{1}{\lambda_\text{max}({Q})} \le x^T x \le \dfrac{1}{\lambda_\text{min}({Q})} $

Thus the minimum of $x^T x$ is $\dfrac{1}{\lambda_\text{max}({Q})}$, attained at an eigenvector corresponding to $\lambda_\text{max}({Q})$, scaled so that $x^T Q x = 1$.
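As a sanity check, this can be verified numerically. A minimal sketch assuming NumPy; the matrix $Q$ here is a made-up symmetric positive definite example:

```python
import numpy as np

# Hypothetical example: build a symmetric positive definite Q.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
Q = A @ A.T + 3 * np.eye(3)

# eigh returns eigenvalues in ascending order with orthonormal eigenvectors.
eigvals, eigvecs = np.linalg.eigh(Q)
lam_max = eigvals[-1]
v = eigvecs[:, -1]                 # unit eigenvector for lambda_max

# Since v^T Q v = lam_max, the scaling x = v / sqrt(lam_max) puts x on the
# ellipsoid x^T Q x = 1.
x_star = v / np.sqrt(lam_max)

print(x_star @ Q @ x_star)         # ~ 1   (feasibility)
print(x_star @ x_star)             # ~ 1 / lam_max  (the claimed minimum)
```

Any other feasible point scaled onto the ellipsoid should have a $2$-norm at least this large.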


$$\mathcal L ({\bf x}, \mu) := \| {\bf x} \|_2^2 + \mu \left( {\bf x}^\top {\bf Q} \, {\bf x} - 1 \right)$$

Setting the partial derivatives of this Lagrangian with respect to ${\bf x}$ and $\mu$ to zero, i.e. $\nabla_{\bf x} \mathcal L = {\bf 0}_n$ and $\partial_{\mu} \mathcal L = 0$, we obtain

$$\begin{aligned} \left( {\bf I}_n + \mu {\bf Q} \right) {\bf x} &= {\bf 0}_n \\ {\bf x}^\top {\bf Q} \, {\bf x} &= 1 \end{aligned}$$

Since ${\bf x} = {\bf 0}_n$ is neither interesting nor admissible, we need $g (\mu) := \det \left( {\bf I}_n + \mu {\bf Q} \right)$ to vanish. Note that $g$ is a degree-$n$ polynomial, but not the characteristic polynomial of $\bf Q$, so its roots are not the eigenvalues of $\bf Q$. In fact, $g (\mu) = \prod_{i=1}^n \left( 1 + \mu \lambda_i \right)$, where $\lambda_1, \dots, \lambda_n > 0$ are the eigenvalues of $\bf Q$, so the roots are $\mu_i = -1/\lambda_i$. For each root $\mu_i$, the corresponding ${\bf x}$ is an eigenvector of $\bf Q$ for $\lambda_i$, scaled to satisfy the equality constraint ${\bf x}^\top {\bf Q} \, {\bf x} = 1$. This leaves only $n$ candidate minimizers, and the one with the smallest $2$-norm wins.
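This candidate enumeration can be sketched numerically. A hypothetical $3 \times 3$ example, assuming NumPy; the roots of $g$ and the scaled eigenvectors are computed directly from the eigendecomposition:

```python
import numpy as np

# Made-up symmetric positive definite Q.
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
Q = A @ A.T + np.eye(3)

eigvals, eigvecs = np.linalg.eigh(Q)   # ascending eigenvalues

# Roots of g(mu) = det(I + mu Q) = prod_i (1 + mu lambda_i) are mu_i = -1/lambda_i.
mus = -1.0 / eigvals
for mu in mus:
    assert abs(np.linalg.det(np.eye(3) + mu * Q)) < 1e-8

# Each candidate x is the eigenvector v_i scaled so that x^T Q x = 1,
# giving ||x||^2 = 1 / lambda_i.
candidates = [eigvecs[:, i] / np.sqrt(eigvals[i]) for i in range(3)]
norms_sq = [x @ x for x in candidates]
best = candidates[int(np.argmin(norms_sq))]

print(min(norms_sq))        # ~ 1 / lambda_max: the candidate with smallest norm
```

The winning candidate corresponds to the largest eigenvalue, in agreement with the accepted answer.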


This is a slight rephrasing of @GeometryLover's answer, so I'll make it Community Wiki.

I'll assume $Q$ is symmetric positive definite. Let $\lambda_\max$ be the largest eigenvalue of $Q$. For any vector $x \in \mathbb R^n$ which satisfies $x^T Q x = 1$, we have $$ \lambda_\max x^T x \geq x^T Q x = 1 \quad \text{which implies that}\quad x^T x \geq \frac{1}{\lambda_\max}. $$

Moreover, if $x$ is an eigenvector of $Q$ corresponding to the eigenvalue $\lambda_\max$, and $x$ is scaled so that $x^T Q x = 1$, then $$ x^T x = \frac{1}{\lambda_\max}. $$ Thus, the minimum value for your optimization problem is $1/\lambda_\max$.
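The lower bound can also be probed empirically: random points scaled onto the ellipsoid never beat $1/\lambda_\max$. A small sketch with a made-up $4 \times 4$ positive definite matrix, assuming NumPy:

```python
import numpy as np

# Made-up symmetric positive definite Q.
rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
Q = A @ A.T + np.eye(4)
lam_max = np.linalg.eigvalsh(Q)[-1]

# Scale each random z onto the ellipsoid x^T Q x = 1, then check the bound.
for _ in range(10_000):
    z = rng.standard_normal(4)
    x = z / np.sqrt(z @ Q @ z)          # now x^T Q x = 1
    assert x @ x >= 1 / lam_max - 1e-12  # no feasible point beats 1/lam_max
```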