Conjugate for negative generalized logarithm, can't solve gradient equations


I was working on some practice problems for Stephen Boyd's Convex Optimization on my own time, and I couldn't understand a certain step in the solution to a problem (3.36f). The question essentially asked for

$$\sup_{x,t}\{y^Tx+st+\log(t^2-x^Tx)\; | \; x\in \mathbb{R}^n,\ t\in \mathbb{R},\ \|x\|_2<t\}$$

where $y \in \mathbb{R}^n$ and $s \in \mathbb{R}$ are fixed. This is solved by finding the values of $t$ and $x$ at a critical point in terms of $y$ and $s$, but when I compute the gradient and set it equal to $0$, I can't find a way to solve for those two variables:

$$y + \frac{-2x}{t^2-x^Tx} = 0$$ $$s + \frac{2t}{t^2-x^Tx} = 0$$

The book quickly claims the solution is:

$$x = \frac{2y}{s^2-y^Ty}$$ $$t = \frac{-2s}{s^2-y^Ty}$$

But even if $x$ were a scalar I don't see a clear path to this solution; maybe if I multiplied the equations by $(t^2 - x^2)$ and used the quadratic formula everything would cancel out, but this approach doesn't generalize to the vector equations, since there is no notion of square roots or division for vectors. Is there an obvious alternative approach that I'm missing?
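As a sanity check (a sketch using NumPy, with arbitrary test values chosen so that $s^2 > y^Ty$ and $s < 0$, which make the claimed point feasible), one can plug the book's solution back into both gradient equations:

```python
import numpy as np

# Arbitrary test data; the claimed solution needs s^2 > y^T y,
# and s < 0 so that t > 0 (feasibility requires ||x||_2 < t).
y = np.array([0.3, -0.4, 0.2])
s = -2.0

denom = s**2 - y @ y
x = 2 * y / denom        # book's claimed x
t = -2 * s / denom       # book's claimed t

D = t**2 - x @ x         # t^2 - x^T x, positive at a feasible point
grad_x = y - 2 * x / D   # gradient of the objective w.r.t. x
grad_t = s + 2 * t / D   # partial derivative w.r.t. t

print(np.allclose(grad_x, 0), np.isclose(grad_t, 0))  # True True
```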

If you combine the equations to eliminate the $\frac{1}{t^2 - \|x\|^2}$ term (multiply the first by $t$, the second by $x$, and add), you get $ty + sx = 0$; that is, $x$ and $y$ are collinear.

Squaring this relation gives $t^2 \|y\|^2 = s^2 \|x\|^2$. Substituting $x^Tx = t^2\,y^Ty/s^2$ into the second gradient equation yields $$s + \frac{2t\,s^2}{t^2(s^2 - y^Ty)} = 0,$$ which solves to $t = \dfrac{-2s}{s^2 - y^Ty}$, the second equation of the claimed solution.

Finally, substituting this value of $t$ into $x = -\dfrac{ty}{s}$ (which follows from $ty + sx = 0$) gives $x = \dfrac{2y}{s^2 - y^Ty}$, the first equation of the claimed solution.
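The elimination above can be turned into a short computation (a sketch assuming NumPy, with $s^2 > y^Ty$ and $s < 0$ for feasibility; the function name is made up for illustration):

```python
import numpy as np

def conjugate_critical_point(y, s):
    # Step 1 (collinearity): t*y + s*x = 0  =>  x = -t*y/s.
    # Step 2: substitute x^T x = t^2 (y^T y)/s^2 into s + 2t/(t^2 - x^T x) = 0
    #         and solve for t:  t = -2s / (s^2 - y^T y).
    denom = s**2 - y @ y      # must be positive for a solution to exist
    t = -2 * s / denom
    x = -t * y / s            # recover x from the collinearity relation
    return x, t

y = np.array([0.5, -0.1])
s = -3.0
x, t = conjugate_critical_point(y, s)
# x agrees with the closed form 2*y/(s^2 - y^T y)
print(np.allclose(x, 2 * y / (s**2 - y @ y)))  # True
```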