Are there practical problems in optimization where its dual problem is computationally easier to solve?


In a class I learned that optimization problems can have "dual counterparts."

For example, a problem of the type

$$\min_{x} f(x) + g(x)$$

has a "Fenchel" dual problem:

$$\max_{y} -f^\ast(y) - g^\ast(-y)$$
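
To make my question concrete, here is a small instance I worked out myself (so apologies if I slipped somewhere): take $f(x) = \tfrac{1}{2}x^2$ and $g(x) = |x|$. Then

$$f^\ast(y) = \sup_x \left( xy - \tfrac{1}{2}x^2 \right) = \tfrac{1}{2}y^2, \qquad g^\ast(y) = \sup_x \left( xy - |x| \right) = \begin{cases} 0 & |y| \le 1 \\ +\infty & \text{otherwise}, \end{cases}$$

so the dual is $\max_{|y| \le 1} -\tfrac{1}{2}y^2 = 0$, attained at $y = 0$, which matches the primal minimum of $0$ at $x = 0$.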

As another example:

$$\min_{x} f_0(x) \text{ s.t. } f_i(x) \leq 0$$

has a "Lagrange" dual problem:

$$\max_{\lambda \geq 0} g(\lambda), \quad \text{where } g(\lambda) = \inf_x L(x, \lambda)$$ and $L(x, \lambda) = f_0(x) + \sum_i \lambda_i f_i(x)$ is the Lagrangian.
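
As a sanity check I tried a tiny instance of this myself (so it may not be representative): minimize $x^2$ subject to $1 - x \le 0$. Then

$$L(x, \lambda) = x^2 + \lambda(1 - x), \qquad g(\lambda) = \inf_x L(x, \lambda) = \lambda - \frac{\lambda^2}{4},$$

with the infimum attained at $x = \lambda/2$, and $\max_{\lambda \ge 0} g(\lambda) = 1$ at $\lambda = 2$, which matches the primal optimum $x^\ast = 1$ with value $1$.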

There is of course some relationship between the primal variable $x$ and the dual variable $y$ (or $\lambda$), which makes it possible to recover the primal optimizer from the dual one.

My question is: are there practical optimization problems where duality is useful because the dual problem is easier to solve than the original?

I have seen some very contrived toy examples (for example, $\min_{x \in [3, 5]} x^3$) where the dual problem actually seems much harder to solve than the original problem.