Chance-Constrained Optimization


Let $x \in \mathbb{R}^n$, and let $\xi \in \Xi \subset \mathbb{R}^n$ be a random vector with a discrete probability distribution $p(\cdot)$ taking finitely many values $\{\xi_i\}_{i = 1}^S$. Let $A \in \mathbb{R}^{n\times n}$ be a positive semidefinite matrix.

Consider the following chance-constrained optimization problem:

$$\inf_{x}\; x^\top A x \quad \text{s.t.} \quad \mathbb{P}[\xi^\top x > 0] > 1-\delta$$

with $\delta\in[0,1]$.

What is the correct way to solve this problem? Does a closed-form solution exist?

What I tried:

Note that the constraint set can be rewritten as

$$ \sum_{i=1}^S\mathbb{1}\{\xi_i^\top x>0\}p(\xi_i) > 1-\delta. $$

The constraint set can be decomposed into $2^S$ disjoint cases, one per on/off pattern of the indicators $\mathbb{1}\{\xi_i^\top x > 0\}$. Based on the probabilities $p(\xi_i)$, only some of these patterns satisfy the inequality above, so I solved the optimization problem restricted to each admissible pattern and took the minimum over the resulting solutions.
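This brute-force approach can be sketched as follows. All data below (the matrix, scenarios, probabilities, and the tolerance `eps` standing in for the strict inequality $\xi_i^\top x > 0$) are hypothetical toy values, and each restricted problem is solved as a small QP with SciPy rather than in closed form:

```python
import itertools
import numpy as np
from scipy.optimize import minimize

# Hypothetical toy instance: A = I, three scenarios in R^2.
A = np.eye(2)
xis = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0]])  # scenarios xi_i
p = np.array([0.4, 0.4, 0.2])                           # probabilities p(xi_i)
delta = 0.3
eps = 0.1  # small tolerance replacing the strict inequality xi_i^T x > 0

best_val, best_x = np.inf, None
S = len(p)
for mask in itertools.product([0, 1], repeat=S):
    active = np.array(mask, dtype=bool)
    # Skip patterns whose total probability cannot satisfy the chance constraint.
    if p[active].sum() <= 1 - delta:
        continue
    # Restricted QP: min x^T A x  s.t.  xi_i^T x >= eps for active scenarios.
    cons = [{"type": "ineq", "fun": lambda x, xi=xi: xi @ x - eps}
            for xi in xis[active]]
    res = minimize(lambda x: x @ A @ x, x0=np.ones(2), constraints=cons)
    feasible = res.success and all(xi @ res.x >= eps - 1e-6 for xi in xis[active])
    if feasible and res.fun < best_val:
        best_val, best_x = res.fun, res.x
```

Note that this enumerates up to $2^S$ subproblems, which is exactly why it does not scale and why a mixed-integer reformulation is preferable.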

Is this correct? I suspect this is not the best way to proceed and that the problem can be reformulated as a mixed-integer programming problem, but I don't know how to do that exactly.

Any help would be greatly appreciated. Thanks!


Best answer

You can introduce a binary variable $y_i \in \{0,1\}$ for each scenario $i = 1, \dots, S$, indicating whether $\xi_i^\top x > 0$. Let $\varepsilon > 0$ be the smallest positive value for which $\mathbb{P}(\xi_i^\top x > 0) = \mathbb{P}(\xi_i^\top x \geq \varepsilon)$, and let $M > 0$ be a sufficiently large constant. Then the big-M constraint $$\xi_i^\top x \geq \varepsilon - (1-y_i)M$$ enforces that $y_i$ can take the value $1$ only if $\xi_i^\top x \geq \varepsilon$. Combining these constraints with $$\sum_{i=1}^S p(\xi_i)\, y_i \geq 1-\delta$$ (or with a small positive number added to the right-hand side to obtain the strict inequality) models your chance constraint; minimizing $x^\top A x$ subject to these constraints is then a mixed-integer quadratic program.
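A minimal sketch of this big-M construction, using the same hypothetical toy data as above (scenarios, probabilities, `eps`, and `M` are all illustrative choices, and no MIQP solver is invoked; the code only builds and checks the constraints for candidate $(x, y)$ pairs):

```python
import numpy as np

# Hypothetical small instance in the answer's notation.
xis = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0]])  # scenarios xi_i
p = np.array([0.4, 0.4, 0.2])                           # probabilities p(xi_i)
delta = 0.3
eps, M = 0.1, 100.0  # eps: strict-inequality tolerance; M: big-M constant

def bigM_feasible(x, y):
    """Check xi_i^T x >= eps - (1 - y_i) M for all i, and p^T y >= 1 - delta."""
    linking = np.all(xis @ x >= eps - (1 - y) * M)   # y_i = 1 forces xi_i^T x >= eps
    coverage = p @ y >= 1 - delta                    # chance-constraint coverage
    return bool(linking and coverage)

# Setting y_i = 0 deactivates scenario i's constraint via the big-M term,
# so (x, y) below satisfies the model even though xi_3^T x < 0.
x = np.array([0.1, 0.1])
y = np.array([1, 1, 0])
```

In practice one would hand these constraints, together with the objective $x^\top A x$, to a mixed-integer solver; $M$ should be chosen as small as validly possible, since loose big-M values weaken the solver's relaxations.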