Show $\sum\limits_{j=1}^{R}x_{j} = \operatorname{max} \{ x^{T}z: 0 \leq z \leq 1, \mathbb{1}^{T}z = R\}$


Let $x=(x_{1},\dots,x_{n})\in \mathbb R^{n}$ with $x_{1} \geq x_{2} \geq \dots \geq x_{n}$, and let $R \in \{1,\dots,n\}$.

Show that $\sum\limits_{j=1}^{R}x_{j} = \operatorname{max} \{ x^{T}z: 0 \leq z \leq 1, \mathbb{1}^{T}z = R\}$
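As a quick numerical sanity check (not part of the proof; the data vector below is a hypothetical example), one can solve the LP directly with `scipy.optimize.linprog` and compare against the sum of the $R$ largest entries:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical example data: a nonincreasing vector x and R = 3.
x = np.array([5.0, 3.0, 2.0, 2.0, -1.0])
n, R = len(x), 3

# Maximize x^T z  <=>  minimize -x^T z, with 0 <= z <= 1 and 1^T z = R.
res = linprog(
    c=-x,
    A_eq=np.ones((1, n)),
    b_eq=[R],
    bounds=[(0.0, 1.0)] * n,
)

lp_value = -res.fun          # LP optimum (sign flipped back to a max)
top_R_sum = x[:R].sum()      # sum of the R largest entries (x is sorted)
assert np.isclose(lp_value, top_R_sum)
```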

Initially I thought of induction on $R$, but I do not see how to make that work here.

Other idea:

Consider the LP

$\operatorname{max} x^{T}z$

s.t. $z \leq \mathbb{1}$, $-z \leq 0$, $\mathbb{1}^{T}z \leq R$ and $-\mathbb{1}^{T}z \leq -R$.

We consider the dual

$ \operatorname{min}\; (\mathbb{1}^{T}, 0^{T}, R, -R)\begin{pmatrix} y_{1} \\ y_{2} \\ \lambda_{1} \\ \lambda_{2} \end{pmatrix}$

s.t. $(E_{n}, -E_{n}, \mathbb{1}, -\mathbb{1}) \begin{pmatrix} y_{1} \\ y_{2} \\ \lambda_{1} \\ \lambda_{2} \end{pmatrix}=\begin{pmatrix} x_{1} \\ x_{2} \\ \vdots \\ x_{n} \end{pmatrix}$ and $\begin{pmatrix} y_{1} \\ y_{2} \\ \lambda_{1} \\ \lambda_{2} \end{pmatrix}\geq 0.$

We can thus identify $y=y_{1}-y_{2}$ with an arbitrary vector and $\lambda=\lambda_{1}-\lambda_{2}$ with an arbitrary scalar, so the equality constraint becomes

$y +\lambda \mathbb{1} = x.$

Any ideas on how to proceed?
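One way to continue the reduction above (a sketch): the objective only charges $\mathbb{1}^{T}y_{1} + R(\lambda_{1}-\lambda_{2})$, so for fixed $\lambda$ an optimal choice is $y_{1,j} = \max(x_{j}-\lambda,\,0)$ componentwise, and the dual collapses to the one-dimensional problem

$$\min_{\lambda \in \mathbb{R}} \;\sum_{j=1}^{n} \max(x_{j}-\lambda,\,0) \;+\; R\lambda.$$

Plugging in $\lambda = x_{R}$ gives $\sum_{j=1}^{R}(x_{j}-x_{R}) + Rx_{R} = \sum_{j=1}^{R} x_{j}$, matching the primal value above.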

Best answer:

The primal LP problem is to maximize $\sum_j x_j z_j$ subject to \begin{align} z_j &\le 1 && (\text{dual $\alpha_j \ge 0$})\\ \sum_j z_j &= R && (\text{dual $\beta$ free})\\ z_j &\ge 0 \end{align}

Taking $$z_j=\begin{cases}1&\text{for $j\in\{1,\dots,R\}$}\\0&\text{for $j\in\{R+1,\dots,n\}$}\end{cases}$$ is primal feasible with objective value $\sum_{j=1}^R x_j$.

The dual LP problem is to minimize $\sum_j \alpha_j+R\beta$ subject to \begin{align} \alpha_j +\beta &\ge x_j && (\text{dual $z_j \ge 0$})\\ \alpha_j &\ge 0 \end{align}

Taking $\beta=x_R$ and $$\alpha_j=\begin{cases}x_j-x_R&\text{for $j\in\{1,\dots,R\}$}\\0&\text{for $j\in\{R+1,\dots,n\}$}\end{cases}$$ is dual feasible (this uses the fact that $x$ is nonincreasing) with objective value $$\sum_{j=1}^n \alpha_j+R\beta =\sum_{j=1}^R \alpha_j+\sum_{j=R+1}^n \alpha_j+R\beta =\sum_{j=1}^R (x_j-x_R)+\sum_{j=R+1}^n 0+R x_R =\sum_{j=1}^R x_j,$$ which matches the primal objective value, so by weak duality this primal-dual pair is optimal.
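The primal-dual certificate above is easy to verify numerically. The following sketch (using a hypothetical nonincreasing vector $x$ and $R=3$) builds the proposed $z$, $\alpha$, $\beta$, checks dual feasibility, and confirms the two objective values agree:

```python
import numpy as np

# Hypothetical example data: a nonincreasing vector x and R = 3.
x = np.array([5.0, 3.0, 2.0, 2.0, -1.0])
n, R = len(x), 3

# Primal candidate: z = 1 on the first R coordinates, 0 elsewhere.
z = np.zeros(n)
z[:R] = 1.0
primal_value = x @ z

# Dual candidate: beta = x_R, alpha_j = max(x_j - x_R, 0).
# Since x is nonincreasing, this equals x_j - x_R for j <= R and 0 after.
beta = x[R - 1]
alpha = np.maximum(x - beta, 0.0)

# Dual feasibility: alpha >= 0 and alpha_j + beta >= x_j for all j.
assert np.all(alpha >= 0)
assert np.all(alpha + beta >= x - 1e-12)

# Matching objectives certify optimality (both equal the sum of the R largest x_j).
dual_value = alpha.sum() + R * beta
assert np.isclose(primal_value, dual_value)
```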