Consider the following max-min integer programming formulation, expressed in the binary decision variables $\mathbf{z}$:
$$\begin{align*} \max\ & m \\ \text{s.t.}\ & \\ m \leq\ & s_i + \sum_j^J \sum_k^K p^{j,k} z_i^{j,k} \quad \forall i=1, \ldots, I \\ \sum_i^I \sum_j^J \sum_k^K z_i^{j,k} \leq\ & B \\ \sum_i^I z_i^{j,k} \leq\ & 1 \quad \forall j=1,\ldots,J, \ \forall k=1,\ldots,K \\ \sum_k^K z_i^{j,k} \leq\ & 1 \quad \forall i=1,\ldots,I, \ \forall j=1,\ldots,J \\ m \in\ & \mathbb{R} \\ \mathbf{z}\ & \text{is binary} \end{align*}$$ where $s_i$, $p^{j,k}$, and $B$ are constants.
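To make the formulation concrete, the brute-force sketch below (the function name and example instance are illustrative, not part of the model) enumerates every binary $\mathbf{z}$ for a tiny instance, filters by the three constraints, and returns the optimal bottleneck value $m$:

```python
from itertools import product

def solve_maxmin_bruteforce(s, p, B):
    """Solve the tiny max-min instance exactly by enumerating all binary z.

    s: list of length I; p: J x K matrix of profits; B: budget.
    z is indexed by tuples (i, j, k).
    """
    I, J, K = len(s), len(p), len(p[0])
    cells = [(i, j, k) for i in range(I) for j in range(J) for k in range(K)]
    best = float("-inf")
    for bits in product([0, 1], repeat=len(cells)):
        z = dict(zip(cells, bits))
        if sum(bits) > B:
            continue  # budget constraint
        if any(sum(z[i, j, k] for i in range(I)) > 1
               for j in range(J) for k in range(K)):
            continue  # each (j, k) assigned to at most one i
        if any(sum(z[i, j, k] for k in range(K)) > 1
               for i in range(I) for j in range(J)):
            continue  # for each (i, j), at most one k selected
        m = min(s[i] + sum(p[j][k] * z[i, j, k]
                           for j in range(J) for k in range(K))
                for i in range(I))
        best = max(best, m)
    return best
```

For instance, with $I=2$, $J=1$, $K=2$, $s=(0,0)$, $p^{1,\cdot}=(3,5)$ and $B=2$, the optimum assigns the two $i$'s to different $(j,k)$ pairs and yields $m = 3$.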
Let us now apply Lagrangian relaxation to the first set of constraints, i.e., the bottleneck constraints that upper-bound the dummy variable $m$, using multipliers $\alpha_i \geq 0$. The relaxed formulation is as follows:
$$\begin{align*} \max\ & (1 - \sum_i^I \alpha_i) m + \sum_i^I \alpha_i s_i + \sum_i^I \sum_j^J \sum_k^K \alpha_i p^{j,k} z_i^{j,k} \\ \text{s.t.}\ & \\ \sum_i^I \sum_j^J \sum_k^K z_i^{j,k} \leq\ & B \\ \sum_i^I z_i^{j,k} \leq\ & 1 \quad \forall j=1,\ldots,J, \ \forall k=1,\ldots,K \\ \sum_k^K z_i^{j,k} \leq\ & 1 \quad \forall i=1,\ldots,I, \ \forall j=1,\ldots,J \\ m \in\ & \mathbb{R} \\ \mathbf{z}\ & \text{is binary} \end{align*}$$
Let $L(\mathbf{\alpha})$ denote the Lagrangian dual function, i.e., the optimal value of the relaxed problem for a given $\mathbf{\alpha}$. The objective is to find the optimal multipliers $\mathbf{\alpha}^*$ that minimize $L(\cdot)$:
$$\begin{align*} &\min\{L(\mathbf{\alpha}): \mathbf{\alpha} \geq 0\} = \\ &\min\{\max\{(1 - \sum_i^I \alpha_i) m + \sum_i^I \alpha_i s_i + \sum_i^I \sum_j^J \sum_k^K \alpha_i p^{j,k} z_i^{j,k}: \mathbf{z} \in Z, \mathbf{z}\text{ is binary}, m \in \mathbb{R}\}: \mathbf{\alpha} \geq 0\} \end{align*}$$ where $Z$ denotes the feasible region defined by the remaining constraints.
To this end, a subgradient algorithm can be used. Unfortunately, during the execution of the algorithm the multiplier $\mathbf{\alpha}$ may take a value such that $(1 - \sum_i^I \alpha_i) > 0$, in which case the relaxed problem is unbounded: its optimum is attained as $m \to \infty$. When this happens, the subgradient algorithm cannot proceed. Hence, the Lagrangian dual problem has to be revised as follows:
$$\begin{align*} \min\{L(\mathbf{\alpha}): (1 - \sum_i^I \alpha_i) \leq 0, \mathbf{\alpha} \geq 0\} \end{align*}$$
How can I solve the resulting constrained Lagrangian dual problem? Thanks.
First, let us tighten the constrained Lagrangian dual problem. Since $m$ is a free variable, the inner maximum is finite only when the coefficient of $m$ is exactly zero (if $(1 - \sum_i^I \alpha_i) < 0$, the objective is again unbounded as $m \to -\infty$). We therefore impose the equality $\sum_i^I \alpha_i = 1$, which nullifies the dummy variable $m$:
$$\begin{align*} \min\{L(\mathbf{\alpha}): \sum_i^I \alpha_i = 1, \mathbf{\alpha} \geq 0\} \end{align*}$$
Hence, the Lagrangian dual problem reduces to: $$\begin{align*} &\min\{\max\{\sum_i^I \alpha_i s_i + \sum_i^I \sum_j^J \sum_k^K \alpha_i p^{j,k} z_i^{j,k}: \mathbf{z} \in Z, \mathbf{z}\text{ is binary}\}: \sum_i^I \alpha_i = 1, \mathbf{\alpha} \geq 0\} \end{align*}$$
in which the inner maximization has a linear objective over $Z$ (the term $\sum_i^I \alpha_i s_i$ is a constant once the multipliers are fixed), and can thus be solved for any specific Lagrangian multiplier value $\bar{\mathbf{\alpha}}$, e.g., by enumeration for small instances or with an integer programming solver.
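For very small instances, $L(\bar{\mathbf{\alpha}})$ can be evaluated directly by enumeration; the sketch below (illustrative names, brute force only) maximizes $\sum_i \bar{\alpha}_i (s_i + \sum_{j,k} p^{j,k} z_i^{j,k})$ over $Z$:

```python
from itertools import product

def eval_dual(alpha, s, p, B):
    """Evaluate L(alpha) = max_z sum_i alpha_i * (s_i + sum_{j,k} p^{jk} z_i^{jk})
    by enumerating all feasible z (tiny instances only)."""
    I, J, K = len(s), len(p), len(p[0])
    cells = [(i, j, k) for i in range(I) for j in range(J) for k in range(K)]
    const = sum(alpha[i] * s[i] for i in range(I))  # term independent of z
    best = float("-inf")
    for bits in product([0, 1], repeat=len(cells)):
        z = dict(zip(cells, bits))
        if sum(bits) > B:
            continue  # budget constraint
        if any(sum(z[i, j, k] for i in range(I)) > 1
               for j in range(J) for k in range(K)):
            continue  # each (j, k) used by at most one i
        if any(sum(z[i, j, k] for k in range(K)) > 1
               for i in range(I) for j in range(J)):
            continue  # each (i, j) selects at most one k
        best = max(best, sum(alpha[i] * p[j][k] * z[i, j, k]
                             for (i, j, k) in cells))
    return const + best
```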
Second, let us apply the subgradient algorithm to find the optimal Lagrangian multipliers $\mathbf{\alpha}^*$. To this end, let $\Lambda$ denote the feasible region of the Lagrangian dual problem, i.e., $\Lambda = \{\mathbf{\alpha}: \sum_i^I \alpha_i = 1, \mathbf{\alpha} \geq 0\}$. It may happen that, during the $k$-th iteration, the Lagrangian multiplier $\mathbf{\alpha}^k$ takes a value outside the feasible region $\Lambda$. In such a case, an additional step is required to project $\mathbf{\alpha}^k$ onto $\Lambda$, i.e., to find the unique closest point $\hat{\mathbf{\alpha}}^k \in \Lambda$ to $\mathbf{\alpha}^k$. Let $P_{\Lambda}(\mathbf{\alpha})$ denote the projection operator, defined as follows:
$$\begin{align*} P_{\Lambda}(\mathbf{\alpha}) = \arg\min\{\| \mathbf{\alpha} - \hat{\mathbf{\alpha}} \|: \hat{\mathbf{\alpha}} \in \Lambda\} \end{align*}$$
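Here $\Lambda$ is the probability simplex, for which the Euclidean projection can be computed in $O(I \log I)$ time with the well-known sort-and-threshold method; a sketch (function name illustrative):

```python
def project_onto_simplex(alpha):
    """Euclidean projection of alpha onto {a : sum(a) = 1, a >= 0}.

    Sort-and-threshold method: find the largest rho such that
    u_rho - (sum of the rho largest entries - 1) / rho > 0,
    then shift every entry by that threshold and clip at zero.
    """
    u = sorted(alpha, reverse=True)   # entries in nonincreasing order
    cumsum, theta = 0.0, 0.0
    for i, ui in enumerate(u):
        cumsum += ui
        t = (cumsum - 1.0) / (i + 1)
        if ui - t > 0:
            theta = t                 # threshold for the current rho = i + 1
    return [max(a - theta, 0.0) for a in alpha]
```

Points already in $\Lambda$ are mapped to themselves, and the output always sums to one with nonnegative entries.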
It is worth noting that the cost of the projection operation affects the computational viability of the whole subgradient algorithm. Fortunately, when the feasible region $\Lambda$ is defined by a single linear equality $\mathbf{c}^T \mathbf{\alpha} = b$ together with nonnegativity bounds, the projected point $\hat{\mathbf{\alpha}}$ is relatively easy to obtain (see Bazaraa and Sherali).
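Putting the pieces together, here is a self-contained sketch of the projected subgradient method for this dual. The inner maximization is solved by enumeration (so it only runs on tiny instances), the diminishing step size $\text{step}_0/(t+1)$ is one common choice, and all names are illustrative:

```python
from itertools import product

def projected_subgradient(s, p, B, iters=50, step0=1.0):
    """Minimize L(alpha) over the simplex by projected subgradient steps.

    s: list of length I; p: J x K profit matrix; B: budget.
    Returns the best (smallest) dual value L(alpha) observed.
    """
    I, J, K = len(s), len(p), len(p[0])
    cells = [(i, j, k) for i in range(I) for j in range(J) for k in range(K)]

    def inner_max(alpha):
        # Solve max_z sum_i alpha_i (s_i + sum_{j,k} p^{jk} z_i^{jk}) over Z
        # by enumeration; returns (L(alpha), a maximizing z).
        best_val, best_z = float("-inf"), None
        for bits in product([0, 1], repeat=len(cells)):
            z = dict(zip(cells, bits))
            if sum(bits) > B:
                continue  # budget constraint
            if any(sum(z[i, j, k] for i in range(I)) > 1
                   for j in range(J) for k in range(K)):
                continue  # each (j, k) used by at most one i
            if any(sum(z[i, j, k] for k in range(K)) > 1
                   for i in range(I) for j in range(J)):
                continue  # each (i, j) selects at most one k
            val = sum(alpha[i] * (s[i] + sum(p[j][k] * z[i, j, k]
                                             for j in range(J)
                                             for k in range(K)))
                      for i in range(I))
            if val > best_val:
                best_val, best_z = val, z
        return best_val, best_z

    def project(a):
        # Sort-and-threshold Euclidean projection onto the simplex.
        u = sorted(a, reverse=True)
        cumsum, theta = 0.0, 0.0
        for i, ui in enumerate(u):
            cumsum += ui
            t = (cumsum - 1.0) / (i + 1)
            if ui - t > 0:
                theta = t
        return [max(x - theta, 0.0) for x in a]

    alpha = [1.0 / I] * I  # start at the simplex center
    best = float("inf")
    for t in range(iters):
        val, z = inner_max(alpha)
        best = min(best, val)
        # Subgradient of L at alpha: g_i = s_i + sum_{j,k} p^{jk} z*_i^{jk}.
        g = [s[i] + sum(p[j][k] * z[i, j, k]
                        for j in range(J) for k in range(K))
             for i in range(I)]
        step = step0 / (t + 1)
        alpha = project([alpha[i] - step * g[i] for i in range(I)])
    return best
```

On the toy instance $s=(0,0)$, $p^{1,\cdot}=(3,5)$, $B=2$, the dual optimum is $4$ (attained at $\mathbf{\alpha}=(1/2,1/2)$) while the primal optimum is $m=3$; the gap of $1$ is the integrality gap one should expect from the Lagrangian relaxation of an integer program.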