I am trying to solve Exercise 2.3 of the book "Introduction to Linear Optimization" by Bertsimas and Tsitsiklis, which states:
$\textbf{Exercise 2.3 (Basic feasible solutions in standard form polyhedra with upper bounds)}$ Consider a polyhedron defined by the constraints $Ax = b$ and $0 \leq x \leq u$, where $u\geq 0$, and assume that the matrix $A$ has linearly independent rows.
Prove an analog of Theorem 2.4.
Where Theorem 2.4 is:
$\textbf{Theorem 2.4}$ Consider the constraints $Ax=b$ and $x \geq 0$ and assume that the $m \times n$ matrix $A$ has linearly independent rows. A vector $x \in \mathbb{R}^n$ is a basic solution if and only if we have $Ax = b$, and there exist indices $B(1),\dots,B(m)$ such that:
$\bullet\ \text{The columns $A_{B(1)},...,A_{B(m)}$ are linearly independent} \\ \bullet\ \text{If $i \neq B(1),...,B(m)$ then $x_i=0$}$
My reformulation of theorem 2.4 would be the following:
$\textbf{Theorem}$ Consider the constraints $Ax = b$ and $0\leq x\leq d$ and assume that the $p \times n$ matrix $A$ has linearly independent rows. A vector $x \in \mathbb{R}^n$ is a basic solution if and only if we have $Ax = b$, and there exist indices $B(1),\dots,B(p)$ such that:
$\bullet\ \text{The columns $A_{B(1)},\dots,A_{B(p)}$ are linearly independent;}\\ \bullet\ \text{If $i\neq B(1),\dots,B(p)$, then $x_i = 0$ or $x_i = d_i$ (its upper bound).}$
$\textit{Proof}$: Suppose that $x$ satisfies both conditions. Then: \begin{align} \begin{split} Ax = \sum_{i=1}^{p} A_{B(i)} x_{B(i)}+\sum_{i\neq B(1),\dots,B(p)} A_i x_i = b\\ \sum_{i=1}^{p} A_{B(i)} x_{B(i)} =b-\sum_{i\neq B(1),\dots,B(p)} A_i x_i \end{split} \end{align} Each $x_i$ with $i \neq B(1),\dots,B(p)$ is fixed at $0$ or $d_i$ by an active constraint, so the right-hand side is a known vector. Since the columns $A_{B(i)}$, $i=1,\dots,p$, are linearly independent, the values $x_{B(i)}$, $i=1,\dots,p$, are uniquely determined; that is, the system of equations formed by the active constraints has a unique solution. By Theorem 2.2, there are $n$ linearly independent active constraints, and this implies that $x$ is a basic solution.
For the converse, let us assume $x$ is a basic solution. Let $B(1),\dots,B(k)$ be all the indices such that $x_{B(i)} \neq 0$ and $x_{B(i)} \neq d_{B(i)}$ for all $i=1,\dots,k$.
Since $x$ is a basic solution, the system of equations formed by the active constraints, $\sum_{i=1}^{n} A_i x_i = b$ together with $x_i=0$ or $x_i=d_i$ for $i \neq B(1),\dots,B(k)$, has a unique solution (Theorem 2.2). This implies that the columns $A_{B(1)},\dots,A_{B(k)}$ are linearly independent, and hence $k\leq p$.
Since $\operatorname{rank}(A) = p$, we can choose $p-k$ additional columns so that the columns $A_{B(1)},\dots,A_{B(p)}$ are linearly independent. Moreover, if $i \neq B(1),\dots,B(p)$, then in particular $i \neq B(1),\dots,B(k)$, so $x_i=0$ or $x_i=d_i$. $\square$
Where
$\textbf{Theorem 2.2}$ Let $x^*$ be an element of $\mathbb{R}^n$ and let $I=\{i : a_i'x^*=b_i\}$ be the set of indices of constraints that are active at $x^*$. Then, the following are equivalent:
$\bullet\ \text{There exist $n$ vectors in the set $\{a_i : i \in I\}$ that are linearly independent;}\\ \bullet\ \text{The system of equations $a'_ix=b_i$, $i \in I$, has a unique solution.}$
Is my reformulation (and proof) of theorem 2.4 correct?
Yes, I think yours is right.
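As a quick numerical sanity check (not a proof) of this characterization: fix any set of linearly independent basis columns, set each nonbasic variable to $0$ or to its upper bound $d_i$, and the basic variables are then uniquely determined by $Ax = b$. The instance below ($A$, $b$, $d$) is an invented toy example, not one from the book.

```python
# Sanity check of the characterization: for A x = b, 0 <= x <= d,
# every choice of linearly independent basis columns plus an assignment
# of the nonbasic variables to 0 or d_i pins down the basic variables.
import itertools
import numpy as np

A = np.array([[1.0, 1.0, 1.0, 0.0],
              [0.0, 1.0, 2.0, 1.0]])   # 2 x 4 matrix with rank 2
b = np.array([2.0, 3.0])
d = np.array([1.0, 1.0, 1.0, 2.0])     # upper bounds

m, n = A.shape
basic_solutions = []
for basis in itertools.combinations(range(n), m):
    B = A[:, list(basis)]
    if abs(np.linalg.det(B)) < 1e-9:   # need linearly independent columns
        continue
    nonbasic = [i for i in range(n) if i not in basis]
    # each nonbasic variable is fixed at 0 or at its upper bound
    for vals in itertools.product(*[(0.0, d[i]) for i in nonbasic]):
        x = np.zeros(n)
        x[nonbasic] = vals
        # the basic variables are then uniquely determined
        x[list(basis)] = np.linalg.solve(B, b - A[:, nonbasic] @ x[nonbasic])
        assert np.allclose(A @ x, b)
        basic_solutions.append(x)

print(len(basic_solutions), "candidate basic solutions")
```

Each candidate satisfies $Ax = b$ exactly; it is a basic *feasible* solution only if the basic variables also land inside $[0, d_i]$.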
My answer:
Procedure:
In order to find a basic solution, there must be $n$ linearly independent active constraints. The $m$ equality constraints in $Ax=b$ are always active, but that alone is not enough: we still need another $n-m$ linearly independent active constraints.
However, differing from the standard form case, we now have more choices when fixing the values of the nonbasic variables: each one can be set either to $0$ or to its upper bound $u_i$.
We will now prove a new theorem analogous to Theorem 2.4.
Theorem: Consider the constraints $Ax = b$ and $0 \leqslant x \leqslant u$ and assume that the $m \times n$ matrix $A$ has linearly independent rows. A vector $x \in \mathbb{R}^n$ is a basic solution if and only if we have $Ax = b$, and there exist indices $B(1),\dots,B(m)$ such that:
(1) The columns $A_{B(1)},\dots,A_{B(m)}$ are linearly independent;
(2) If $i \neq B(1),\dots,B(m)$, then $x_i = 0$ or $x_i = u_i$.
Proof:
Consider some $x \in \mathbb{R}^n$ and suppose that there are indices $B(1),\dots,B(m)$ that satisfy (1) and (2) in the statement of the theorem. The active constraints $x_i = 0$ or $x_i = u_i$ for all $i \notin I=\{B(1),\dots,B(m)\}$, together with $Ax = b$, imply that $$ \sum_{i=1}^{m} \mathbf{A}_{B(i)} x_{B(i)}=\mathbf{A}\mathbf{x}-\sum_{i \notin I} \mathbf{A}_{i} x_{i}=\mathbf{b}-\sum_{i \notin I} \mathbf{A}_{i} x_{i}, $$ where the right-hand side is a fixed vector, since each $x_i$ with $i \notin I$ equals $0$ or $u_i$. Since the columns $\mathbf{A}_{B(1)},\dots,\mathbf{A}_{B(m)}$ are linearly independent, $x_{B(1)},\dots,x_{B(m)}$ are uniquely determined. Thus, the system of equations formed by the active constraints has a unique solution. By Theorem 2.2, there are $n$ linearly independent active constraints, and this implies that $x$ is a basic solution.
For the converse, we assume that $x$ is a basic solution and we will show that conditions (1) and (2) in the statement of the theorem are satisfied. Take $B(1),\dots,B(m)$ to include every index $i$ with $0 < x_i < u_i$; condition (2) then holds by construction, since every remaining $x_i$ equals $0$ or $u_i$.
Since $x$ is a basic solution, the system of equations formed by the active constraints has a unique solution. Equivalently, the equation $\sum_{i=1}^{m}\mathbf{A}_{B(i)} x_{B(i)}=\mathbf{b}-\sum_{i \notin I} \mathbf{A}_{i} x_{i}$, with the right-hand side fixed by the active bound constraints, has a unique solution. It follows that the columns $\mathbf{A}_{B(1)},\dots,\mathbf{A}_{B(m)}$ are linearly independent (if fewer than $m$ variables lie strictly between their bounds, the list of indices can be completed using $\operatorname{rank}(A)=m$), establishing conditions (1) and (2).
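The selection procedure described above can be made concrete on a small invented instance (the names $A$, $b$, $u$ below are illustrative assumptions, not from the book): enumerate every basis and every $0$-or-$u_i$ assignment of the nonbasic variables, then keep the candidates whose basic variables also respect the bounds, i.e. the basic feasible solutions.

```python
# Enumerating basic *feasible* solutions for a toy instance of
# A x = b, 0 <= x <= u: fix nonbasic variables at 0 or u_i, solve for
# the basic variables, and keep candidates that respect all bounds.
import itertools
import numpy as np

A = np.array([[1.0, 1.0, 1.0]])        # 1 x 3, rank 1 (m = 1, n = 3)
b = np.array([1.5])
u = np.array([1.0, 1.0, 1.0])

m, n = A.shape
bfs = set()
for basis in itertools.combinations(range(n), m):
    B = A[:, list(basis)]
    if np.linalg.matrix_rank(B) < m:   # skip dependent column choices
        continue
    nonbasic = [i for i in range(n) if i not in basis]
    for vals in itertools.product(*[(0.0, u[i]) for i in nonbasic]):
        x = np.zeros(n)
        x[nonbasic] = vals
        x[list(basis)] = np.linalg.solve(B, b - A[:, nonbasic] @ x[nonbasic])
        if np.all(x >= -1e-9) and np.all(x <= u + 1e-9):
            bfs.add(tuple(np.round(x, 9)))

print(sorted(bfs))   # vertices of the hexagon {x in [0,1]^3 : sum x = 1.5}
```

For this instance the feasible set is a regular hexagon, so the enumeration returns its six vertices; each one has its nonbasic variables at $0$ or $1$ and the remaining variable at $0.5$.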