I am looking at the description of the Simplex algorithm.
Let $\overline{x_0}$ be a non-degenerate basic feasible solution. We suppose that $\overline{x_0}=(x_{10}, x_{20}, \dots, x_{m0},0, \dots,0)$ with $x_{10}, x_{20}, \dots, x_{m0}>0$. Thus the first $m$ columns of $A$, i.e. $P_1, P_2, \dots, P_m$, are linearly independent (so they form a basis of $\mathbb{R}^m$). Thus for each $j \in \{1, \dots, n\}$ it holds that $P_j= x_{1j}P_1+ x_{2j} P_2+ \dots + x_{mj} P_m$ for suitable $x_{1j}, x_{2j}, \dots, x_{mj} \in \mathbb{R} \quad (1)$
We know that $A\overline{x_0}=b \Leftrightarrow P_1 x_{10}+ P_2 x_{20} + \dots + P_m x_{m0}=b \quad (2)$
We fix a $j \in \{ m+1, \dots, n \}$.
We compute $(2)- \theta \cdot (1)$, where $\theta \in \mathbb{R}$:
$$(x_{10} - \theta x_{1j})P_1 + (x_{20}- \theta x_{2j}) P_2+ \dots +(x_{m0}- \theta x_{mj})P_m+ \theta P_j=b \quad (3)$$
$(3) \Rightarrow \overline{x}(\theta)=(x_{10}- \theta x_{1j}, x_{20}- \theta x_{2j}, \dots, x_{m0}- \theta x_{mj},0, \dots,0, \theta, 0, \dots,0)$, with $\theta$ in the $j$-th position, is a solution of the system of constraints ($A \overline{x}(\theta)=b$).
In order to have $\overline{x}(\theta) \in F$ it has to hold
$$ x_{10} - \theta x_{1j}>0, \quad x_{20}- \theta x_{2j}>0, \quad \dots, \quad x_{m0}- \theta x_{mj}>0, \quad \theta>0 \quad (4)$$
We notice that for $\theta>0$ small enough, $(4)$ holds, and so there are feasible solutions with $m+1$ non-zero coordinates. So in order to find a new basic feasible solution we have to make one of the first $m$ coordinates of $\overline{x}(\theta)$ zero by an appropriate choice of $\theta$.
Then, assuming $x_{ij}>0$ for at least one $i$, $\theta$ satisfies $(4)$ whenever $0< \theta< \theta_0$, where $\theta_0=\min_i \left\{ \frac{x_{i0}}{x_{ij}}: x_{ij}>0\right\}$.
We define $\theta=\theta_0$ and consider the corresponding vector $\overline{x}(\theta) \in F$.
Then at least one of the first $m$ components of $\overline{x}(\theta)$ is zero.
Without loss of generality, we suppose that $x_{10}- \theta x_{1j}=0$.
Then $\overline{x}(\theta)$ is a candidate basic feasible solution, and to show that it is indeed such a solution it suffices to show that $P_2, P_3, \dots, P_m, P_j$ are linearly independent.
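The pivot step above can be traced numerically. The following is a minimal sketch using a hypothetical $2\times 4$ matrix $A$ (not from the original text) whose first two columns form the identity, so the representation coefficients $x_{ij}$ of column $P_j$ are just the entries of $A$'s $j$-th column:

```python
import numpy as np

# Hypothetical data: basis columns P_1, P_2 = identity, so the coefficients
# x_ij in P_j = sum_i x_ij * P_i are simply A[:, j].
A = np.array([[1.0, 0.0, 2.0, 1.0],
              [0.0, 1.0, 1.0, 3.0]])
b = np.array([4.0, 6.0])
x0 = np.array([4.0, 6.0, 0.0, 0.0])   # non-degenerate BFS: A @ x0 == b

j = 2                                  # bring in column P_3 (0-based index 2)
x_ij = A[:, j]                         # x_1j = 2, x_2j = 1

# Ratio test: theta_0 = min { x_i0 / x_ij : x_ij > 0 }
pos = x_ij > 0
theta0 = np.min(x0[:2][pos] / x_ij[pos])

# Build x_bar(theta_0): the basic part shrinks, position j receives theta_0
x_bar = np.zeros(4)
x_bar[:2] = x0[:2] - theta0 * x_ij
x_bar[j] = theta0

print(theta0)                      # 2.0
print(x_bar)                       # [0. 4. 2. 0.] -> one basic coordinate hits zero
print(np.allclose(A @ x_bar, b))   # True: x_bar(theta_0) still satisfies A x = b
```

As in the derivation, at $\theta=\theta_0$ exactly one of the old basic coordinates vanishes while feasibility of $A\overline{x}(\theta)=b$ is preserved.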
How can we show that $P_2, P_3, \dots, P_m, P_j$ are linearly independent?
If we suppose that $P_2, P_3, \dots, P_m, P_j$ are linearly dependent then there are $\lambda_2, \lambda_3, \dots, \lambda_m, \lambda_j \in \mathbb{R}$ with $|\lambda_2|+ |\lambda_3|+ \dots+ |\lambda_m|+ |\lambda_j| \neq 0$ such that
$$\lambda_2 P_2+ \lambda_3 P_3+ \dots + \lambda_m P_m+ \lambda_j P_j=0$$
We also know that $A \overline{x}(\theta)=b \Rightarrow P_2 (x_{20}- \theta x_{2j})+P_3 (x_{30}- \theta x_{3j})+ \dots+ P_m (x_{m0}- \theta x_{mj})+P_j \theta=b$
Can we find a contradiction from these two relations? Or do we have to use something else?
"How can we show that $P_2,P_3,\dots,P_m,P_j$ are linearly independent?"
Outline:
From $(3)$ and $(4)$ with $\theta=\theta_0>0$ and $x_{10}- \theta x_{1j}=0$, we get $x_{1j}=x_{10}/\theta_0>0$, so $x_{1j} \ne 0$.
So by $(1)$, $P_j$ has a non-zero $P_1$ component.
Definition 1: Linear Independence: The vectors $v_1,v_2,\dots,v_m$ are linearly independent if and only if the only scalars $\alpha_1,\alpha_2,\dots,\alpha_m$ satisfying $\alpha_1 v_1 + \alpha_2 v_2+ \dots + \alpha_m v_m = 0$ are $\alpha_1 = \alpha_2 = \dots = \alpha_m = 0$.
Theorem 1: Linearly Independent Subset: If $V = \{v_1,v_2,\dots,v_m\}$ is a set of linearly independent vectors then any non-empty subset of $V$ is a linearly independent set of vectors.
We need to show that the only solution to $\alpha_2P_2 + \dots + \alpha_mP_m + \alpha_jP_j = 0$ is $\alpha_2 = \dots = \alpha_m = \alpha_j = 0$.
Substitute the equation $(1)$ for $P_j$:
$\alpha_j x_{1j}P_1 + (\alpha_2 + \alpha_jx_{2j})P_2 + \dots + (\alpha_k + \alpha_jx_{kj})P_k + \dots + (\alpha_m + \alpha_jx_{mj})P_m = 0$
$P_1,P_2,\dots,P_m$ are linearly independent (as noted at the start), so:
$\alpha_j x_{1j} = 0$ and $\alpha_k + \alpha_jx_{kj} = 0$ for $k = 2,\dots,m$ must all hold.
Since $x_{1j} \ne 0$, we get $\alpha_j = 0$, and the equation reduces to
$\alpha_2 P_2 + \dots + \alpha_k P_k + \dots + \alpha_m P_m = 0$
Since $\{P_2,\dots,P_m \}$ is a non-empty subset of $\{P_1,P_2,\dots,P_m \}$, which is linearly independent, by Theorem 1 $\{P_2,\dots,P_m \}$ is a linearly independent set of vectors.
So $\alpha_k = 0$ for $k = 2,\dots,m$.
Since $\alpha_j = 0$ and $\alpha_k = 0$ for $k = 2,\dots,m$ is the only solution, by Definition 1 $\{P_2,P_3,\dots,P_m,P_j\}$ is linearly independent.
"If we suppose that $P_2, P_3, \dots, P_m, P_j$ are linearly dependent $\dots$"
There is no need to argue by contradiction: we have shown directly that if $\{P_1,P_2,\dots,P_m \}$ is linearly independent and $x_{1j} \ne 0$, then $\{P_2, P_3, \dots, P_m, P_j\}$ is linearly independent.
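The conclusion can also be checked numerically via a rank computation. Here is a minimal sketch reusing the same hypothetical basis columns as before ($P_1, P_2$ = identity, entering column $P_j$ with $x_{1j}=2\ne 0$):

```python
import numpy as np

# Hypothetical columns: P_1, P_2 form the identity, P_j = 2*P_1 + 1*P_2,
# so the coefficient x_1j of P_1 in P_j is 2 (non-zero).
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0]])

old_basis = A[:, [0, 1]]   # {P_1, P_2}: linearly independent by assumption
new_basis = A[:, [1, 2]]   # {P_2, P_j}: the claim proved above

print(np.linalg.matrix_rank(old_basis))  # 2
print(np.linalg.matrix_rank(new_basis))  # 2 -> full column rank, hence
                                         #      linearly independent
```

Full column rank of the new basis matrix is exactly the linear-independence condition of Definition 1, so the column swap indeed produces a new basis.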