Consider a set of linearly independent vectors $v_1, \cdots, v_p \in \mathbb{R}^n$.
Let $u_1, \cdots, u_p \in \mathbb{R}^n$ be the orthogonal vectors obtained from the Gram-Schmidt process (without normalization):
$$u_1 = v_1,$$ $$ u_2 = v_2 - \frac{\langle v_2,u_1 \rangle}{\langle u_1,u_1 \rangle}u_1,$$ $$\vdots $$ $$ u_p = v_p - \frac{\langle v_p,u_1 \rangle}{\langle u_1,u_1 \rangle}u_1 - \cdots - \frac{\langle v_p,u_{p-1} \rangle}{\langle u_{p-1},u_{p-1} \rangle}u_{p-1}.$$
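For concreteness, here is a minimal sketch of the process above (assuming NumPy; the function name `gram_schmidt` is just illustrative). It returns the orthogonal vectors $u_i$ together with the scalars $\langle v_i, u_j\rangle / \langle u_j, u_j\rangle$ for $j < i$:

```python
import numpy as np

def gram_schmidt(vs):
    """Unnormalized Gram-Schmidt on a list of linearly independent vectors.

    Returns (us, coeffs), where coeffs[(i, j)] = <v_i, u_j> / <u_j, u_j>
    for j < i, matching the scalars in the formulas above.
    """
    us, coeffs = [], {}
    for i, v in enumerate(vs):
        u = v.astype(float)          # start from v_i ...
        for j, uj in enumerate(us):
            c = np.dot(v, uj) / np.dot(uj, uj)
            coeffs[(i, j)] = c
            u = u - c * uj           # ... and subtract the projections onto u_1, ..., u_{i-1}
        us.append(u)
    return us, coeffs
```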
I'm calling the scalars $$\frac{\langle v_i,u_{j} \rangle}{\langle u_{j},u_{j} \rangle},$$
with $i > j$, the Gram-Schmidt coefficients. Clearly, these coefficients don't always have absolute value $\le 1$. Consider
$$ v_1 = \begin{bmatrix} 1 \\ 0\end{bmatrix}, \quad v_2 = \begin{bmatrix} 100 \\ 1\end{bmatrix},$$
in which $$\frac{\langle v_2,u_{1} \rangle}{\langle u_{1},u_{1} \rangle} = 100.$$
However, switch the order to
$$ v_1 = \begin{bmatrix} 100 \\ 1\end{bmatrix}, \quad v_2 = \begin{bmatrix} 1 \\ 0\end{bmatrix},$$
and now $$\frac{\langle v_2,u_{1} \rangle}{\langle u_{1},u_{1} \rangle} \approx \frac{1}{100}.$$
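Numerically (a quick sketch, assuming NumPy; `first_coefficient` is just an illustrative name), the two orderings give:

```python
import numpy as np

def first_coefficient(v1, v2):
    """The single Gram-Schmidt coefficient <v2, u1>/<u1, u1>, where u1 = v1."""
    return np.dot(v2, v1) / np.dot(v1, v1)

print(first_coefficient(np.array([1, 0]), np.array([100, 1])))   # 100.0
print(first_coefficient(np.array([100, 1]), np.array([1, 0])))   # 100/10001, roughly 1/100
```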
My question is: is it possible to change the order of $v_1,\cdots,v_p$ such that
$$\left\vert \frac{\langle v_i,u_{j} \rangle}{\langle u_{j},u_{j} \rangle} \right\vert \le 1$$
for all $j < i \le p$?
Yes, this is always possible, though we will need a couple of rounds of induction.
First, consider linearly independent vectors $v_0, v_1, \ldots, v_m$, and let $$v'_i = v_i - \frac{\langle v_i, v_0\rangle}{\langle v_0, v_0\rangle}v_0$$ for $i = 1, \ldots, m$. Note that $v'_1, \ldots, v'_m$ are linearly independent too, as $$\alpha_1 v'_1 + \ldots + \alpha_m v'_m = 0 \implies \alpha_1 v_1 + \ldots + \alpha_m v_m - \left(\sum_{i=1}^m \alpha_i \frac{\langle v_i, v_0\rangle}{\langle v_0, v_0\rangle}\right)v_0 = 0,$$ and since $v_0, v_1, \ldots, v_m$ are linearly independent, the coefficients, including $\alpha_1, \ldots, \alpha_m$, must all be $0$.
Apply Gram-Schmidt to $v_0, \ldots, v_m$ and $v'_1, \ldots, v'_m$ respectively to get $u_0, \ldots, u_m$ and $u'_1, \ldots, u'_m$. I claim that $u'_i = u_i$ for $i = 1, \ldots, m$, which we prove by strong induction on $i$. If $i = 1$, then $$u_1 = v_1 - \frac{\langle v_1, u_0\rangle}{\langle u_0, u_0\rangle} u_0 = v_1 - \frac{\langle v_1, v_0\rangle}{\langle v_0, v_0\rangle} v_0 = v'_1 = u'_1.$$ Suppose $1 \le k < m$ and $u'_i = u_i$ holds for $i = 1, \ldots, k$. Then \begin{align} u'_{k+1} &= v'_{k+1} - \sum_{i=1}^k \frac{\langle v'_{k+1}, u'_i\rangle}{\langle u'_i, u'_i\rangle} u'_i \\ &= v_{k+1} - \frac{\langle v_{k+1}, v_0\rangle}{\langle v_0, v_0\rangle}v_0 - \sum_{i=1}^k \frac{\langle v_{k+1} - \frac{\langle v_{k+1}, v_0\rangle}{\langle v_0, v_0\rangle}v_0, u_i\rangle}{\langle u_i, u_i\rangle} u_i \\ &= v_{k+1} - \sum_{i=1}^k \frac{\langle v_{k+1}, u_i\rangle}{\langle u_i, u_i\rangle} u_i - \frac{\langle v_{k+1}, v_0\rangle}{\langle v_0, v_0\rangle}\left(v_0 - \sum_{i=1}^k \frac{\langle v_0, u_i\rangle}{\langle u_i, u_i\rangle} u_i\right). \end{align} Now $\langle v_0, u_i\rangle = \langle u_0, u_i\rangle = 0$ for each $i = 1, \ldots, k$ (because Gram-Schmidt works as advertised), so the bracketed expression is just $v_0 = u_0$, and the last term is exactly the $i = 0$ term of the Gram-Schmidt formula. Hence $$u'_{k+1} = v_{k+1} - \sum_{i=1}^k \frac{\langle v_{k+1}, u_i\rangle}{\langle u_i, u_i\rangle} u_i - \frac{\langle v_{k+1}, u_0\rangle}{\langle u_0, u_0\rangle} u_0 = v_{k+1} - \sum_{i=0}^k \frac{\langle v_{k+1}, u_i\rangle}{\langle u_i, u_i\rangle} u_i = u_{k+1},$$ as required. Thus, by induction, $u'_i = u_i$ for all $i = 1, \ldots, m$.
Consider, in particular, the Gram-Schmidt coefficients for these two lists of vectors. For $1 \le i < j \le m$, we have $$\frac{\langle v'_j, u'_i \rangle}{\langle u'_i, u'_i\rangle} = \frac{\langle v_j - \frac{\langle v_j, v_0\rangle}{\langle v_0, v_0\rangle}v_0, u_i \rangle}{\langle u_i, u_i\rangle} = \frac{\langle v_j, u_i \rangle}{\langle u_i, u_i\rangle} - \frac{\langle v_j, v_0\rangle}{\langle v_0, v_0\rangle} \cdot \frac{\langle v_0, u_i \rangle}{\langle u_i, u_i\rangle} = \frac{\langle v_j, u_i \rangle}{\langle u_i, u_i\rangle},$$ since once again, $v_0 = u_0 \perp u_i$.
Of course, this says nothing about the Gram-Schmidt coefficients involving $u_0$, namely $$\alpha_i := \frac{\langle v_i, u_0\rangle}{\langle u_0, u_0\rangle} = \frac{\langle v_i, v_0\rangle}{\langle v_0, v_0\rangle}$$ for $1 \le i \le m$. However, all the remaining Gram-Schmidt coefficients stay the same when we pass from $v_0, \ldots, v_m$ to $v'_1, \ldots, v'_m$.
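A quick numerical spot-check of this claim (a sketch, assuming NumPy; `gs_coeffs` is just an illustrative name): project $v_1, \ldots, v_m$ along $v_0$, run Gram-Schmidt on both lists, and compare the coefficients.

```python
import numpy as np

def gs_coeffs(vs):
    """The Gram-Schmidt coefficients <v_i, u_j>/<u_j, u_j> for j < i."""
    us, coeffs = [], {}
    for i, v in enumerate(vs):
        u = v.astype(float)
        for j, uj in enumerate(us):
            c = np.dot(v, uj) / np.dot(uj, uj)
            coeffs[(i, j)] = c
            u = u - c * uj
        us.append(u)
    return coeffs

rng = np.random.default_rng(0)
vs = rng.standard_normal((4, 5))          # v_0, ..., v_3 in R^5
v0 = vs[0]
primed = [v - np.dot(v, v0) / np.dot(v0, v0) * v0 for v in vs[1:]]  # v'_1, ..., v'_3

orig = gs_coeffs(list(vs))                # coefficients of v_0, ..., v_3
proj = gs_coeffs(primed)                  # coefficients of v'_1, ..., v'_3
# proj[(i, j)] corresponds to orig[(i + 1, j + 1)]: the coefficients match
assert all(np.isclose(proj[(i, j)], orig[(i + 1, j + 1)]) for (i, j) in proj)
```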
So, using induction again (this time on the number of vectors), it suffices to show that we can always pick a single vector from the list to play the role of $v_0$ so that $|\alpha_i| \le 1$ for all $i$. By the induction hypothesis, the projected vectors $v'_1, \ldots, v'_m$ can then be rearranged so that all of their Gram-Schmidt coefficients lie in $[-1, 1]$, and these coefficients match those of the correspondingly rearranged $v_1, \ldots, v_m$.
Which $v_0$ do we choose? No need to complicate things: just choose $v_0$ to be the vector in the list with the greatest norm. Then, by Cauchy-Schwarz, $$|\alpha_i| = \left|\frac{\langle v_i, v_0\rangle}{\langle v_0, v_0\rangle}\right| = \frac{|\langle v_i, v_0\rangle|}{\|v_0\|^2} \le \frac{\|v_i\|\|v_0\|}{\|v_0\|^2} = \frac{\|v_i\|}{\|v_0\|} \le 1.$$ And we are done!
So, the way to find this ordering is: start with the vector of greatest norm, project the remaining vectors onto its orthogonal complement (which gives our $v'_1, \ldots, v'_m$), then rinse and repeat until only a single vector is left.
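The whole procedure fits in a few lines (a NumPy sketch; `reorder_for_small_coefficients` and `max_abs_coefficient` are hypothetical names):

```python
import numpy as np

def reorder_for_small_coefficients(vs):
    """Return indices ordering vs so that every Gram-Schmidt coefficient
    of the reordered list has absolute value at most 1."""
    vs = [v.astype(float) for v in vs]
    idx = list(range(len(vs)))
    order = []
    while vs:
        # pick the remaining vector of greatest norm ...
        k = max(range(len(vs)), key=lambda i: np.dot(vs[i], vs[i]))
        v0 = vs.pop(k)
        order.append(idx.pop(k))
        # ... and project the rest onto its orthogonal complement
        vs = [v - np.dot(v, v0) / np.dot(v0, v0) * v0 for v in vs]
    return order

def max_abs_coefficient(vs):
    """Largest |<v_i, u_j>/<u_j, u_j>| over a Gram-Schmidt run on vs."""
    us, worst = [], 0.0
    for v in vs:
        u = v.astype(float)
        for uj in us:
            c = np.dot(v, uj) / np.dot(uj, uj)
            worst = max(worst, abs(c))
            u = u - c * uj
        us.append(u)
    return worst

vs = [np.array([1.0, 0.0]), np.array([100.0, 1.0])]
order = reorder_for_small_coefficients(vs)       # [1, 0]: the longer vector goes first
print(max_abs_coefficient([vs[i] for i in order]) <= 1)  # True
```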