Let $A = \begin{bmatrix}a_1 | \cdots|a_n \end{bmatrix} \in\mathbb{R}^{n\times n}$. Is it true that
$$\sigma_n(A) = \min_{1\leq j\leq n} \min_{\alpha\in\mathbb{R}^{n}}\|a_j - \sum_{i=1,i\neq j}^{n}\alpha_i a_i\|_2?$$
If it is not true, then what would be a counterexample?
It is true in some cases, e.g. when $A$ is singular or when $A$ is a scalar multiple of an orthogonal matrix, but in general it is false. The quantity $\min\limits_{\mathbf c\in\mathbb R^n}\left\|a_j-\sum_{i\ne j}c_ia_i\right\|_2$ is the distance from $a_j$ to the span of the other columns of $A$, i.e. the norm of the component of $a_j$ orthogonal to the other columns. Now consider $$ A=\pmatrix{2&1\\ 1&1}. $$ The component of $a_1$ in $\{a_2\}^\perp$ is $a_1-\frac{\langle a_1,a_2\rangle}{\|a_2\|_2^2}a_2=\pmatrix{2\\ 1}-\frac32\pmatrix{1\\ 1}=\pmatrix{\frac12\\ -\frac12}$. Its norm is $\frac{1}{\sqrt{2}}$.
The component of $a_2$ in $\{a_1\}^\perp$ is $a_2-\frac{\langle a_1,a_2\rangle}{\|a_1\|_2^2}a_1=\pmatrix{1\\ 1}-\frac35\pmatrix{2\\ 1}=\pmatrix{-\frac15\\ \frac25}$. Its norm is $\frac{1}{\sqrt{5}}$.
However, $\sigma_2(A)=\frac{3-\sqrt{5}}{2}\approx0.382$, whereas $\min\left(\frac{1}{\sqrt{2}},\frac{1}{\sqrt{5}}\right)=\frac{1}{\sqrt{5}}\approx0.447$, so the two quantities differ (the inequality $\sigma_2(A)\le\frac{1}{\sqrt{5}}$ is strict here).
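As a numerical sanity check of the counterexample, here is a pure-Python sketch (no linear-algebra library; the smallest singular value is computed from the eigenvalues of $A^TA$, whose trace is $7$ and determinant $1$):

```python
import math

# Columns of the counterexample A = [[2, 1], [1, 1]].
a1 = (2.0, 1.0)
a2 = (1.0, 1.0)

def dot(u, v):
    return u[0] * v[0] + u[1] * v[1]

def residual(u, v):
    """Norm of the component of u orthogonal to span{v}."""
    c = dot(u, v) / dot(v, v)
    w = (u[0] - c * v[0], u[1] - c * v[1])
    return math.sqrt(dot(w, w))

r1 = residual(a1, a2)  # distance of a1 to span{a2}: 1/sqrt(2)
r2 = residual(a2, a1)  # distance of a2 to span{a1}: 1/sqrt(5)

# Smallest singular value: sqrt of the smaller eigenvalue of A^T A.
# A^T A = [[5, 3], [3, 2]] has trace 7 and determinant 1.
tr, det = 7.0, 1.0
lam_min = (tr - math.sqrt(tr * tr - 4.0 * det)) / 2.0
sigma2 = math.sqrt(lam_min)  # equals (3 - sqrt(5))/2

print(r1, r2, sigma2)
```

Running this confirms $\sigma_2(A)\approx0.382$ is strictly smaller than both residual norms.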
From another perspective, consider a generic square matrix $A$ whose singular values are all distinct and positive. Let $A=USV^T$ be a singular value decomposition. Then $$ \min_{1\le j\le n}\min\limits_{\mathbf c\in\mathbb R^n}\left\|a_j-\sum_{i\ne j}c_ia_i\right\|_2 =\min\limits_{\substack{\mathbf x\in\mathbb R^n\\ x_j=1\text{ for some }j}}\left\|Ax\right\|_2 =\min\limits_{\substack{\mathbf x\in\mathbb R^n\\ x_j=1\text{ for some }j}}\left\|SV^Tx\right\|_2 \ge\sigma_n(A), $$ where the last inequality holds because the constraint $x_j=1$ forces $\|x\|_2\ge1$, so $\|SV^Tx\|_2\ge\sigma_n(A)\|x\|_2\ge\sigma_n(A)$. Since the singular values of $A$ are distinct and positive, equality can hold only if the minimiser $x$ satisfies $\|x\|_2=1$ and $V^Tx=\pm e_n$. Hence $x=\pm Ve_n$ is, up to a sign, the last column of $V$. But then the constraint that $x_j=1$ for some $j$, together with $\|x\|_2=1$, implies that $x=e_j$; that is, the last column of $V$ is (up to a sign) a vector in the standard basis. Obviously this condition is not always satisfied.
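To see the equality case concretely: when $V=I$ (e.g. $A$ diagonal), the last column of $V$ is $e_n$ and the two quantities agree. A minimal pure-Python check with the illustrative choice $A=\operatorname{diag}(3,2)$:

```python
import math

def residual(u, v):
    """Norm of the component of u orthogonal to span{v} (2-vectors)."""
    c = (u[0] * v[0] + u[1] * v[1]) / (v[0] * v[0] + v[1] * v[1])
    w = (u[0] - c * v[0], u[1] - c * v[1])
    return math.sqrt(w[0] * w[0] + w[1] * w[1])

# A = diag(3, 2): its SVD has V = I, so the last column of V is e_2.
a1, a2 = (3.0, 0.0), (0.0, 2.0)

r1 = residual(a1, a2)  # columns are orthogonal, so r1 = ||a1|| = 3
r2 = residual(a2, a1)  # r2 = ||a2|| = 2
sigma2 = 2.0           # smallest singular value of diag(3, 2)

print(min(r1, r2), sigma2)
```

Here the minimum residual norm coincides with $\sigma_2(A)=2$, as the argument predicts.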