I've been trying to solve this linear algebra problem:
You are given $n^2 > 1$ pairwise different real numbers. Show that it is always possible to construct from them a basis of $\mathbb{R}^n$.
The problem seems intuitive enough, but I couldn't come up with a solution. I tried using the Leibniz formula for determinants, but I can't argue why it should always be possible to arrange the numbers so that $\det \ne 0$.
I also thought about first ordering the $n^2$ numbers and then filling up an $n \times n$ matrix in a specific pattern, but I couldn't close that argument either.
Anyway, any help in the right direction would be appreciated :)!
We may prove the statement by mathematical induction. The base case $n=2$ is easy and we shall omit its proof. Suppose $n>2$. We call the target matrix $A$ and we partition it in the following way: \begin{align*} A=\left[ \begin{array}{cccc} \begin{matrix}|\\ |\\ \mathbf v_1\\ | \\ |\end{matrix} &\begin{matrix}|\\ |\\ \mathbf v_2\\ | \\ |\end{matrix} &\cdots &\begin{matrix}|\\ |\\ \mathbf v_n\\ | \\ |\end{matrix}\\ a_{n1}&a_{n2}&\cdots&a_{nn} \end{array} \right] \end{align*} where each $\mathbf v_j=(a_{1j},a_{2j},\ldots,a_{n-1,j})^\top$ is an $(n-1)$-dimensional vector.
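(Incidentally, the omitted base case can be sanity-checked by brute force: for four pairwise distinct reals, some arrangement into a $2\times 2$ matrix has nonzero determinant. A quick sketch in Python; the helper name and the sampled integer range are my own choices, not part of the argument.)

```python
import itertools
import random

def has_nonsingular_arrangement(nums):
    # Some ordering of the four distinct numbers as [[p, q], [r, s]]
    # should have determinant p*s - q*r != 0.
    return any(p * s - q * r != 0
               for p, q, r, s in itertools.permutations(nums))

random.seed(0)
for _ in range(1000):
    quad = random.sample(range(-50, 50), 4)  # four pairwise distinct integers
    assert has_nonsingular_arrangement(quad)
print("base case n = 2 holds on all sampled quadruples")
```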
By the induction hypothesis, we may assume that the entries of the submatrix $M_{n1}=[\mathbf v_2,\ldots,\mathbf v_n]$ have been chosen so that $M_{n1}$ is nonsingular. In particular, the $(n-1)\times(n-2)$ submatrix $[\mathbf v_{\color{red}{3}},\ldots,\mathbf v_n]$ has full column rank, so by deleting some row $\color{red}{k}$ of it, one can obtain an $(n-2)\times(n-2)$ nonsingular submatrix.
What does that mean? Let $M_{n2}=[\mathbf v_1, \mathbf v_3,\ldots,\mathbf v_n]$. Expanding $\det M_{n2}$ along its first column, the coefficient of $a_{\color{red}{k}1}$ is, up to sign, the determinant of that nonsingular $(n-2)\times(n-2)$ submatrix. Hence $\det M_{n2}$ is a nonconstant affine function of $a_{\color{red}{k}1}$, and at most one value of $a_{\color{red}{k}1}$ can make $\det M_{n2}=-\det M_{n1}$. Since more than one of the remaining numbers is available, by varying the choice of $a_{\color{red}{k}1}$ we can always pick $\mathbf v_1$ so that $\det M_{n2}\ne-\det M_{n1}$.
It remains to pick the entries of the last row of $A$ from the $n$ numbers left. By Laplace expansion, $(-1)^{n+1}\det A$ is equal to $$ a_{n1}\det M_{n1} - a_{n2}\det M_{n2} + \ldots\tag{1} $$ where the ellipses denote other summands that do not involve $a_{n1}$ or $a_{n2}$. If we swap the choices of $a_{n1}$ and $a_{n2}$, the signed determinant becomes $$ a_{n2}\det M_{n1} - a_{n1}\det M_{n2} + \ldots\tag{2} $$ instead. Since the difference between $(1)$ and $(2)$ is $(a_{n1}-a_{n2})(\det M_{n1}+\det M_{n2})$, which is nonzero because $a_{n1}\ne a_{n2}$ and $\det M_{n2}\ne-\det M_{n1}$, at least one of the two sets of choices makes $\det A$ nonzero.
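To make the swap computation concrete, here is a small numerical check of the identity behind $(1)$ and $(2)$ for $n=3$ (where the sign factor $(-1)^{n+1}$ is $+1$). The particular entries are arbitrary choices of mine, just for illustration:

```python
def det2(m):
    (a, b), (c, d) = m
    return a * d - b * c

def det3(m):
    # Laplace expansion along the last row:
    # det A = a31*M31 - a32*M32 + a33*M33.
    return (m[2][0] * det2([[m[0][1], m[0][2]], [m[1][1], m[1][2]]])
            - m[2][1] * det2([[m[0][0], m[0][2]], [m[1][0], m[1][2]]])
            + m[2][2] * det2([[m[0][0], m[0][1]], [m[1][0], m[1][1]]]))

# Upper block [v1 v2 v3] and the three numbers left for the last row.
top = [[1, 2, 3],
       [4, 5, 7]]
a31, a32, a33 = 6, 8, 9

M31 = det2([[2, 3], [5, 7]])  # minor deleting row 3, column 1
M32 = det2([[1, 3], [4, 7]])  # minor deleting row 3, column 2

A = top + [[a31, a32, a33]]
A_swapped = top + [[a32, a31, a33]]

# The difference of the two expansions collapses to (a31 - a32)(M31 + M32):
# the summands not involving a31 or a32 cancel.
lhs = det3(A) - det3(A_swapped)
rhs = (a31 - a32) * (M31 + M32)
print(lhs, rhs, det3(A))  # prints: 12 12 7
```

Here $\det M_{n1}+\det M_{n2} = -6 \ne 0$, so the difference of the two expansions is nonzero and indeed $\det A = 7 \ne 0$.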