$$\Delta(v_{\sigma(1)},\dots,v_{\sigma(n)})=\varepsilon(\sigma)\cdot\Delta(v_1,\dots,v_n)$$ Here is the alternating property of the determinant function, where $\sigma(i)$ is the image of the $i$-th position under the permutation $\sigma$ and $\varepsilon(\sigma)$ is the sign of the permutation.
Here is my idea: suppose we have $\Delta(v_{\sigma(1)},v_{\sigma(2)})$. There are only two variants: $\Delta(v_1,v_2)$ (if no transposition occurred) or $\Delta(v_2,v_1)$ (if a transposition occurred).
We also know that $\Delta(v_1,v_2) = -\Delta(v_2,v_1)$.
$\varepsilon(\sigma)$ is $-1$ when the number of elementary transpositions is odd, and $1$ otherwise. If no transposition occurs, then $\varepsilon(\sigma)$ is $1$ (as in the first variant, where we get $\Delta(v_1,v_2)$). In the other situation ($\Delta(v_2,v_1)$) the sign is $-1$.
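The two-argument case can be checked numerically. Here is a minimal sketch (the function `delta` and the test vectors are my own illustration; it computes the determinant of the matrix whose columns are the given vectors via the Leibniz formula, as a stand-in for the abstract $\Delta$):

```python
from itertools import permutations
import math

def delta(*cols):
    """Determinant of the matrix whose columns are the given vectors,
    computed with the Leibniz formula."""
    n = len(cols)
    M = [[cols[j][i] for j in range(n)] for i in range(n)]  # columns -> matrix
    total = 0
    for p in permutations(range(n)):
        # sign via inversion count
        s = 1
        for i in range(n):
            for j in range(i + 1, n):
                if p[i] > p[j]:
                    s = -s
        total += s * math.prod(M[i][p[i]] for i in range(n))
    return total

v1, v2 = (1, 3), (2, 5)
print(delta(v1, v2), delta(v2, v1))  # -1 1: swapping the arguments flips the sign
```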
So please give me a hint: how can I formalize and generalize my thoughts?
Edited:
My idea (2): we know that $$\Delta(v_1,\dots,v_i,\dots,v_j,\dots,v_n) = -\Delta(v_1,\dots,v_j,\dots,v_i,\dots,v_n) \tag{1}$$
The arguments on the left-hand side can be restored to their natural order by an even or an odd number of adjacent transpositions. If the number is even, then by (1) we get the term with a positive sign; if it is odd, we get it with a negative sign. So in order to make $$\Delta(v_{\sigma(1)},\dots,v_{\sigma(n)})=\Delta(v_1,\dots,v_n)$$ true, we need to multiply the right-hand side by the sign, which is by definition $(-1)$ raised to the number of adjacent transpositions.
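The "parity of the number of adjacent transpositions" definition can be made concrete with a bubble sort: sorting the permuted indices back to natural order uses only adjacent swaps, and the parity of the swap count gives the sign. A sketch (the helper name is mine):

```python
def sign_by_adjacent_swaps(perm):
    """Sort perm with adjacent swaps (bubble sort) and return (-1)**(#swaps)."""
    p = list(perm)
    swaps = 0
    for _ in range(len(p)):
        for j in range(len(p) - 1):
            if p[j] > p[j + 1]:
                p[j], p[j + 1] = p[j + 1], p[j]
                swaps += 1
    return (-1) ** swaps

print(sign_by_adjacent_swaps((0, 1)))     # +1: identity, zero swaps
print(sign_by_adjacent_swaps((1, 0)))     # -1: one adjacent swap
print(sign_by_adjacent_swaps((2, 0, 1)))  # +1: two adjacent swaps
```

Any decomposition into adjacent transpositions of a given permutation has the same parity, which is why this sign is well defined.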
Consider that permuting the columns of a matrix amounts to right-multiplying the matrix by a so-called permutation matrix that "encodes" $\sigma$. Here is an example for a $3 \times 3$ matrix, where $\sigma$ is the transposition of columns 2 and 3, column 1 being unchanged:
$$\tag{1}\begin{pmatrix}a&d&g\\b&e&h\\c&f&i\end{pmatrix}\underbrace{\begin{pmatrix}1&0&0\\0&0&1\\0&1&0\end{pmatrix}}_{P_{\sigma}}=\begin{pmatrix}a&g&d\\b&h&e\\c&i&f\end{pmatrix}$$
Then it suffices to equate the determinants of LHS and RHS of (1) to obtain, in this case:
$$\det(V_{1},V_{2},V_{3}) \ \underbrace{\det\begin{pmatrix}1&0&0\\0&0&1\\0&1&0\end{pmatrix}}_{=-1=\operatorname{sign}(\sigma)} \ \ = \ \ \det(V_{\sigma(1)},V_{\sigma(2)},V_{\sigma(3)})$$
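This identity is easy to verify numerically. A sketch in plain Python (no libraries; `det` is a naive Leibniz-formula determinant, and `perm_matrix` builds $P_\sigma$ with a 1 in row $i$, column $\sigma(i)$, as in the example above — the helper names and the test matrix are mine):

```python
from itertools import permutations
import math

def det(M):
    """Determinant via the Leibniz formula."""
    n = len(M)
    total = 0
    for p in permutations(range(n)):
        s = 1
        for i in range(n):
            for j in range(i + 1, n):
                if p[i] > p[j]:
                    s = -s
        total += s * math.prod(M[i][p[i]] for i in range(n))
    return total

def perm_matrix(sigma):
    """P_sigma with a 1 in row i, column sigma[i] (0-indexed)."""
    n = len(sigma)
    P = [[0] * n for _ in range(n)]
    for i, j in enumerate(sigma):
        P[i][j] = 1
    return P

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[1, 2, 3], [4, 5, 6], [7, 8, 10]]
sigma = (0, 2, 1)  # the transposition of columns 2 and 3 from the example
P = perm_matrix(sigma)
print(det(matmul(A, P)) == det(P) * det(A))  # True, with det(P) == -1
```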
Edit: To make the picture complete, let me recall that a $3 \times 3$ determinant can be given the expression $\sum_{\sigma \in S_3} sign(\sigma)a_{1,\sigma(1)}a_{2,\sigma(2)}a_{3,\sigma(3)}$, which means a sum of 6 terms ($3!=6$).
For a permutation matrix, this sum is in fact reduced to a single term, which is $sign(\sigma) \cdot 1 \times 1 \times 1 = sign(\sigma)$. This is why I said that a permutation matrix fully encodes a permutation.
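One can watch this collapse happen by enumerating the Leibniz terms for the permutation matrix of the example: every term contains a zero factor except the one whose $\sigma$ picks out exactly the 1-entries. A sketch (variable names are mine):

```python
from itertools import permutations
import math

P = [[1, 0, 0], [0, 0, 1], [0, 1, 0]]  # P_sigma from the example above
nonzero_terms = []
for p in permutations(range(3)):
    # sign via inversion count
    s = 1
    for i in range(3):
        for j in range(i + 1, 3):
            if p[i] > p[j]:
                s = -s
    term = s * math.prod(P[i][p[i]] for i in range(3))
    if term != 0:
        nonzero_terms.append(term)
print(nonzero_terms)  # [-1]: the single surviving term is sign(sigma)
```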
If we now perform two successive permutations, it is equivalent to right-multiplying the initial matrix first by $P_{\sigma_1}$, then by $P_{\sigma_2}$; the product $P_{\sigma_1} P_{\sigma_2}$ is easily proven to be $P_{\sigma_2 \circ \sigma_1}$.
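Under the row-$i$/column-$\sigma(i)$ convention of the example above, this composition rule can be checked directly (a sketch with my own helper names; note the reversal of order, $P_{\sigma_1} P_{\sigma_2} = P_{\sigma_2 \circ \sigma_1}$):

```python
def perm_matrix(sigma):
    """P_sigma with a 1 in row i, column sigma[i] (0-indexed)."""
    n = len(sigma)
    P = [[0] * n for _ in range(n)]
    for i, j in enumerate(sigma):
        P[i][j] = 1
    return P

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def compose(s2, s1):
    """(s2 o s1)(i) = s2(s1(i))."""
    return tuple(s2[s1[i]] for i in range(len(s1)))

s1 = (1, 2, 0)  # a 3-cycle
s2 = (0, 2, 1)  # swap the last two positions
print(matmul(perm_matrix(s1), perm_matrix(s2)) == perm_matrix(compose(s2, s1)))  # True
```

In particular, taking determinants of $P_{\sigma_1} P_{\sigma_2} = P_{\sigma_2 \circ \sigma_1}$ recovers the multiplicativity of the sign.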