Suppose we have $n-1$ vectors $v_1, v_2, \dots, v_{n-1}$ in $n$-dimensional space, and we want to find some non-zero vector $w$ that is orthogonal to all of the vectors $v_i$.
If these $n-1$ vectors are linearly independent, there is a simple formula for such a vector, namely
$ w = \det \begin{bmatrix}
e_1 & e_2 & \cdots & e_n \\
& v_1^T \\
& v_2^T \\
& \vdots \\
& v_{n-1}^T \end{bmatrix} $
where $e_1, e_2, \dots, e_n$ are the standard basis vectors. (This is the same formal determinant used for the 3-D cross product: in rows $2$ through $n$ the entries are the scalar components of the vectors $v_i$, while the entries of the first row are vectors, so expanding the determinant along the first row yields a vector.)
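Expanding the formal determinant along its first row gives each component of $w$ as a signed $(n-1)\times(n-1)$ minor. A minimal NumPy sketch of that expansion (my own illustration, not from the question):

```python
import numpy as np

def orthogonal_vector(vs):
    """Generalized cross product: given n-1 vectors in R^n (as rows of vs),
    return w with w_j = (-1)^j * det(vs with column j removed), i.e. the
    cofactor expansion of the formal determinant along its first row."""
    vs = np.asarray(vs, dtype=float)
    k, n = vs.shape
    assert k == n - 1, "need exactly n-1 vectors in R^n"
    w = np.empty(n)
    for j in range(n):
        minor = np.delete(vs, j, axis=1)      # drop column j
        w[j] = (-1) ** j * np.linalg.det(minor)
    return w

w = orthogonal_vector([[1, 0, 0], [0, 1, 0]])
# orthogonal to both inputs; here it equals the cross product (0, 0, 1)
```

Note that when the rows are linearly dependent every minor vanishes, so this formula returns the zero vector, which is exactly the failure mode the question asks about.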
Question:
how can we obtain a non-zero vector $w$ orthogonal to all the vectors $v_i$ in the case when $v_1, v_2, \dots, v_{n-1}$ are linearly dependent, or when we simply don't know whether they are linearly independent? Can this be achieved with a single formula of some other kind?
If by a formula you mean an expression that is a continuous function of its arguments, then the answer is that this is impossible, for reasons similar to those I explained in this answer.
Suppose your $n-1$ vectors span a space of dimension $d<n-1$; then the space $S$ of vectors orthogonal to all of them has dimension $n-d>1$. Now take any subspace $L$ of dimension $1$ in $S$. You can easily make that line the only set of possibilities by a very small adjustment to your vectors: add to some of your vectors small multiples of vectors that lie in $S$ but are orthogonal to $L$. By continuity, the vector of $S$ that your formula chooses must come arbitrarily close to every such line $L$, and the zero vector is the only vector satisfying that requirement.
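One can still compute *some* orthogonal vector by a discontinuous procedure, for instance taking the last right-singular vector from an SVD; the sketch below (using NumPy, my choice, not something the answer prescribes) also illustrates the discontinuity forced by the argument above: tiny perturbations of dependent inputs make the output jump between nearly orthogonal directions.

```python
import numpy as np

def some_orthogonal(vs):
    """Return a unit vector orthogonal to the rows of vs, via SVD.
    This works even when the rows are dependent, but the choice is not a
    continuous function of vs, consistent with the impossibility argument."""
    vs = np.asarray(vs, dtype=float)
    _, s, vt = np.linalg.svd(vs)          # vt is n x n (full matrices)
    return vt[-1]                          # direction of least row-space overlap

# Two dependent vectors in R^3: S is a whole plane of candidate directions.
w0 = some_orthogonal([[1.0, 0.0, 0.0], [2.0, 0.0, 0.0]])

# Perturbing the second vector slightly in two different directions pins
# down two nearly orthogonal lines L, and the output jumps accordingly.
w1 = some_orthogonal([[1, 0, 0], [2, 1e-9, 0]])   # forces w close to +/- e_3
w2 = some_orthogonal([[1, 0, 0], [2, 0, 1e-9]])   # forces w close to +/- e_2
```

This does not contradict the answer: the map from the input vectors to `vt[-1]` is discontinuous exactly where the inputs become dependent, which is what the argument says any such procedure must be.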