A standard way to construct the determinant is to first construct $\Lambda^n(V)$ and then observe that an endomorphism $A: V \rightarrow V$ induces the map $v_1 \wedge \ldots \wedge v_n \mapsto A(v_1) \wedge \ldots \wedge A(v_n)$ on $\Lambda^n(V)$; since $\Lambda^n(V)$ is one-dimensional, this map is multiplication by a scalar $D$, which one defines to be the determinant of $A$.
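To make the scalar $D$ concrete: expanding $A(e_1) \wedge \ldots \wedge A(e_n)$ in the basis element $e_1 \wedge \ldots \wedge e_n$, antisymmetry kills every term with a repeated index, leaving exactly the Leibniz permutation sum. Here is a minimal Python sketch of that expansion (the function names are my own, not any library's):

```python
from itertools import permutations
from math import prod

def sign(perm):
    # Parity of a permutation given as a tuple of indices 0..n-1,
    # computed by sorting it with transpositions.
    s, perm = 1, list(perm)
    for i in range(len(perm)):
        while perm[i] != i:
            j = perm[i]
            perm[i], perm[j] = perm[j], perm[i]
            s = -s
    return s

def top_wedge_coefficient(A):
    # Coefficient of e_1 ∧ ... ∧ e_n in A(e_1) ∧ ... ∧ A(e_n),
    # where A is an n×n matrix (list of rows). Antisymmetry reduces
    # the expansion to the Leibniz sum, i.e. det(A).
    n = len(A)
    return sum(sign(p) * prod(A[i][p[i]] for i in range(n))
               for p in permutations(range(n)))
```

For instance, `top_wedge_coefficient([[1, 2], [3, 4]])` returns $1 \cdot 4 - 2 \cdot 3 = -2$, matching the familiar $2 \times 2$ determinant.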
This is all fine, but for a general vector space $V$, what rationale do we have for constructing $\Lambda^p(V)$ in the first place?
For $\mathbb{R}^n$ with an inner product and the induced norm and metric, all linear isometries are orthogonal transformations, and one can show, e.g. using algebraic topology, that $O(n)$ has exactly two connected components, as discussed in this excellent answer. We naturally interpret these as the two classes of orientations.
However, in a metric space $(\mathbb{R}^n, d)$ with $d$ induced by an arbitrary norm, the linear isometry group is in general not $O(n)$, and may have a different number of connected components. At this point the notion of 'orientation' becomes ambiguous and ceases to 'naturally lead us' to constructing $\Lambda^p(V)$.
Are there alternative, deeper, more revealing ways to think about, motivate, and construct $\Lambda^p(V)$ for a general $V$ as a first step toward the determinant, other than the mechanical and rather uninformative effort to construct 'volume with generalized orientation'?
I believe these constructions first appeared in the work of Grassmann. One motivating idea is that the one-dimensional subspaces of $\Lambda^p(V)$ spanned by decomposable elements $v_1 \wedge \ldots \wedge v_p$ correspond to $p$-dimensional subspaces of $V$, thus allowing one to turn the set of $p$-dimensional subspaces of $V$ (the Grassmannian) into a projective variety, via the Plücker embedding.
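This correspondence is easy to see numerically. Below is a minimal Python sketch (the sample plane and names are my own, chosen for illustration) computing the Plücker coordinates of a $2$-plane in $\mathbb{R}^4$, checking the single Plücker relation that cuts the Grassmannian $\mathrm{Gr}(2,4)$ out of $\mathbb{P}(\Lambda^2(\mathbb{R}^4))$, and checking that changing the basis of the plane only rescales $v \wedge w$, so the line it spans determines the plane:

```python
from itertools import combinations

def plucker(v, w):
    # Plücker coordinates p_ij = v_i w_j - v_j w_i: the components of the
    # decomposable element v ∧ w of Λ²(V) in the basis e_i ∧ e_j, i < j.
    return {(i, j): v[i] * w[j] - v[j] * w[i]
            for i, j in combinations(range(len(v)), 2)}

# A sample plane in R^4, spanned by v and w.
v = (1, 0, 2, -1)
w = (0, 1, 3, 4)
p = plucker(v, w)

# The Plücker relation p_12 p_34 - p_13 p_24 + p_14 p_23 = 0
# (here with 0-based indices) holds for every decomposable element.
relation = p[0, 1] * p[2, 3] - p[0, 2] * p[1, 3] + p[0, 3] * p[1, 2]

# A different basis of the same plane, v' = v + 2w and w' = 3w,
# rescales all Plücker coordinates by the same factor (here 3).
q = plucker(tuple(a + 2 * b for a, b in zip(v, w)),
            tuple(3 * b for b in w))
```

Running this gives `relation == 0`, and every coordinate of `q` is three times the corresponding coordinate of `p`, illustrating that the plane is recovered from the point $[v \wedge w]$ in projective space.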