In the book Linear Algebra by Werner Greub, on page 196, it is stated that
In a 2-dimensional oriented inner product space, let $\Delta$ be a normed determinant function. Then $$|x|^2 |y|^2 - (x,y)^2 = \Delta(x,y)^2.$$
But how is this equation obtained? I do not recall any equation that involves the norm, the inner product, and the determinant function at the same time.
Note that, from the previous section, we have $$\Delta(e_1,e_2)^2 = \det(\langle e_i, e_j\rangle), \quad i,j = 1,2,$$ where the $e_i$ form an orthogonal basis.
Edit: As @Spencer pointed out, the RHS does not have to be zero.
Edit: The notation $\det(\langle e_i, e_j\rangle)$ means the determinant of the matrix whose $(i,j)$-th entry is $(e_i, e_j)$.
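Edit: A concrete sanity check (my own, not from the book): the left-hand side is exactly the determinant of the matrix of inner products of $x$ and $y$, i.e. $\det\begin{pmatrix} (x,x) & (x,y)\\ (y,x) & (y,y)\end{pmatrix}$, so the claimed identity extends the formula above from basis vectors to arbitrary $x,y$. Numerically, in $\Bbb R^2$ with the standard inner product, taking $\Delta(x,y)=x_1y_2-x_2y_1$ (the ordinary $2\times 2$ determinant, which is a normed determinant function for the standard basis):

```python
import numpy as np

rng = np.random.default_rng(0)
x, y = rng.normal(size=2), rng.normal(size=2)

lhs = np.dot(x, x) * np.dot(y, y) - np.dot(x, y) ** 2     # |x|^2 |y|^2 - (x,y)^2
gram = np.array([[np.dot(x, x), np.dot(x, y)],
                 [np.dot(y, x), np.dot(y, y)]])           # matrix of inner products
delta = x[0] * y[1] - x[1] * y[0]                         # Delta(x, y)

print(np.isclose(lhs, np.linalg.det(gram)))   # True: LHS is a Gram determinant
print(np.isclose(lhs, delta ** 2))            # True: the identity in question
```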
A determinant function on a two-dimensional space $E$ is a map $\Delta: E\times E\rightarrow {\Bbb R}$ (or ${\Bbb C}$) which is bilinear (linear in each variable) and antisymmetric, i.e. $\Delta(x,y)=-\Delta(y,x)$ for all $x,y\in E$.
A pedestrian way to the goal: let $e_1,e_2$ be an orthonormal basis of $E$ and write $x=x_1e_1+x_2e_2$, $y=y_1e_1+y_2e_2$. By antisymmetry of $\Delta$ we have $\Delta(e_1,e_2)=-\Delta(e_2,e_1)$ and $\Delta(e_1,e_1)=\Delta(e_2,e_2)=0$, so using bilinearity:
$$ \Delta(x,y) = \Delta(x_1e_1+x_2e_2,\, y_1e_1+y_2e_2) = (x_1 y_2-x_2 y_1)\, \Delta(e_1,e_2). $$ This determines $\Delta$ uniquely up to a multiplicative constant (namely the value of $\Delta(e_1,e_2)$). By pure algebra: $$ |x|^2 |y|^2 - (x,y)^2 = (x_1^2+x_2^2)(y_1^2+y_2^2) - (x_1y_1+x_2y_2)^2 = (x_1y_2-x_2y_1)^2, $$ and you get $$ |x|^2 |y|^2 - (x,y)^2 = \Delta(x,y)^2 $$ provided that you normalize (whence the word "normed") so that $\Delta(e_1,e_2)=\pm 1$, corresponding to declaring the area spanned by two orthonormal vectors to be $\pm 1$.
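To double-check the "pure algebra" step, here is a short symbolic verification (a sketch using sympy; the variable names mirror the coordinates above):

```python
import sympy as sp

x1, x2, y1, y2 = sp.symbols('x1 x2 y1 y2')

lhs = (x1**2 + x2**2) * (y1**2 + y2**2) - (x1*y1 + x2*y2)**2
rhs = (x1*y2 - x2*y1)**2

print(sp.expand(lhs - rhs) == 0)  # True: the two sides agree identically
```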
A more conceptual approach that also works in any dimension $n$ goes as follows:
Let $M_{ij}=(x_i,x_j)$ be the matrix of scalar products of the $x_i$'s (the Gram matrix). Also let $A_{kj}=(e_k,x_j)$ be the $k$-th coordinate of $x_j$ (its projection onto $e_k$) with respect to an orthonormal basis $e_1,\dots,e_n$, so that $$ x_j = \sum_k e_k (e_k,x_j) = \sum_k e_k A_{kj}. $$ By multilinearity of $\Delta$ one has: $$ \Delta(x_1,\dots,x_n) = \Delta\Big( \sum_{k_1} e_{k_1} A_{k_1,1},\dots, \sum_{k_n} e_{k_n} A_{k_n,n}\Big) = \sum_{k_1}\cdots \sum_{k_n} \Delta(e_{k_1},\dots,e_{k_n})\, A_{k_1,1} \cdots A_{k_n,n}. $$ On the RHS, by antisymmetry, only terms with distinct $k_1,\dots,k_n$ survive. Thus the sum runs over all permutations of $\{1,\dots,n\}$ and, again by antisymmetry, the value of $\Delta(e_{k_1},\dots,e_{k_n})$ is the sign of the permutation times $\Delta(e_1,\dots,e_n)$, i.e. we get: $$ \Delta(x_1,\dots,x_n) = \Delta(e_1,\dots,e_n) \det(A). $$ Since $M_{ij} = \sum_k (x_i,e_k)(e_k,x_j) = \sum_k A'_{ik} A_{kj}$, where $A'$ denotes the transpose of $A$, the product rule for determinants gives: $$ \det M = \det (A' A) = \det (A)^2 = \Delta(x_1,\dots,x_n)^2, $$ provided again that you normalize so that $\Delta(e_1,\dots,e_n)=\pm 1$.
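Here is a numerical sketch of this argument, assuming $E = \Bbb R^n$ with the standard inner product, $e_k$ the standard basis, and $\Delta$ the ordinary $n\times n$ determinant (so that $\Delta(e_1,\dots,e_n)=1$):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
X = rng.normal(size=(n, n))    # column j holds the coordinates of x_j

A = X                          # A[k, j] = (e_k, x_j), the k-th coordinate of x_j
M = X.T @ X                    # Gram matrix: M[i, j] = (x_i, x_j) = sum_k A'[i, k] A[k, j]

delta = np.linalg.det(X)       # Delta(x_1, ..., x_n) for the standard determinant

print(np.isclose(delta, np.linalg.det(A)))        # Delta = det(A) * Delta(e_1, ..., e_n)
print(np.isclose(np.linalg.det(M), delta ** 2))   # det M = det(A)^2 = Delta^2
```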