Let $d \in \mathbb{N}$ and let $I$ be a set. Let $\omega : I^d \times I^d \to \mathbb{R}$ be a function, denoted by $(a_1,\dotsc,a_d,b_1,\dotsc,b_d) \mapsto a_1 \cdots a_d | b_1 \cdots b_d$, with the following properties:
- It is antisymmetric in the first $d$ variables, e.g. $a_1 a_2 \cdots a_d | \cdots = - a_2 a_1 \cdots a_d | \cdots$.
- It is also antisymmetric in the last $d$ variables.
- For all elements $a_1,\dotsc,a_{d-1}$ and $b_0,\dotsc,b_d$ of $I$ we have $$\sum_{k=0}^{d} (-1)^k a_1 \cdots a_{d-1} b_k | b_0 \cdots \widehat{b_k} \cdots b_d=0.$$
One might call $\omega$ a Plücker function because these relations resemble the Plücker relations.
Claim. $a_1 \cdots a_d | b_1 \cdots b_d = b_1 \cdots b_d |a_1 \cdots a_d$ for all $(a,b) \in I^d \times I^d$.
For $d=1$ it is clear. Here is a proof for the case $d=2$: Using the relation $ab|cd- ac|bd + ad|bc=0$ four times, we get
$$ab|cd = ac|bd - ad|bc=da|bc-ca|bd = db|ac-dc|ab-cb|ad+cd|ab$$ $$=bc|ad-bd|ac+2 cd|ab = ba|cd + 2 cd|ab ~ \Longrightarrow ~ 2 ab|cd = 2 cd|ab ~~~\square$$
In the case $d=3$, a long calculation shows $abc|def=ade|bcf-adf|bce+aef|bcd$. This already moves $bc$ to the right-hand side, but I don't see how to move $a$ across as well without undoing this.
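At least this $d=3$ identity can be sanity-checked numerically: products of determinants, $a_1a_2a_3|b_1b_2b_3 := \det(v_{a_1},v_{a_2},v_{a_3})\det(v_{b_1},v_{b_2},v_{b_3})$ for vectors $v_i\in\mathbb{R}^3$, satisfy the relations above (these are the classical Plücker relations), so the identity must hold in that model. Here is a quick Python check with random integer vectors; this is only a necessary condition, not a proof:

```python
import random

def det3(u, v, w):
    # 3x3 determinant with columns u, v, w (integer arithmetic, so exact)
    return (u[0] * (v[1] * w[2] - v[2] * w[1])
          - v[0] * (u[1] * w[2] - u[2] * w[1])
          + w[0] * (u[1] * v[2] - u[2] * v[1]))

random.seed(0)
for _ in range(100):
    a, b, c, d, e, f = ([random.randint(-9, 9) for _ in range(3)]
                        for _ in range(6))
    # abc|def = ade|bcf - adf|bce + aef|bcd in the determinant model
    lhs = det3(a, b, c) * det3(d, e, f)
    rhs = (det3(a, d, e) * det3(b, c, f)
         - det3(a, d, f) * det3(b, c, e)
         + det3(a, e, f) * det3(b, c, d))
    assert lhs == rhs
print("identity holds in the determinant model")
```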
There is some background for the claim, coming from categorified Grassmannians, but I won't explain it here because I don't think it is necessary to understand the question. It may be that the claim is false; in that case I am pretty sure (but have no proof) that a weaker version of it holds, which needs even more variables and relations. I will add this if someone asks.
Perhaps one can ask a computer algebra system to do the whole work? I have tried it with SAGE, but it didn't work out, because non-commutative quotient rings are only available there in conjunction with a representing system.
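For a fixed small $d$ one can at least avoid non-commutative machinery entirely: impose antisymmetry by passing to canonical representatives, list all instances of the relation, and test by linear algebra whether $ab|cd-cd|ab$ lies in their span. Here is such a check for $d=2$ (elements labelled $0,1,2,3$) in plain Python with exact rational arithmetic; it is only a sketch of the idea, not a general solution:

```python
from fractions import Fraction
from itertools import combinations, product

I = range(4)
pairs = list(combinations(I, 2))                       # sorted 2-subsets of I
var_index = {(s, t): i for i, (s, t) in enumerate(product(pairs, pairs))}
nvars = len(var_index)                                 # 6 * 6 = 36 variables

def canon(a, b, c, d):
    # canonical form of the term ab|cd: (sign, variable index), or None if zero
    sign = 1
    if a == b or c == d:
        return None                                    # zero by antisymmetry
    if a > b:
        a, b, sign = b, a, -sign
    if c > d:
        c, d, sign = d, c, -sign
    return sign, var_index[((a, b), (c, d))]

def add_term(vec, a, b, c, d, coef):
    t = canon(a, b, c, d)
    if t is not None:
        vec[t[1]] += t[0] * coef

# one relation row for every choice of a, b0, b1, b2 in I
rows = []
for a, b0, b1, b2 in product(I, repeat=4):
    vec = [Fraction(0)] * nvars
    bs = (b0, b1, b2)
    for k in range(3):
        rest = [bs[j] for j in range(3) if j != k]
        add_term(vec, a, bs[k], rest[0], rest[1], Fraction((-1) ** k))
    if any(vec):
        rows.append(vec)

# target: ab|cd - cd|ab
target = [Fraction(0)] * nvars
add_term(target, 0, 1, 2, 3, Fraction(1))
add_term(target, 2, 3, 0, 1, Fraction(-1))

def rank(mat):
    # Gaussian elimination over the rationals
    mat = [row[:] for row in mat]
    r = 0
    for col in range(nvars):
        piv = next((i for i in range(r, len(mat)) if mat[i][col] != 0), None)
        if piv is None:
            continue
        mat[r], mat[piv] = mat[piv], mat[r]
        for i in range(len(mat)):
            if i != r and mat[i][col] != 0:
                f = mat[i][col] / mat[r][col]
                mat[i] = [u - f * v for u, v in zip(mat[i], mat[r])]
        r += 1
    return r

print(rank(rows) == rank(rows + [target]))  # True: the claim holds for d = 2
```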
The claim is true.
I find it convenient to state the problem in a "Boolean hypercube" form. The variables will be the $\binom{2d}{d}$ strings of length $2d$ over the alphabet $\{L,R\}$ with exactly $d$ $R$'s. For any string $x\in\{L,R\}^{2d}$ with exactly $d+1$ $R$'s, let $\partial(x)$ denote the sum of the variables labelled by the strings obtained by replacing one $R$ in $x$ by an $L$. For example, $\partial(LRLRRR)= LLLRRR+LRLLRR+LRLRLR+LRLRRL$.
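In code, this bookkeeping looks as follows (a minimal Python sketch; strings are plain `"L"`/`"R"` words):

```python
def boundary(x):
    # the strings obtained from x by replacing one "R" with an "L",
    # one replacement at a time
    return [x[:i] + "L" + x[i + 1:] for i, c in enumerate(x) if c == "R"]

print(boundary("LRLRRR"))
# ['LLLRRR', 'LRLLRR', 'LRLRLR', 'LRLRRL']
```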
Claim 2: $L^dR^d-(-1)^d R^dL^d$ is a linear combination of expressions of the form $\partial(x)$.
To relate this to your problem, fix $2d$ elements $e_1,\dots,e_{2d}\in I$. Consider a string $x\in\{L,R\}^{2d}$ with exactly $d$ $R$'s. We can label the $i$'th $L$ as $L_i$ and the $i$'th $R$ as $R_i$, so for example we annotate $LRLLRR$ as $L_1R_1L_2L_3R_2R_3$. For each $1\leq i\leq d$ let $l_i$ be the position of $L_i$, and let $r_i$ be the position of $R_i$. (So $l_i$ and $r_i$ are integers from $1$ to $2d$.) We map $x$ to $(-1)^{\sigma(x)}e_{l_1}\dots e_{l_d}|e_{r_1}\dots e_{r_d}$, where $(-1)^{\sigma(x)}$ is the sign of the permutation $\pi_x$ of $\{L_1,\dots,L_d,R_1,\dots,R_d\}$ sending the $i$'th letter of $L_1\dots L_dR_1\dots R_d$ to the $i$'th letter of the annotated version of $x$. (Equivalently, $\sigma(x)$ is the minimum number of adjacent transpositions needed to transform $L^dR^d$ into $x$, i.e. the number of inversions of $x$: pairs of positions where an $R$ precedes an $L$.)
For example $LRLLRR$ maps to $e_1e_3e_4|e_2e_5e_6$ while $LRLRLR$ maps to $(-1)e_1e_3e_5|e_2e_4e_6$.
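A small Python helper makes the sign convention concrete: here $\sigma(x)$ is computed as the number of inversions (pairs of positions where an $R$ precedes an $L$), which agrees with the permutation sign described above on these examples:

```python
def sign(x):
    # (-1)^sigma(x): sigma(x) counts inversions of x, i.e. pairs of
    # positions i < j with x[i] = "R" and x[j] = "L"; this equals the
    # minimal number of adjacent transpositions taking L^d R^d to x
    inv = sum(1 for i in range(len(x)) for j in range(i + 1, len(x))
              if x[i] == "R" and x[j] == "L")
    return (-1) ** inv

def to_term(x):
    # map a string to (sign, [l_1, ..., l_d], [r_1, ..., r_d]),
    # with 1-based positions as in the text
    ls = [i + 1 for i, c in enumerate(x) if c == "L"]
    rs = [i + 1 for i, c in enumerate(x) if c == "R"]
    return sign(x), ls, rs

print(to_term("LRLLRR"))  # (1, [1, 3, 4], [2, 5, 6])
print(to_term("LRLRLR"))  # (-1, [1, 3, 5], [2, 4, 6])
```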
$\partial (L^{d-1}R^{d+1})$ gets mapped to $$ \sum_{k=0}^{d} (-1)^k e_{1} \cdots e_{d-1} e_{d+k} | e_{d} \cdots \widehat{e_{d+k}} \cdots e_{2d}, $$ which is precisely the left-hand side of the relation in the question (with $a_i=e_i$ and $b_k=e_{d+k}$), hence zero. The images of the other $\partial(x)$ terms are similar, but much more cumbersome to write down; each one is, up to sign, an instance of the relation with relabelled $e_i$'s. Since $L^dR^d-(-1)^dR^dL^d$ is mapped to $e_1\cdots e_d|e_{d+1}\cdots e_{2d}-e_{d+1}\cdots e_{2d}|e_1\cdots e_d$ (the sign works out because $(-1)^{\sigma(R^dL^d)}=(-1)^{d^2}=(-1)^d$), Claim 2 therefore implies the Claim.
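For $d=3$, the following self-contained Python snippet lists the four strings of $\partial(LLRRRR)$ together with their signs and $L$/$R$ positions, reproducing the four terms $(-1)^k e_1e_2e_{3+k}|\dotsb$ of the sum above:

```python
def inv_sign(x):
    # (-1)^(number of inversions: pairs of positions with R before L)
    return (-1) ** sum(x[i] == "R" and x[j] == "L"
                       for i in range(len(x)) for j in range(i + 1, len(x)))

x = "LLRRRR"          # L^(d-1) R^(d+1) for d = 3
terms = []
for i, c in enumerate(x):
    if c == "R":
        y = x[:i] + "L" + x[i + 1:]
        ls = [p + 1 for p, ch in enumerate(y) if ch == "L"]
        rs = [p + 1 for p, ch in enumerate(y) if ch == "R"]
        terms.append((inv_sign(y), ls, rs))
for s, ls, rs in terms:
    print(s, ls, rs)   # the signs alternate: +, -, +, -
```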
Proof of Claim 2: we will argue that $$L^dR^d-(-1)^d R^dL^d=\sum_{k=1}^d (-1)^{k+1}\frac{(k-1)!(d-k)!}{d!} \sum_{x,y} \partial (xy)\tag{*}$$ where $x$ ranges over strings of length $d$ with exactly $k$ Rs, and $y$ ranges over strings of length $d$ with exactly $d-k+1$ Rs.
As an example, consider the case $d=2$. Then $$LLRR-RRLL = (\partial(RLRR)+ \partial(LRRR) - \partial(RRLR) - \partial(RRRL))/2.$$ The coefficient of $LLRR$ on the right-hand side is $(1+1)/2$, and the coefficient of $RRLL$ is $(-1-1)/2$. The coefficient of $LRLR$ is $(1-1)/2=0$; the contributions come from $\partial(LRRR)$ and $\partial(RRLR)$. The other coefficients can also be checked directly, but the following symmetry argument shows that this is unnecessary.
The group $S_d\times S_d$ acts on strings of length $2d$ by $(\pi,\pi')*(x,x')=\pi(x)\pi'(x')$, where $x,x'\in\{L,R\}^d$. The left-hand side and right-hand side of (*) are manifestly invariant under this group (technically, under the induced action of $S_d\times S_d$ on the vector space generated by strings of length $2d$).
So it suffices to check that the coefficients of the terms $L^kR^{d-k}L^{d-k}R^k$ match, for $0\leq k\leq d$. We get contributions from $\partial(z)$ where $z$ is obtained by replacing one $L$ in $L^kR^{d-k}L^{d-k}R^k$ by an $R$. So there are two types of contributions: $k$ contributions from the terms $\partial(xL^{d-k}R^k)$ where $x$ is obtained by replacing one $L$ in $L^kR^{d-k}$ by an $R$ (such an $x$ has $d-k+1$ $R$'s, so these terms carry the coefficient of index $d-k+1$ in (*)), and $d-k$ contributions from the terms $\partial(L^kR^{d-k}y)$ where $y$ is obtained by replacing one $L$ in $L^{d-k}R^k$ by an $R$ (these carry the coefficient of index $d-k$). The coefficient of $L^kR^{d-k}L^{d-k}R^k$ on the right-hand side of (*) is therefore $$(-1)^{d-k}k\frac{(d-k)!(k-1)!}{d!}+(-1)^{d-k+1}(d-k)\frac{(d-k-1)!k!}{d!},$$ where the first term is to be ignored if $k=0$ and the second if $k=d$. For $0<k<d$ both terms equal $(-1)^{d-k}\frac{k!(d-k)!}{d!}$ up to opposite signs, so the coefficient is $0$; for $k=d$ the coefficient is $1$, and for $k=0$ it is $(-1)^{d+1}$. This matches the left-hand side, whose only nonzero coefficients are $1$ at $L^dR^d$ (the case $k=d$) and $-(-1)^d=(-1)^{d+1}$ at $R^dL^d$ (the case $k=0$). $\square$
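The bookkeeping in this proof is easy to get wrong, so here is a self-contained Python check that expands the right-hand side of (*) and compares it, coefficient by coefficient, with $L^dR^d-(-1)^dR^dL^d$ for small $d$:

```python
from fractions import Fraction
from itertools import combinations
from math import factorial

def strings(n, r):
    # all length-n strings over {L, R} with exactly r R's
    for pos in combinations(range(n), r):
        yield "".join("R" if i in pos else "L" for i in range(n))

def boundary(x):
    # the strings obtained from x by replacing one "R" with an "L"
    return [x[:i] + "L" + x[i + 1:] for i, c in enumerate(x) if c == "R"]

def check_star(d):
    # expand the right-hand side of (*) into a string -> coefficient map
    coeff = {}
    for k in range(1, d + 1):
        c = Fraction((-1) ** (k + 1) * factorial(k - 1) * factorial(d - k),
                     factorial(d))
        for x in strings(d, k):
            for y in strings(d, d - k + 1):
                for z in boundary(x + y):
                    coeff[z] = coeff.get(z, Fraction(0)) + c
    # left-hand side: L^d R^d - (-1)^d R^d L^d
    lhs = {"L" * d + "R" * d: Fraction(1),
           "R" * d + "L" * d: Fraction(-((-1) ** d))}
    return all(coeff.get(s, 0) == lhs.get(s, 0)
               for s in set(coeff) | set(lhs))

print(all(check_star(d) for d in range(1, 6)))  # prints True
```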