If $A$ is a full-rank matrix with $n-2$ rows and $n$ columns, and $x,y\in\ker A$ are orthogonal, then there exists a real $\lambda=\lambda(A,x,y)$ such that for all $1\le i<j\le n$ we have
$$ \det \begin{pmatrix} x_i & x_j \\ y_i & y_j \end{pmatrix} = (-1)^{i+j} \lambda \det A_{ij}, $$
where $x=(x_1,\dotsc,x_n)$, $y=(y_1,\dotsc,y_n)$, and $A_{ij}$ is the square matrix obtained from $A$ by removing columns $i$ and $j$.
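For concreteness, here is a quick numerical sanity check of the identity (a sketch using NumPy; the orthogonal kernel vectors $x,y$ are taken from the SVD of a random $A$, and the common quotient recovered is $\lambda$):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
A = rng.standard_normal((n - 2, n))  # full rank with probability 1

# The last two right-singular vectors form an orthonormal basis of ker A.
_, _, Vt = np.linalg.svd(A)
x, y = Vt[-2], Vt[-1]

lhs, rhs = [], []
for i in range(n):
    for j in range(i + 1, n):
        # 2x2 minor of the rows x, y at columns i, j
        lhs.append(x[i] * y[j] - x[j] * y[i])
        # signed complementary minor of A (0-based i+j has the same
        # parity as the 1-based exponent (i+1)+(j+1))
        cols = [k for k in range(n) if k not in (i, j)]
        rhs.append((-1) ** (i + j) * np.linalg.det(A[:, cols]))

lhs, rhs = np.array(lhs), np.array(rhs)
lam = lhs @ rhs / (rhs @ rhs)  # least-squares estimate of the common ratio
print(np.allclose(lhs, lam * rhs))
```

All $\binom{n}{2}$ quotients agree, consistent with a single $\lambda$ depending only on $A$, $x$, $y$.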
I have a reasonably simple proof of this fact, but I wonder whether there is a really simple proof, or better still a "conceptual explanation". Beyond this, I wonder whether the straightforward extension to full-rank matrices of size $(n-k)\times n$, with $k\times k$ minors of $k$ orthogonal kernel vectors on the left-hand side, holds true.
Thanks!