Let $e_1,\dots,e_n$ be the standard basis of $\Bbb Z^n$, and consider the standard lattice structure given by $\langle e_i,e_j\rangle=\delta_{i,j}$. Suppose we have a sublattice $L\subset \Bbb Z^n$ of rank $n-1$. Then its orthogonal complement $L^\perp$ is of rank $1$, so it is generated by a primitive vector $\alpha\in L^\perp$, unique up to sign. What can we say about $\det L$ (the determinant of the Gram matrix of a basis of $L$) and $|\alpha|^2$?
Here are some examples.
(1) $n=2$, $L=\langle e_1+e_2\rangle $. Then $\det L=2$, and $L^\perp=\langle e_1-e_2\rangle$ so $|\alpha|^2=2=\det L$.
(2) $n=3$, $L=\langle e_1+e_2, e_1-e_2+e_3\rangle$. Then $\det L=6$, and $L^\perp=\langle e_1-e_2-2e_3\rangle $, so again $|\alpha|^2=\det L$.
(3) But if $L$ has a non-primitive generator, we don't have $|\alpha|^2=\det L$ in general. Example: $n=2$, $L=\langle 2e_1\rangle$. Then $\det L=4$, $\alpha=e_2$, and $|\alpha|^2=1\neq 4=\det L$.
(4) Even if $L$ has a basis consisting of primitive vectors, we don't have $|\alpha|^2=\det L$ in general. Example: $n=4$, $L=\langle e_1-e_2, e_1+e_2+e_3+e_4, e_3-e_4\rangle$ satisfies $\det L=16$, and $\alpha=-e_1-e_2+e_3+e_4$, so $|\alpha|^2=4\neq 16$.
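These example computations can be double-checked mechanically, using that $\det L$ is the determinant of the Gram matrix of the given basis. A short Python sketch (the helper names `dot`, `det`, and `gram_det` are mine):

```python
def dot(u, v):
    # standard inner product on Z^n
    return sum(a * b for a, b in zip(u, v))

def det(M):
    # Laplace expansion along the first row; fine for small integer matrices
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([r[:j] + r[j + 1:] for r in M[1:]])
               for j in range(len(M)))

def gram_det(basis):
    # det L = determinant of the Gram matrix of a basis of L
    return det([[dot(u, v) for v in basis] for u in basis])

# Example (1): n = 2, L = <e1 + e2>
print(gram_det([[1, 1]]))                                      # 2
# Example (2): n = 3, L = <e1 + e2, e1 - e2 + e3>
print(gram_det([[1, 1, 0], [1, -1, 1]]))                       # 6
# Example (4): n = 4, L = <e1 - e2, e1 + e2 + e3 + e4, e3 - e4>
print(gram_det([[1, -1, 0, 0], [1, 1, 1, 1], [0, 0, 1, -1]]))  # 16
```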
It seems that $|\alpha|^2$ divides $\det L$ in general. Also it seems that $|\alpha|^2=\det L$ if $\det L$ is squarefree. But I can't see how to prove this. Are these true?
I found this a very neat question, and I believe I have found an answer. Let me state precisely what I believe to be true:
$1$. In the setting above, we have that $|\alpha|^2$ divides $\det L$. If $\det L$ is square-free, $|\alpha|^2=\det L$.
$2$. If $n=2$ and the generator of $L$ is primitive, it is always the case that $|\alpha|^2=\det L$.
$3$. For any $n>2$, we have counter-examples to equality, even restricting to primitive generators.
$4$. For any $n$, we have examples of $\det L = |\alpha|^2$ even when $\det L$ is not square-free.
Let me start with the $n=2$ case. Here $L=\langle v\rangle$, $v\in\mathbb{Z}^2$, and let us assume $v$ is primitive. Then, if $w\in\mathbb{Z}^2$ satisfies $\langle v,w\rangle=0$, we get $v_1w_1+v_2w_2=0$, and as this is integral and $v$ is assumed primitive, we get $v_1\mid w_2$ and $v_2\mid w_1$. Then it is easy to see that the only primitive choice of $\alpha$ is $\alpha=(v_2,-v_1)^T$, up to sign. Thus, it is clear that $$\det L = |v|^2 = |\alpha|^2.$$ As you showed in your $n=4$ counter-example, this does not hold in general. A (primitive) counter-example for $n=3$ is $L=\langle a,b\rangle$ with $a=(0,1,1)^T$, $b=(2,1,3)^T$, which has $\det L=12$, but $\alpha=(1,1,-1)^T$, which has $|\alpha|^2=3$.
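The $n=3$ counter-example is easy to verify numerically; a quick Python check (the helper `dot` is mine, and $\det L$ is computed as the Gram determinant):

```python
def dot(u, v):
    # standard inner product on Z^n
    return sum(x * y for x, y in zip(u, v))

a, b = [0, 1, 1], [2, 1, 3]
alpha = [1, 1, -1]

# det L via the 2x2 Gram matrix of (a, b)
gram = [[dot(a, a), dot(a, b)], [dot(b, a), dot(b, b)]]
det_L = gram[0][0] * gram[1][1] - gram[0][1] * gram[1][0]

print(det_L)                          # 12
print(dot(alpha, a), dot(alpha, b))   # 0 0, so alpha lies in L^perp
print(dot(alpha, alpha))              # 3, which divides 12 but is not 12
```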
Note also that, indeed, $|\alpha|^2$ removes the square factor from $\det L$. This is no coincidence. Write $L=\langle a,b\rangle$ and let $S=(a,b)^T$ be the $2\times 3$ matrix with rows $a$ and $b$; then $L^\perp = \ker S$, as $Sx = (\langle a,x\rangle,\langle b,x\rangle)^T$. Thinking for a moment of $\mathbb{R}$-vector spaces, the usual nullspace computation gives $\ker S_\mathbb{R} = \operatorname{span}_\mathbb{R}\big((a_2b_3-a_3b_2,\,-(a_1b_3-a_3b_1),\,a_1b_2-a_2b_1)^T\big)$, i.e. the span of the cross product $a\times b$. As this spanning vector (let us call it $w$) is integral, we know that $w\in L^\perp\subset \mathbb{Z}^3$. Furthermore, it is an elementary calculation (Lagrange's identity) to see that $$|w|^2 = |a|^2|b|^2 - \langle a,b\rangle^2 = \det L.$$ The question is now: does $w$ generate $L^\perp$? Only if it is primitive, and this is not always the case. Indeed, even if $a$ and $b$ are primitive, their coordinates can have non-trivial relations. In my $n=3$ example, we get $w=(2,2,-2)^T=2\alpha$, which shows exactly how the square factor vanishes, as $|w|^2=|2\alpha|^2=4|\alpha|^2$.
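In coordinates, $w$ is just the cross product $a\times b$; here is a sketch checking both $|w|^2=\det L$ and the factor of $2$ in the $n=3$ example (the helper names `cross` and `dot` are mine):

```python
from math import gcd

def cross(a, b):
    # cross product of two vectors in Z^3
    return [a[1] * b[2] - a[2] * b[1],
            -(a[0] * b[2] - a[2] * b[0]),
            a[0] * b[1] - a[1] * b[0]]

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

a, b = [0, 1, 1], [2, 1, 3]
w = cross(a, b)
print(w)                  # [2, 2, -2] = 2 * alpha

det_L = dot(a, a) * dot(b, b) - dot(a, b) ** 2   # 2x2 Gram determinant
print(dot(w, w), det_L)   # both 12, confirming |w|^2 = det L

k = gcd(gcd(abs(w[0]), abs(w[1])), abs(w[2]))    # content of w
print(k)                  # 2, so det L = k^2 * |alpha|^2 = 4 * 3
```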
This holds in full generality. In dimension $n+1$, with $L=\langle v_1,v_2,\dots,v_n\rangle$, the $n\times(n+1)$ matrix $S=(v_1,\dots,v_n)^T$ still satisfies $\ker S=L^\perp$. Computing the nullspace gives us a formula for $w$,
$$w_i = (-1)^{i}\sum_{\sigma\in S_{n+1,i}}\operatorname{sgn}(\sigma)\, v_{1,\sigma(1)}v_{2,\sigma(2)}\cdots v_{i-1,\sigma(i-1)}\,v_{i,\sigma(i+1)}\cdots v_{n,\sigma(n+1)}$$
This formula is very nasty, which is why I was so explicit in the $n=3$ example. The idea is: to compute $w_i$, you first take the product of the entries $v_{j,k}$ with $k=j$, except that at the $i$-th column you jump $k$ up by $1$ and continue with $k=j+1$. Then you permute the column indices $k$ and go again, adding or subtracting depending on the sign of the permutation. You run through all permutations of the set $\lbrace 1,2,\dots,i-1,i+1,\dots,n+1\rbrace$, which I denoted $S_{n+1,i}$ (it is obviously isomorphic to $S_n$). In other words, up to sign, $w_i$ is the maximal minor of $S$ obtained by deleting the $i$-th column.
It already looks similar to the Leibniz formula for the determinant, and indeed we get $\det L = |w|^2$: by the Cauchy–Binet formula, $\det L = \det(SS^T)$ equals the sum of the squares of the maximal minors of $S$, and these minors are exactly the $\pm w_i$.
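The minor description of $w$ and the identity $\det L = |w|^2$ can be tested directly for any $n$; a small Python sketch (the helper names `det`, `dot`, and `perp_vector` are mine, and the sign convention only matters up to the overall sign of $w$):

```python
def det(M):
    # Laplace expansion along the first row; fine for small integer matrices
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([r[:j] + r[j + 1:] for r in M[1:]])
               for j in range(len(M)))

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def perp_vector(vs):
    # w_i = (-1)^i * (maximal minor of S with column i deleted), 0-based i;
    # for two rows this reproduces the cross product
    m = len(vs[0])
    return [(-1) ** i * det([r[:i] + r[i + 1:] for r in vs]) for i in range(m)]

vs = [[0, 1, 1], [2, 1, 3]]
w = perp_vector(vs)
print(w)                       # [2, 2, -2]

gram = [[dot(u, v) for v in vs] for u in vs]
print(dot(w, w) == det(gram))  # True: |w|^2 = det L
```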
Finally, if $\det L$ is squarefree, then $w$ must be primitive: otherwise $w=k\cdot w'$ with $w'$ primitive and $k\geq 2$, so $|w|^2=k^2|w'|^2$ and the square $k^2$ would divide $\det L$. Thus $w=\pm\alpha$, and we are done. (Note that in general $w=k\alpha$ for some $k\in\mathbb{Z}$, so $\det L=|w|^2=k^2|\alpha|^2$, which also proves the divisibility claim in $1$.)
An example with equality even with non-squarefree determinant is the following: $n=3$ and $a=(1,-7,0)^T$, $b=(0,0,1)^T$. Then $\alpha=(7,1,0)^T$, and everything is primitive, so $$\det L = 50 = 2\cdot 5^2 = |\alpha|^2.$$
This will in fact give an example for all $n$, as you simply pick $v_1=(1,-7,0,\dots,0)$, $v_2=e_3$, ..., $v_n=e_{n+1}$, and you get $\alpha=(7,1,0,\dots,0)$, the same situation as above.
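This family is easy to check for several $n$ at once, reusing the minor formula for $w$ (all helper names below are mine):

```python
def det(M):
    # Laplace expansion along the first row; fine for small integer matrices
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([r[:j] + r[j + 1:] for r in M[1:]])
               for j in range(len(M)))

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def perp_vector(vs):
    # w_i = (-1)^i * (maximal minor of S with column i deleted), 0-based i
    m = len(vs[0])
    return [(-1) ** i * det([r[:i] + r[i + 1:] for r in vs]) for i in range(m)]

for n in range(2, 6):
    # v_1 = (1, -7, 0, ..., 0), v_2 = e_3, ..., v_n = e_{n+1} in Z^{n+1}
    v1 = [1, -7] + [0] * (n - 1)
    basis = [v1] + [[1 if j == i else 0 for j in range(n + 1)]
                    for i in range(2, n + 1)]
    w = perp_vector(basis)
    gram = [[dot(u, v) for v in basis] for u in basis]
    print(n, w, dot(w, w), det(gram))  # w = ±(7, 1, 0, ..., 0); both dets are 50
```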
To give counterexamples (with necessarily non-squarefree determinant), my claim is that we have enough freedom in the choice of $v_1,\dots,v_n$ to make them primitive while all coordinates of $w$ share a common factor. Then $w$ is not primitive, and $\det L=|\alpha|^2$ fails. I did not prove this, but I expect it to be doable.
What I would really like is a criterion for how to pick $v_1,\dots,v_n$ such that we are guaranteed that $w$ is primitive. This is not obvious to me at all.
EDIT: Such a criterion is, exactly as @reuns suggested, that $L$ is a primitive lattice, i.e. $L=\mathbb{Q}L\cap \mathbb{Z}^n$. In fact, much more general statements than the above hold. I was recommended the book Perfect Lattices in Euclidean Spaces by Martinet, where Proposition $1.9.8$ gives a more general version of our desired theorem.
Theorem. If $\Lambda$ is a unimodular lattice, and $M\leq \Lambda$ a sublattice satisfying $M=\mathbb{R}M\cap \Lambda$, then $$\det(M^\perp_\Lambda) = \det(M).$$
Taking $\Lambda=\mathbb{Z}^n$ and $M=L$, we get our desired theorem.