The Cayley–Dickson construction (see refs below) is a way of generating 'algebras' (in the loose sense) of increasing size over the reals, yielding a sequence of algebras $\mathbb R = R_0 \subset R_1 \subset R_2 \subset \cdots \subset R_n \subset \cdots$ . The algebra $R_n$ is a vector space over $\mathbb R$ of dimension $2^n$, and (for $n \geqslant 1$) can be obtained from $R_{n-1}$ by formally adjoining an imaginary unit $\def\u{\mathbf u}\u_n$. Thus we may describe elements of $R_n$ as real-linear combinations of products of the form $$ \u_a \u_b \u_c \cdots \u_k := \bigl[\cdots\bigl[\bigl[\u_a \u_b\bigr] \u_c\bigr] \cdots \bigr] \u_k\;, \qquad\qquad(*)$$ that is, multiplication is left-associative (performed left-to-right) unless otherwise indicated by brackets.
Let us call the products of the form $(*)$, together with their negations, imaginary units. (For instance, the imaginary units in $\mathbb C$ are $\pm \mathrm{i}$, and in the quaternions they are $\pm\mathrm{i}, \pm\mathrm{j}, \pm\mathrm{k}$.) In fact, every product of units $\u_a \u_b \cdots \u_k$ (with arbitrary indices and arbitrary bracketing) can be represented as $\pm 1$ times a left-associative product in which $1 \leqslant a < b < c < \cdots < k \leqslant n$. This is a consequence of the rules for multiplying the units, which however tend only to be presented in a recursive formulation that is not exactly transparent when iterated.
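Since the recursive multiplication rules are easiest to absorb by experiment, here is a minimal sketch of the doubling step in Python. The function names are my own, and I use one common sign convention for the doubling rule, $(a,b)(c,d) = (ac - d\,b^\ast,\ a^\ast d + c\,b)$; other references use different signs, so the resulting basis products may differ from one's favourite table by signs.

```python
# Sketch of the recursive Cayley-Dickson product on flat coefficient
# vectors of length 2^n.  The doubling rule used here,
#     (a, b)(c, d) = (ac - d b*,  a* d + c b),
# is one common sign convention; others appear in the literature.

def conj(v):
    """Conjugation: negate every coefficient except the real part v[0]."""
    return [v[0]] + [-t for t in v[1:]]

def cd_mul(x, y):
    """Product of two elements of R_n, given as coefficient lists of length 2^n."""
    n = len(x)
    if n == 1:
        return [x[0] * y[0]]
    a, b = x[:n // 2], x[n // 2:]
    c, d = y[:n // 2], y[n // 2:]
    ac, db = cd_mul(a, c), cd_mul(d, conj(b))
    ad, cb = cd_mul(conj(a), d), cd_mul(c, b)
    return [s - t for s, t in zip(ac, db)] + [s + t for s, t in zip(ad, cb)]
```

With this convention one can check, for instance, that $n = 2$ reproduces the quaternions (each basis unit squares to $-1$ and $x x^\ast$ is real), while a brute-force search over basis triples at $n = 3$ confirms that associativity already fails for the octonions.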
Question. Is there a closed-form expression for multiplication of imaginary units, which holds for all Cayley–Dickson algebras?
[Update] I have posted an answer which would suffice for my purposes. I will award a bounty to any answer (with reference or proof) which gives a substantially simpler description, and I will accept the answer with the simplest such description. I will leave the question open until it appears that no better answers are forthcoming.
References:

I have determined that there is an expression of sorts: not closed-form, but evaluable as a reasonably simple straight-line program, which can be discovered essentially by a merge-sort of the contributions $\mathbf u_j$ of the two factors of the form $(*)$ in the original question.
We use the fact that, for $a,b \in R_{n-1}$, we have $(\mathbf u_n^\ast \;\! a)(b \;\! \mathbf u_n) = (ab)^\ast$. This implies in particular that if $a$ and $b$ are imaginary units: $$\begin{aligned} (a \mathbf u_n)^\ast (b) &= -(a b^\ast) \mathbf u_n = -(a^\ast b) \mathbf u_n \,, \\ (a \mathbf u_n)^\ast (b \mathbf u_n) &= (a^\ast b)^\ast = \pm(a^\ast b) &\text{[with sign $-1$ provided $a \ne \pm b$]}, \\ (a)^\ast (b\mathbf u_n) &= (a^\ast b)^\ast \mathbf u_n = \pm(a^\ast b) \mathbf u_n &\text{[with sign $-1$ provided $a \ne \pm b$]}. \\ \end{aligned}$$

For a boolean string $x \in \{0,1\}^n$, let $$\mathbf e_x := \mathbf u_1^{x_1} \mathbf u_2^{x_2} \cdots \mathbf u_n^{x_n} \in R_n$$ using the left-associative product above. It is not difficult to show that $\mathbf e_x^{-1} = \mathbf e_x^\ast = -\mathbf e_x$ for any $x \ne 00\cdots0$.

We may then use the reductions above to show that $$ \mathbf e_x^\ast \mathbf e_y = (-1)^{N_{x,y}} \mathbf e_{x \oplus y}$$ where $x \oplus y$ is the bit-wise XOR of $x$ and $y$ (equivalently, the reduction modulo $2$ of the vector sum of $x$ and $y$ as row-vectors), and where $N_{x,y}$ counts (mod $2$) the number of flips in sign in the recursive reduction of the multiplication.

We may give $N_{x,y}$ by $$\begin{aligned}[b] N_{x,y} &= \sum_{h = 1}^n \Bigl[ x_h \bigl(x_h \oplus y_h\bigr) \,+\, x_h J^{(x \oplus y)}_h +\, \bigl(x_h \oplus y_h\bigr) J^{(x)}_h J^{(y)}_h J^{(x \oplus y)}_h \Bigr] \\&\equiv \sum_{h = 1}^n \Bigl[ x_h \bigl(x_h \oplus y_h\bigr) \bigl(1 - J^{(x \oplus y)}_h\bigr) + x_h y_h J^{(x \oplus y)}_h + \bigl(x_h \oplus y_h\bigr) J^{(x)}_h J^{(y)}_h J^{(x \oplus y)}_h \Bigr] \pmod2\,, \end{aligned}$$ where for any string $z \in \{0,1\}^n$ we define a string $J^{(z)} \in \{0,1\}^n$ such that $$ J^{(z)}_h = 1 \iff \exists j: (1 \leqslant j < h) \mathbin\& (z_j = 1) $$ or equivalently $J^{(z)}_1 = 0$, and $J^{(z)}_{h+1} = z_h + J^{(z)}_h - z_h J^{(z)}_h$ for $1 \leqslant h < n$.
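For what it's worth, the sign rule can be checked mechanically for small $n$. The sketch below is a hypothetical implementation (my own function names), assuming the common doubling convention $(a,b)(c,d) = (ac - d\,b^\ast,\ a^\ast d + c\,b)$, which does satisfy the identity $(\mathbf u_n^\ast\, a)(b\,\mathbf u_n) = (ab)^\ast$ used above; it compares $(-1)^{N_{x,y}} \mathbf e_{x \oplus y}$ against a direct recursive computation of $\mathbf e_x^\ast \mathbf e_y$.

```python
from itertools import product

def conj(v):
    # conjugation on a flat coefficient vector: negate all but the real part
    return [v[0]] + [-t for t in v[1:]]

def cd_mul(x, y):
    # recursive Cayley-Dickson product, assumed convention
    #   (a, b)(c, d) = (ac - d b*,  a* d + c b)
    n = len(x)
    if n == 1:
        return [x[0] * y[0]]
    a, b = x[:n // 2], x[n // 2:]
    c, d = y[:n // 2], y[n // 2:]
    ac, db = cd_mul(a, c), cd_mul(d, conj(b))
    ad, cb = cd_mul(conj(a), d), cd_mul(c, b)
    return [s - t for s, t in zip(ac, db)] + [s + t for s, t in zip(ad, cb)]

def J(z):
    # J^{(z)}_h = 1 iff some earlier bit z_j (j < h) is set
    out, seen = [], 0
    for bit in z:
        out.append(seen)
        seen |= bit
    return out

def N(x, y):
    # the sign exponent N_{x,y}, via the first displayed expression (mod 2)
    z = [s ^ t for s, t in zip(x, y)]
    Jx, Jy, Jz = J(x), J(y), J(z)
    return sum(x[h] * z[h] + x[h] * Jz[h] + z[h] * Jx[h] * Jy[h] * Jz[h]
               for h in range(len(x))) % 2

def e(x):
    # e_x = u_1^{x_1} ... u_n^{x_n}, multiplied left-associatively
    n = len(x)
    cur = [1] + [0] * (2 ** n - 1)
    for j, bit in enumerate(x):
        if bit:
            u = [0] * (2 ** n)
            u[2 ** j] = 1          # u_{j+1} sits at coordinate 2^j
            cur = cd_mul(cur, u)
    return cur

# brute-force check of  e_x* e_y = (-1)^{N_{x,y}} e_{x XOR y}  for n <= 3
for n in (1, 2, 3):
    for x in product((0, 1), repeat=n):
        for y in product((0, 1), repeat=n):
            z = tuple(s ^ t for s, t in zip(x, y))
            lhs = cd_mul(conj(e(x)), e(y))
            rhs = [(-1) ** N(x, y) * t for t in e(z)]
            assert lhs == rhs, (x, y)
```

Only the first of the two displayed expressions for $N_{x,y}$ is used here; the two agree modulo $2$, since their difference is $\sum_h 2\,x_h(1 - y_h)\,J^{(x\oplus y)}_h \equiv 0 \pmod 2$.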
[Edited to add: the formula for $N_{x,y}$ above is a correction of the original answer, which was based on an incorrect case analysis on my part. The sign formula is most easily verified using the second line, by case analysis on the expansions of $\mathbf e_x$ and $\mathbf e_y$ as left-associative products of the units $\mathbf u_j\,$.]