Computational complexity of expanding brackets with variables $x_i \in \{0,1\}$


Consider integer coefficient polynomials in $x_i \in \{0,1\}$

E.g. $(x_1+1)(x_1x_2+x_2+1) \tag{1}$

Since $x_i \in \{0,1\}$ we have ${x_i}^p = x_i$ for every integer $p \ge 1$, so $(1)$ expands to

$$3x_1x_2 + x_1 + x_2 +1 $$
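The reduction ${x_i}^p = x_i$ means a monomial is determined by the *set* of variables it contains, so expansion can be sketched in a few lines. This is a minimal illustration (the dict-of-frozensets representation is my own choice, not anything canonical):

```python
# A polynomial with x_i in {0,1} is stored as a dict mapping
# a frozenset of variable indices (the monomial) to its integer coefficient.
# Since x_i^2 = x_i, multiplying two monomials is just a set union.

def poly_mul(f, g):
    """Multiply two polynomials, reducing via x_i^2 = x_i."""
    result = {}
    for mf, cf in f.items():
        for mg, cg in g.items():
            m = mf | mg  # union of variable sets implements x_i^2 = x_i
            result[m] = result.get(m, 0) + cf * cg
    return {m: c for m, c in result.items() if c != 0}

# (x1 + 1):
f = {frozenset({1}): 1, frozenset(): 1}
# (x1*x2 + x2 + 1):
g = {frozenset({1, 2}): 1, frozenset({2}): 1, frozenset(): 1}

print(poly_mul(f, g))
# {frozenset({1, 2}): 3, frozenset({1}): 1, frozenset({2}): 1, frozenset(): 1}
# i.e. 3*x1*x2 + x1 + x2 + 1, matching the hand expansion above.
```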

Is there a name for this type of algebra? It reminds me of $\mathbb{F}_2$ but with standard integer coefficients.

Given that the variables are $x_1, \ldots, x_n$, the maximum number of terms in the expansion is $2^n$ (one for each subset of the variables).

Does this mean there is an $O(n)$ algorithm for expanding the brackets? And if we multiply more of these polynomials together, how does the computational complexity change?

Thanks in advance for any help / advice. Ben

Best Answer

Is there a name for this type of algebra?

If each $x_i$ is either zero or one then we have $x_i(x_i - 1) = 0$, so I guess you can work over the ring

$$R := \mathbb{Z}[x_1, \ldots, x_n]/\left(x_1^2 - x_1,\; \ldots, \; x_n^2 - x_n\right)$$

I don't know if there's a specific name for this ring, but an element such that $x^2 = x$ is called an idempotent.

Given that the largest $i$ is $n$, the maximum number of terms in the expansion is $2^n$. Does this mean there is a $O(n)$ algorithm for expanding the brackets?

No. For instance, suppose we compute a product $fg$ where $f$ is a general element of $R$ and $g = 1$. Multiplying $f$ and $g$ then just amounts to copying $f$, but a general element of $R$ has up to $2^n$ nonzero coefficients, so even copying it takes $\Theta(2^n)$ time. The output size alone rules out an $O(n)$ bound.
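The $2^n$ lower bound is easy to see concretely: the product $(x_1+1)(x_2+1)\cdots(x_n+1)$ expands to one term per subset of $\{x_1,\ldots,x_n\}$, so any algorithm that writes out the expansion must emit $2^n$ terms. A quick check (reusing the same dict-of-frozensets representation sketched earlier, which is an illustrative choice rather than a standard one):

```python
from functools import reduce

def poly_mul(f, g):
    """Multiply two polynomials over R, reducing via x_i^2 = x_i."""
    result = {}
    for mf, cf in f.items():
        for mg, cg in g.items():
            m = mf | mg  # set union implements x_i^2 = x_i
            result[m] = result.get(m, 0) + cf * cg
    return result

n = 10
# Factors (x_1 + 1), (x_2 + 1), ..., (x_n + 1):
factors = [{frozenset({i}): 1, frozenset(): 1} for i in range(1, n + 1)]

expansion = reduce(poly_mul, factors)
print(len(expansion))  # 1024, i.e. 2**n distinct monomials
```

So even though each factor has only two terms, the fully expanded product is exponentially large in $n$.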