A model for a theory supporting AND and OR?


Is the following statement true or false?

"There exists a finite set $S$ closed under the binary operations $*,+$ such that ... etc:" \begin{equation*} \begin{aligned}{} (\exists S, +, *) & & &\\ & |S| \in \mathbb{N} & & &\text{(the set $S$ is finite)} \\ & (\forall x,y,z \in S) \\ && x+y &\in S &\text{($S$ is closed under $+$)}\\ && x*y &\in S &\text{($S$ is closed under $*$)}\\ && x+y&=y+x &\text{(commutativity of $+$)} \\ && (x+y)+z&=x+(y+z) &\text{(associativity of $+$)}\\ && (x*y)*z&=x*(y*z) &\text{(associativity of $*$)}\\ && z*(x+y)&=(z*x)+(z*y) &\text{($*$ is right-distributive over $+$)}\\ && (x+y)*z&=(x*z)+(y*z) &\text{($*$ is left-distributive over $+$)}\\ & (\exists T,F,\alpha_1,\beta_1,\gamma_1,&\\ & \alpha_2,\beta_2,\gamma_2 \in S)\\ && T &\neq F &\text{(distinct $T,F$)}\\ && \alpha_1*F+&\beta_1*F+\gamma_1=F&\text{( $F \land F = F$)}\\ && \alpha_1*F+&\beta_1*T+\gamma_1=F&\text{( $F \land T = F$)}\\ && \alpha_1*T+&\beta_1*F+\gamma_1=F&\text{( $T \land F = F$)}\\ && \alpha_1*T+&\beta_1*T+\gamma_1=T&\text{( $T \land T = T$)}\\ && \alpha_2*F+&\beta_2*F+\gamma_2=F&\text{( $F \lor F = F$)}\\ && \alpha_2*F+&\beta_2*T+\gamma_2=T&\text{( $F \lor T = T$)}\\ && \alpha_2*T+&\beta_2*F+\gamma_2=T&\text{( $T \lor F = T$)}\\ && \alpha_2*T+&\beta_2*T+\gamma_2=T&\text{( $T \lor T = T$)}\\ \end{aligned} \end{equation*}

I make no other assumptions, such as neutral elements or inverses. I also do not assume the $\alpha$'s, $\beta$'s, $\gamma$'s, $T$, and $F$ to be distinct, with the one exception that $T$ and $F$ must be distinct!

Trying to find an example:

I have used the alg software package (http://math.andrej.com/2011/01/22/alg/) before to find simpler structures, but I fail to find an example of this structure (the search keeps running). Can someone suggest an optimized algorithm for searching for a model/example?

If I omit either the four $\land$ conditions or the four $\lor$ conditions, models are found immediately. But the combination remains elusive.
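In the meantime, the smallest case can be checked by brute force. The sketch below (my own code, not part of the alg package; all names are ad hoc) enumerates every pair of binary operations on a two-element set, keeps the pairs satisfying the axioms above, and then looks for $T \neq F$ and coefficients satisfying the eight affine equations. Consistent with the accepted answer, it finds no model for $|S|=2$:

```python
from itertools import product

def search_two_element_models():
    """Brute-force search for a model of the theory with |S| = 2.

    A binary operation on S = {0, 1} is encoded as a dict mapping
    (x, y) pairs to values, so there are 2**4 = 16 candidates each
    for + and *.
    """
    S = (0, 1)
    pairs = list(product(S, repeat=2))
    triples = list(product(S, repeat=3))
    ops = [dict(zip(pairs, vals)) for vals in product(S, repeat=4)]
    models = []
    for add in ops:
        # + must be commutative and associative
        if any(add[x, y] != add[y, x] for x, y in pairs):
            continue
        if any(add[add[x, y], z] != add[x, add[y, z]] for x, y, z in triples):
            continue
        for mul in ops:
            # * must be associative and distribute over + on both sides
            if any(mul[mul[x, y], z] != mul[x, mul[y, z]] for x, y, z in triples):
                continue
            if any(mul[z, add[x, y]] != add[mul[z, x], mul[z, y]] or
                   mul[add[x, y], z] != add[mul[x, z], mul[y, z]]
                   for x, y, z in triples):
                continue
            # the affine form a*x + b*y + c
            aff = lambda a, b, c, x, y: add[add[mul[a, x], mul[b, y]], c]
            for T, F in ((0, 1), (1, 0)):
                AND = [(F, F, F), (F, T, F), (T, F, F), (T, T, T)]
                OR  = [(F, F, F), (F, T, T), (T, F, T), (T, T, T)]
                for a1, b1, c1, a2, b2, c2 in product(S, repeat=6):
                    if (all(aff(a1, b1, c1, x, y) == r for x, y, r in AND) and
                            all(aff(a2, b2, c2, x, y) == r for x, y, r in OR)):
                        models.append((add, mul, T, F))
    return models
```

This naive enumeration only settles the two-element case; since there are $n^{n^2}$ candidate operations on an $n$-element set, anything much larger needs a constraint-driven search like the one alg performs.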

Trying to find a contradiction, to prove that no such model exists:

I can prove that $F+F=F+T$: adding the 2nd and 3rd of the four $\land$ rules gives the same left-hand side as adding the 1st and 4th, so the corresponding sums of right-hand sides must agree.

Similarly, from the $\lor$ rules I can prove $T+T=F+T$.

Together these give $F+F=F+T=T+T$, but I fail to derive a contradiction...
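Written out in full for the $\land$ rules, regrouping with commutativity/associativity of $+$ and factoring with distributivity:
$$
\begin{aligned}
(\alpha_1*F+\beta_1*T+\gamma_1) + (\alpha_1*T+\beta_1*F+\gamma_1)
&= \alpha_1*(F+T)+\beta_1*(F+T)+\gamma_1+\gamma_1 \\
(\alpha_1*F+\beta_1*F+\gamma_1) + (\alpha_1*T+\beta_1*T+\gamma_1)
&= \alpha_1*(F+T)+\beta_1*(F+T)+\gamma_1+\gamma_1
\end{aligned}
$$
The two left-hand sides coincide, so the sums of the corresponding right-hand sides must too: $F+F = F+T$. The same manipulation on the $\lor$ rules yields $T+T = F+T$.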

Can you prove the original statement false (and hence no such structure could exist) or alternatively provide an example/model for this theory, proving that such a structure does exist?

EDIT:

I contacted Andrej Bauer (author of alg, linked above) seeking help with optimizing the search. He was kind enough not only to restructure the $\land$ and $\lor$ equations in the *.th file for more efficient searching, but also to post a rephrased version of my question on the research-level MathOverflow site. However, in our email discussion we focused on the affine relations, and I failed to emphasize that my $*$ is not required to be commutative. Keith Kearnes found an elegant proof that such a structure cannot exist if $*$ is commutative. So my original question (this page) remains unsolved, but we do know that if such a structure exists, its $*$ cannot be commutative.

EDIT 2:

We can also conclude $\alpha_1\neq\beta_1$: if they were equal then, using $F+F=F+T=T+T$ from above, all left-hand sides of the $\land$ equations would equal $\alpha_1*(F+T)+\gamma_1$, contradicting the fact that the right-hand sides of the 1st and 4th $\land$ rules differ. Similarly $\alpha_2\neq\beta_2$.
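In symbols: if $\alpha_1=\beta_1$, then by distributivity and $F+F=F+T=T+T$,
$$
\alpha_1*F+\alpha_1*F+\gamma_1 = \alpha_1*(F+F)+\gamma_1 = \alpha_1*(F+T)+\gamma_1 = \alpha_1*(T+T)+\gamma_1 = \alpha_1*T+\alpha_1*T+\gamma_1,
$$
so the 1st and 4th $\land$ rules would have equal left-hand sides while their right-hand sides are $F \neq T$.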

The answer is: there is no finite set $S$ with these properties, but there is an infinite set $S$ with them. I have posted the explanation on MathOverflow.


Edit 6/27/17.

I will edit my answer on this page to respond to the questions below about my original, longer answer there.

  1. I understand that, considering increasing powers of $\alpha$, after some number of steps the powers must enter a repeating cycle. But given $x$ initial powers and a cycle of length $y$, I fail to deduce that $\alpha^{2k}=\alpha^k$ for some $k$.

Suppose that $\alpha^r = \alpha^{r+s}$ for some $s>0$. By multiplying this equality by $\alpha^{r'-r}$ for $r'-r\geq 0$ we get $\alpha^{r'}=\alpha^{r'+s}$ for any $r'\geq r$. Also, by traversing the final cycle multiple times we get $\alpha^{r'}=\alpha^{r'+s}=\alpha^{r'+2s} = \cdots$, or just $\alpha^{r'}=\alpha^{r'+s'}$ for any $r'\geq r$ and any multiple $s'$ of $s$.

Using this information, let $k=r'=s'$ be some multiple of $s$ that is at least as large as $r$; i.e. $k=r'=s'=ms\geq r$ for some positive $m$. Then $\alpha^{2k}=\alpha^{r'+s'}=\alpha^{r'}=\alpha^k$.
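The argument above can be run as a tiny procedure. The sketch below is my own illustration (the function name and the choice of multiplication mod 100 as a stand-in finite monoid are arbitrary): it detects the tail length $r$ and cycle length $s$ of the powers of $a$, then returns the smallest multiple $k = ms \geq r$, for which $a^{2k} = a^k$.

```python
def idempotent_power(a, mul):
    """Return k >= 1 with a^(2k) = a^k, for an element a of a finite
    structure with associative multiplication mul."""
    seen = {}          # power value -> exponent at which it first appeared
    x, n = a, 1
    while x not in seen:
        seen[x] = n
        x = mul(x, a)
        n += 1
    r, s = seen[x], n - seen[x]      # tail length r, cycle length s
    # smallest multiple k of s with k >= r; then a^(2k) = a^(k+k) = a^k
    return ((r + s - 1) // s) * s
```

For example, with multiplication mod 100 the powers of $2$ enter a cycle of length 20 at exponent 2, so $k = 20$ and $2^{40} \equiv 2^{20} \pmod{100}$.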

  2. Another point is that my original question here does not assume an identity element, which is assumed in the commutativity part.

There are at least two responses to this objection: (i) an identity element can be adjoined if necessary, provided we are working with the standard definition of "polynomial", or (ii) the solution can be slightly modified to avoid any reference to an identity element. I'll explain (ii) here. First, I copy the relevant part of the solution:

Let $\overline{\mu}(x,y,z)=x + \alpha\beta y + \alpha\gamma z + D$. That is, delete the coefficient $\alpha$ from $x$ in the polynomial expression for $\mu(x,y,z)$ (or think of it as replacing $\alpha$ with $1$). Observe that the semimodule polynomials $\mu (x,y,z)$ and $\overline{\mu}(x,y,z)$ both have the same restriction to $\{T',F'\} = \{\alpha T, \alpha F\}$, since the polynomials only differ in their $x$-coefficient, the inputs all have $\alpha$ as a prefix, and $\alpha^2 = \alpha$. For example,
$$
\begin{array}{rl}
\overline{\mu}(F',T',T')&= \overline{\mu}(\alpha F,\alpha T,\alpha T)\\
&=(\alpha F) + \alpha\beta (\alpha T) + \alpha\gamma (\alpha T) + D\\
&= \alpha(\alpha F) + \alpha\beta (\alpha T) + \alpha\gamma (\alpha T) + D\\
&= \mu(\alpha F,\alpha T,\alpha T)\\
&= \mu(F',T',T').
\end{array}
$$


The above can be rewritten as follows:

Let $\overline{\mu}(x,y,z)=\alpha x + \alpha\beta\alpha y + \alpha\gamma\alpha z + D$. That is, right multiply the coefficients of $y$ and $z$ by $\alpha$ in the polynomial expression for $\mu(x,y,z)$. Observe that the semimodule polynomials $\mu (x,y,z)$ and $\overline{\mu}(x,y,z)$ both have the same restriction to $\{T',F'\} = \{\alpha T, \alpha F\}$, since the polynomials only differ in their $y$- and $z$-coefficients, the inputs all have $\alpha$ as a prefix, and $\alpha^2 = \alpha$. For example,
$$
\begin{array}{rl}
\overline{\mu}(F',T',T')&= \overline{\mu}(\alpha F,\alpha T,\alpha T)\\
&=\alpha(\alpha F) + \alpha\beta \alpha(\alpha T) + \alpha\gamma \alpha(\alpha T) + D\\
&=\alpha(\alpha F) + \alpha\beta (\alpha^2 T) + \alpha\gamma (\alpha^2 T) + D\\
&= \alpha(\alpha F) + \alpha\beta (\alpha T) + \alpha\gamma (\alpha T) + D\\
&= \mu(\alpha F,\alpha T,\alpha T)\\
&= \mu(F',T',T').
\end{array}
$$


The key point is that, in either version of the argument, the coefficient of $x$ in $\overline{\mu}(x,y,z) = \alpha x + \alpha\beta\alpha y + \alpha\gamma\alpha z + D$, which is $\alpha$, commutes with the coefficients of $y$ and $z$, since $\alpha(\alpha\beta\alpha) = \alpha\beta\alpha = (\alpha\beta\alpha)\alpha$ and $\alpha(\alpha\gamma\alpha) = \alpha\gamma\alpha = (\alpha\gamma\alpha)\alpha$.
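As a concrete sanity check of this commutation (my own example, not from the solution; the helper `mat2` is ad hoc): in the noncommutative semiring of $2\times 2$ integer matrices, the idempotent $\alpha = \begin{pmatrix}1&0\\0&0\end{pmatrix}$ is not central, yet it commutes with $\alpha\beta\alpha$ and $\alpha\gamma\alpha$ for arbitrary $\beta,\gamma$:

```python
def mat2(a, b):
    """Multiply 2x2 matrices given as tuples of row tuples."""
    return tuple(
        tuple(sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2))
        for i in range(2)
    )

alpha = ((1, 0), (0, 0))   # idempotent: mat2(alpha, alpha) == alpha
beta  = ((1, 2), (3, 4))
gamma = ((0, 1), (5, 2))

assert mat2(alpha, alpha) == alpha
assert mat2(alpha, beta) != mat2(beta, alpha)   # alpha is not central
for coef in (mat2(alpha, mat2(beta, alpha)),
             mat2(alpha, mat2(gamma, alpha))):
    assert mat2(alpha, coef) == coef            # alpha * (a b a) = a b a
    assert mat2(coef, alpha) == coef            # (a b a) * alpha = a b a
```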