Name for logical algebra in which $A \lor B = A + B - AB$?


An answer from 5 years ago provides simple rules for converting logical propositions to ordinary elementary algebraic expressions:

$$\lnot{A} = 1 - A$$ $$A \land B = AB$$ $$A \lor B = A + B - AB$$

where $A$ and $B$ can take only the values 1 for "TRUE" and 0 for "FALSE", so $A^2 = A$ and $A^2B = AB$.
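As a quick sanity check (my addition, not part of the original question), these translations can be verified exhaustively over the four truth assignments:

```python
# Exhaustively check the logic-to-algebra translations on 0/1 values.
for A in (0, 1):
    for B in (0, 1):
        assert 1 - A == int(not A)            # NOT:  1 - A
        assert A * B == int(A and B)          # AND:  AB
        assert A + B - A * B == int(A or B)   # OR:   A + B - AB
        assert A * A == A                     # idempotence: A^2 = A
print("all translations check out")
```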

Is there a commonly used name for this set of rules? It is not Boolean Algebra, because in Boolean Algebra the rules of elementary algebra do not apply: there $A \lor B$ is written $A + B$, and $1 + 1 = 1$. It is not exactly GF(2) (i.e., integer arithmetic modulo two) either, because I am using the "minus" sign and $1 + 1 = 2$.

I ask because I am writing a basic introduction to probability from a Jaynesian/Keynesian point of view and want to go straight from logic to probability without going through set theory à la Kolmogorov. I gather that the above rules are widely used in programming languages where GF(2) is not implemented; see the Wikipedia article. If there is no accepted name for this, I will just call it "converting logical propositions to ordinary algebra limited to 0/1 variables".

After 1 month, I am answering my own question. I found a 1981 paper by Theodore Hailperin called "Boole's Algebra Isn't Boolean Algebra". I also read some of Hailperin's 1986 book, "Boole's Logic and Probability". However, I am hesitant to refer to the system that I use as "Boole's Algebra".

Norman Wildberger's YouTube channel has a series of videos distinguishing the "Algebra of Boole" from Boolean Algebra. Wildberger's "Algebra of Boole" is arithmetic modulo 2 in which "+" means XOR and 1 + 1 = 0. In modern Boolean Algebra, "+" means (inclusive) OR and 1 + 1 = 1. I prefer Wildberger's "Algebra of Boole" to Boolean Algebra, but the system that I am using is neither.

I am now following Blitzstein and Hwang more closely and introducing an indicator function $I(\cdot)$ that converts logical statements into ordinary high school algebraic expressions on 0/1 variables. The 0/1 variable corresponding to proposition $A$ is written $I(A)$ or $I_A$. I gave up on allowing $A$ to represent either a logical proposition or an algebraic 0/1 variable.

Ordinary algebra allows the "$-$" (minus) sign, and $1 + 1 = 2$. Since the variables are limited to 0 and 1, and since $0^2 = 0$ and $1^2 = 1$, we have $I_A^2 = I_A$.
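For instance (an illustration of my own, consistent with the rules above), the OR translation can be re-derived from the NOT and AND rules via De Morgan's law, staying entirely within ordinary algebra on 0/1 variables:

\begin{align*}
I(A \cup B) &= 1 - I(\overline{A} \cap \overline{B})\\
&= 1 - (1 - I_A)(1 - I_B)\\
&= I_A + I_B - I_AI_B
\end{align*}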

This system works fine and translates directly into probability rules. For example, here is how I present the XOR probability rule:

\begin{align*}
(A \text{ XOR } B) &\equiv (A \cap \overline{B}) \cup (\overline{A} \cap B)\\
I(A \text{ XOR } B) &= I_A(1-I_B) + (1-I_A)I_B - I_A(1-I_B)(1-I_A)I_B\\
&= I_A - I_AI_B + I_B - I_AI_B - 0\\
&= I_A + I_B - 2I_AI_B\\
P(A \text{ XOR } B) &= P(A) + P(B) - 2P(A \cap B)
\end{align*}

The last step uses $P(\cdot)$ to take indicator variables to probabilities. Like the expectation operator $E(\cdot)$, it is linear in the indicator variables, so it does not factor products: $P(I_AI_B) \neq P(A)P(B)$; instead $P(I_AI_B) = P(A \cap B)$. Blitzstein and Hwang do something similar and call it the "Fundamental Bridge". This avoids having the "+" sign mean two different things. In Boolean expressions, I use "$\cup$" and reserve the "+" sign to mean what we all learned in elementary school.
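The algebra in the XOR derivation can be checked mechanically (my addition): over all four 0/1 assignments, the polynomial before simplification, the multilinear polynomial $I_A + I_B - 2I_AI_B$, and the XOR truth table all agree.

```python
# Verify that I_A + I_B - 2*I_A*I_B is the XOR indicator.
for a in (0, 1):
    for b in (0, 1):
        # Unsimplified form: I(A and not B) + I(not A and B) minus their product.
        lhs = a * (1 - b) + (1 - a) * b - a * (1 - b) * (1 - a) * b
        assert lhs == a + b - 2 * a * b == (a != b)
print("XOR identity holds on all 0/1 assignments")
```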