Mechanizing Herbrand’s theorem


I study automated theorem proving using this book.

The author describes mechanizing Herbrand’s theorem via the Gilmore procedure: Skolemize the negated FOL formula, then test growing conjunctions of ground instances for propositional satisfiability. This basically reduces FOL proving to a Boolean satisfiability problem.

There is a simple example (the Drinker paradox) that I cannot understand:

  1. FOL formula: $\exists x\, \forall y\, (P(x) \implies P(y))$
  2. Negated and Skolemized: $\forall x\, (P(x) \wedge \neg P(f_y(x)))$
  3. The first ground instance to be generated is: $P(c) \wedge \neg P(f_y(c))$
  4. Since this is still propositionally satisfiable, a second instance is generated: $P(f_y(c)) \wedge \neg P(f_y(f_y(c)))$
  5. Since the conjunction of these two instances is propositionally unsatisfiable (the conjunction includes both $P(f_y(c))$ and its negation), the procedure terminates, indicating that two ground instances were used and that the formula is valid as claimed.
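The loop in steps 3–5 can be sketched in Python (my own illustration for this one example, not the book's code; the string encoding of terms and atoms is an assumption):

```python
# Gilmore procedure, specialized to the Skolemized formula P(x) ∧ ¬P(f_y(x)).
# Ground instances are generated over the Herbrand universe
# {c, f_y(c), f_y(f_y(c)), ...}; atoms are plain strings, and the conjunction
# of instances is checked for propositional satisfiability by brute force.
from itertools import product

def herbrand_terms():
    """Enumerate the Herbrand universe: c, f_y(c), f_y(f_y(c)), ..."""
    t = "c"
    while True:
        yield t
        t = f"f_y({t})"

def instance(t):
    """Ground instance of P(x) ∧ ¬P(f_y(x)) with x := t, as (pos, neg) atoms."""
    return (f"P({t})", f"P(f_y({t}))")

def satisfiable(instances):
    """Brute-force SAT check of the conjunction of all (pos ∧ ¬neg) pairs."""
    atoms = sorted({a for pair in instances for a in pair})
    for bits in product([False, True], repeat=len(atoms)):
        val = dict(zip(atoms, bits))
        if all(val[p] and not val[n] for p, n in instances):
            return True
    return False

def gilmore():
    """Add ground instances until the conjunction becomes propositionally UNSAT."""
    instances = []
    for t in herbrand_terms():
        instances.append(instance(t))
        if not satisfiable(instances):
            return len(instances)  # number of ground instances used

print(gilmore())  # 2: UNSAT is reached after the second instance
```

Note that `satisfiable` is called on the *conjunction* of all instances generated so far, not on each instance separately.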

Why is 4 UNSAT if 3 is SAT?
As I understand it: since $c \neq f_y(c)$, $P(c) \wedge \neg P(f_y(c))$ is the same as the propositional formula $a \wedge \neg b$ (satisfied by $a=1$ and $b=0$). Following this logic: since $f_y(c) \neq f_y(f_y(c))$, $P(f_y(c)) \wedge \neg P(f_y(f_y(c)))$ is the same as $c_2 \wedge \neg c_3$ and should be SAT.
Can you please explain where the mistake in my understanding is?


Best answer:

The negation of $\exists x\, \forall y\, (P(x)\rightarrow P(y))$ is $\forall x\, \exists y\, (P(x)\land \lnot P(y))$.

Skolemizing gives $\forall x\, (P(x)\land \lnot P(f_y(x)))$.

In particular, for a constant $c$, we have $P(c)\land \lnot P(f_y(c))$, and we also have $P(f_y(c))\land \lnot P(f_y(f_y(c)))$. Each of these is satisfiable on its own, but their conjunction is unsatisfiable, because the first implies $\lnot P(f_y(c))$, while the second implies $P(f_y(c))$.

The conclusion is that $\forall x\, \exists y\, (P(x)\land \lnot P(y))$ is not satisfiable, so $\exists x\, \forall y\, (P(x)\rightarrow P(y))$ is valid.
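To see the unsatisfiability concretely, here is a brute-force check (a sketch of my own, treating the three distinct ground atoms $P(c)$, $P(f_y(c))$, $P(f_y(f_y(c)))$ as propositional variables $a$, $b$, $c$):

```python
# Check whether P(c) ∧ ¬P(f_y(c)) ∧ P(f_y(c)) ∧ ¬P(f_y(f_y(c))) has a
# satisfying assignment, with a = P(c), b = P(f_y(c)), c_ = P(f_y(f_y(c))).
from itertools import product

sat = any(
    a and not b and b and not c_
    for a, b, c_ in product([False, True], repeat=3)
)
print(sat)  # False: no assignment makes both b and ¬b true
```

The check fails for every assignment because the conjunction demands both $b$ and $\lnot b$, which is exactly the clash between $\lnot P(f_y(c))$ from the first instance and $P(f_y(c))$ from the second.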