Need help writing a proof regarding first-order formulas


I'm trying to write a proof of the following claim: for any formula φ, any two assignments g and g′ that differ only in what they assign to variables not occurring in φ, and any model M (of the appropriate vocabulary), we have M, g ⊨ φ iff M, g′ ⊨ φ.
This looks like a long proof to me, since the definition of satisfaction has six clauses. I began like this:
The clauses defining satisfaction of a formula $\phi$ in a model $M$ are given below:
$ M,g\models R(\tau_1,...,\tau_n) \iff (I_F^g(\tau_1),...,I_F^g(\tau_n)) \in F(R) $
$ M,g\models \neg \phi \iff \text{ not } M,g \models \phi $
$ M,g\models \phi \land \psi \iff M,g \models \phi \text{ and } M,g\models \psi $
$ M,g\models \phi \lor \psi \iff M,g \models \phi \text{ or } M,g\models \psi $
$ M,g\models \exists x\phi \iff M,g_x\models \phi \text{ for some } x\text{-variant } g_x \text{ of } g $
$ M,g\models \forall x\phi \iff M,g_x\models \phi \text{ for every } x\text{-variant } g_x \text{ of } g $
We need to check each of these conditions to prove that the claim is true. For the first condition we have,
$ M,g\models R(\tau_1,...,\tau_n) \iff (I_F^g(\tau_1),...,I_F^g(\tau_n)) \in F(R) $
$ M,g'\models R(\tau_1,...,\tau_n) \iff (I_F^{g'}(\tau_1),...,I_F^{g'}(\tau_n)) \in F(R) $
It is given that the assignments $g$ and $g'$ differ only in variables that do not occur in $\phi$. In particular, every variable occurring in the terms $\tau_1, \ldots, \tau_n$ receives the same value under $g$ and $g'$, so $I_F^g(\tau_i) = I_F^{g'}(\tau_i)$ for each $i$. Hence the tuples $(I_F^g(\tau_1),\ldots,I_F^g(\tau_n))$ and $(I_F^{g'}(\tau_1),\ldots,I_F^{g'}(\tau_n))$ coincide, and the first condition holds.
From the second condition we have,
$ M,g\models \neg \phi \iff \text{ not } M,g \models \phi $
$ M,g'\models \neg \phi \iff \text{ not } M,g' \models \phi $
These clauses say that $M$ satisfies $\neg\phi$ under an assignment exactly when it does not satisfy $\phi$ under that assignment. Could it happen that $M,g \not\models \phi$ while $M,g' \models \phi$? No: all variables occurring in $\phi$ receive the same values under $g$ and $g'$, so $M,g \models \phi$ iff $M,g' \models \phi$, and therefore $M,g \models \neg\phi$ iff $M,g' \models \neg\phi$. Thus the second condition holds.

For the third and fourth conditions, we have,
$ M,g\models \phi \land \psi \iff M,g \models \phi \text{ and } M,g\models \psi $
$ M,g\models \phi \lor \psi \iff M,g \models \phi \text{ or } M,g\models \psi $
$ M,g'\models \phi \land \psi \iff M,g' \models \phi \text{ and } M,g'\models \psi $
$ M,g'\models \phi \lor \psi \iff M,g' \models \phi \text{ or } M,g'\models \psi $
Now suppose $x_1$ and $y_1$ are free variables in $\phi$ and $\psi$ respectively. They remain free under any Boolean combination of $\phi$ and $\psi$: by the inductive definition of free and bound variables, if an occurrence of a variable is free in $\phi$ or in $\psi$, then that same occurrence is free in $(\phi \land \psi)$ and in $(\phi \lor \psi)$. Since $g$ and $g'$ agree on all variables occurring in $\phi$ and $\psi$, it follows that $M,g\models \phi \land \psi$ iff $M,g' \models \phi \land \psi$, and likewise $M,g\models \phi \lor \psi$ iff $M,g' \models \phi \lor \psi$. Thus these conditions hold.

From the fifth and sixth conditions we have,
$ M,g\models \exists x\phi \iff M,g_x\models \phi \text{ for some } x\text{-variant } g_x \text{ of } g $
$ M,g\models \forall x\phi \iff M,g_x\models \phi \text{ for every } x\text{-variant } g_x \text{ of } g $
$ M,g'\models \exists x\phi \iff M,g'_x\models \phi \text{ for some } x\text{-variant } g'_x \text{ of } g' $
$ M,g'\models \forall x\phi \iff M,g'_x\models \phi \text{ for every } x\text{-variant } g'_x \text{ of } g' $

I know this question is very long, but am I going the right way? Please let me know whether this is the correct way to prove the claim. I've spent many days on this proof and I'm still not sure it's the right approach.

Best Answer

You're definitely on the right track. Here's a formal way to handle this.

First, recall the definition of the interpretation of a term. Given a term $t$, we define $g(t) \in M$ recursively: $g(x) = g(x)$ if $x$ is a variable [where the left side is the interpretation being defined and the right side applies the assignment function $g$ to $x$], and $g(f(t_1, \ldots, t_n)) = f^M(g(t_1), \ldots, g(t_n))$ for $f$ an $n$-ary function symbol and $t_1, \ldots, t_n$ terms.

Lemma 0: Suppose that for all variables $x$ occurring in a term $t$, we have $g(x) = g'(x)$. Then $g(t) = g'(t)$.

Proof: proceed via structural induction on $t$.
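The answer leaves the structural induction as an exercise; as a concrete illustration, here is a small Python sketch of term evaluation and of Lemma 0. The encoding (strings for variables, tuples for function applications) and the names `eval_term`, `term_vars`, and `funcs` are my own illustrative assumptions, not from the answer.

```python
# Illustrative encoding (my assumption): a term is either a string (a
# variable) or a tuple ("f", t1, ..., tn) for a function application.

def eval_term(t, g, funcs):
    """Compute g(t): look up a variable, else apply f^M to evaluated subterms."""
    if isinstance(t, str):
        return g[t]                      # base case: g(x)
    f, *args = t
    return funcs[f](*(eval_term(a, g, funcs) for a in args))

def term_vars(t):
    """All variables occurring in t (the hypothesis of Lemma 0 ranges over these)."""
    if isinstance(t, str):
        return {t}
    return set().union(*(term_vars(a) for a in t[1:]))

# g and g2 agree on term_vars(t) = {"x", "y"}, so by Lemma 0 they give t the
# same value, even though they disagree on the extra variable z.
funcs = {"plus": lambda a, b: a + b}
t = ("plus", "x", ("plus", "y", "x"))
g = {"x": 1, "y": 2, "z": 10}
g2 = {"x": 1, "y": 2, "z": -5}
```

Here `eval_term(t, g, funcs)` and `eval_term(t, g2, funcs)` both evaluate to $1 + (2 + 1) = 4$, matching what Lemma 0 predicts.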

We then prove

Lemma 1: Suppose $g(t_i) = g'(t_i)$ for $i = 1, 2$. Then $M, g \models t_1 = t_2$ if and only if $M, g' \models t_1 = t_2$.

and

Lemma 2: Suppose that $R$ is an $n$-ary relation symbol and that $t_1, \ldots, t_n$ are terms. Suppose for all $i$, $g(t_i) = g'(t_i)$. Then $M, g \models R(t_1, \ldots, t_n)$ iff $M, g' \models R(t_1, \ldots, t_n)$.

We then prove

Lemma 3: For all $\star \in \{\top, \bot\}$, $M, g \models \star$ iff $M, g' \models \star$.

Lemma 4: Suppose that for $i = 1, 2$, we have $M, g \models \phi_i$ iff $M, g' \models \phi_i$. Then for all $\star \in \{\land, \lor, \to\}$, we have $M, g \models \phi_1 \star \phi_2$ iff $M, g' \models \phi_1 \star \phi_2$.

using the definition of $\models$.

Noting that $\neg \phi$ is just an abbreviation for $\phi \to \bot$, we get that

Lemma 5: Suppose $M, g \models \phi$ iff $M, g' \models \phi$. Then $M, g \models \neg \phi$ iff $M, g' \models \neg \phi$.

Now, we come to the subtler bit. Here, $g[x \mapsto y]$ is the function which sends $x$ to $y$ but otherwise agrees with $g$.

Lemma 6: Suppose that for all $y \in M$, we have $M, g[x \mapsto y] \models \phi$ iff $M, g'[x \mapsto y] \models \phi$. Then for $\star \in \{\exists, \forall\}$, we have $M, g \models \star x \phi$ iff $M, g' \models \star x \phi$.
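Over a finite domain, Lemma 6 can be made concrete: an $x$-variant is just an updated assignment, and the quantifier clauses reduce to `any`/`all`. A minimal sketch, assuming a finite domain and treating the satisfaction test for $\phi$ as a black-box function; the names `updated`, `sat_exists`, and `sat_forall` are hypothetical.

```python
def updated(g, x, y):
    """The assignment g[x -> y]: sends x to y, otherwise agrees with g."""
    h = dict(g)
    h[x] = y
    return h

def sat_exists(domain, g, x, sat_phi):
    """M, g |= exists x phi  iff  some x-variant of g satisfies phi."""
    return any(sat_phi(updated(g, x, y)) for y in domain)

def sat_forall(domain, g, x, sat_phi):
    """M, g |= forall x phi  iff  every x-variant of g satisfies phi."""
    return all(sat_phi(updated(g, x, y)) for y in domain)
```

For example, with domain $\{0, 1, 2\}$ and `sat_phi = lambda h: h["x"] > 1`, `sat_exists` returns `True` (via the variant sending $x$ to $2$) while `sat_forall` returns `False`.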

You've completed these pieces so far. In order to put them all together, we now state the following theorem:

Let $\phi$ be a formula. Suppose we have partial variable assignments $g, g'$, and suppose that for all $x$ occurring free in $\phi$, $g(x) = g'(x)$. Then $M, g \models \phi$ iff $M, g' \models \phi$.

Proof: we proceed by structural induction on $\phi$.

  1. $\phi$ is of the form $t_1 = t_2$. In this case, for $i = 1, 2$, every variable $x$ occurring in $t_i$ satisfies $g(x) = g'(x)$, so $g(t_i) = g'(t_i)$ by Lemma 0. By Lemma 1, we have $M, g \models \phi$ iff $M, g' \models \phi$.
  2. $\phi$ is of the form $R(t_1, \ldots, t_n)$. For each $i$, every variable $x$ occurring in $t_i$ satisfies $g(x) = g'(x)$, so $g(t_i) = g'(t_i)$ by Lemma 0. Therefore $M, g \models \phi$ iff $M, g' \models \phi$ by Lemma 2.
  3. $\phi$ is of the form $\top$ or $\bot$. Apply Lemma 3.
  4. $\phi$ is of the form $\psi_1 \star \psi_2$ for some $\star \in \{\land, \lor, \to\}$. Since $FreeVariables(\phi) = FreeVariables(\psi_1) \cup FreeVariables(\psi_2)$, we see that for $i = 1, 2$, every $x$ occurring free in $\psi_i$ satisfies $g(x) = g'(x)$. Therefore $M, g \models \psi_i$ iff $M, g' \models \psi_i$ by the inductive hypothesis. Apply Lemma 4.
  5. $\phi$ is of the form $\neg \psi$. Then $FreeVariables(\phi) = FreeVariables(\psi)$, so we can apply the inductive hypothesis and Lemma 5.
  6. $\phi$ is of the form $\star x \psi$ for some $\star \in \{\exists, \forall\}$. Then $FreeVariables(\phi) = FreeVariables(\psi) \setminus \{x\}$, so $FreeVariables(\psi) \subseteq FreeVariables(\phi) \cup \{x\}$. Consider an arbitrary $y \in M$. Then $g[x \mapsto y]$ and $g'[x \mapsto y]$ agree on all elements of $FreeVariables(\phi) \cup \{x\}$, so by the inductive hypothesis, $M, g[x \mapsto y] \models \psi$ iff $M, g'[x \mapsto y] \models \psi$. Therefore, by Lemma 6, we have $M, g \models \star x \psi$ iff $M, g' \models \star x \psi$.

The proof is now complete. $\square$
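To see the whole theorem in action, here is an executable sketch of the satisfaction relation over a finite model with purely relational atomic formulas (terms are just variables, so Lemma 0 is trivial here). The formula encoding, the names `free_vars` and `sat`, and the sample model are my own illustrative choices, not part of the answer.

```python
# Illustrative encoding (my assumption): formulas are nested tuples
# ("R", x1, ...), ("not", p), ("and", p, q), ("or", p, q),
# ("exists", x, p), ("forall", x, p); terms are variable names (strings).

def free_vars(phi):
    op = phi[0]
    if op in ("exists", "forall"):
        return free_vars(phi[2]) - {phi[1]}    # FreeVariables(psi) \ {x}
    if op == "not":
        return free_vars(phi[1])
    if op in ("and", "or"):
        return free_vars(phi[1]) | free_vars(phi[2])
    return set(phi[1:])                        # atomic R(x1, ..., xn)

def sat(model, g, phi):
    """model = (domain, rels), where rels maps a relation symbol to a set of tuples."""
    domain, rels = model
    op = phi[0]
    if op == "not":
        return not sat(model, g, phi[1])
    if op == "and":
        return sat(model, g, phi[1]) and sat(model, g, phi[2])
    if op == "or":
        return sat(model, g, phi[1]) or sat(model, g, phi[2])
    if op in ("exists", "forall"):
        variants = [sat(model, {**g, phi[1]: y}, phi[2]) for y in domain]
        return any(variants) if op == "exists" else all(variants)
    return tuple(g[x] for x in phi[1:]) in rels[op]   # atomic case

# The theorem, concretely: g and g2 agree on free_vars(phi) = {"x"} but
# disagree on z, which does not occur in phi, so sat gives the same verdict.
M = ({0, 1, 2}, {"R": {(0, 1), (1, 2)}})
phi = ("exists", "y", ("R", "x", "y"))
g = {"x": 0, "z": 7}
g2 = {"x": 0, "z": 99}
```

Both `sat(M, g, phi)` and `sat(M, g2, phi)` evaluate to `True`, and the theorem guarantees they must agree.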