Showing that every consistent set of sentences has a model


I want to give a short proof that every consistent set of sentences has a model. I am assuming the derivability version of completeness for first-order logic, in the form: $$\Sigma \models F \;\implies\; \Sigma \vdash F.$$

So, suppose $\Sigma$ is consistent; I want to show it is satisfiable. I believe I need to look at a corresponding result for propositional logic, but I am unsure of where to go from here.

1 Answer


If I correctly understand the question, we know that "If $F$ is a semantic consequence of $\Sigma$, then it is deducible from $\Sigma$" and we should prove that "If $\Sigma$ is consistent, then it has a model." Given a consistent $\Sigma$, apply the given assumption with some contradiction as $F$, say $p\land\neg p$ for some $p$ of your choice. If this $F$ were deducible from $\Sigma$, then $\Sigma$ would be inconsistent (either immediately or after a short proof, depending on your definition of "consistent"). So, since $\Sigma$ is consistent, $F$ is not deducible from $\Sigma$ and therefore (by the given knowledge) not a semantic consequence of $\Sigma$. That means, by definition, that there is a model in which $\Sigma$ is true and $F$ is false. In particular, there is a model in which $\Sigma$ is true.
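The shape of this argument can be captured in a short Lean sketch. Here `Entails` and `Derives` are hypothetical abstract predicates standing in for "$\Sigma \models F$" and "$\Sigma \vdash F$", and `falsum` stands for the chosen contradiction; this is only a schematic rendering of the logic, not a formalization of first-order semantics.

```lean
-- Hypothetical abstract setup (not Lean's built-in model theory):
-- `Entails S F` plays the role of "F is a semantic consequence of S",
-- `Derives S F` the role of "F is deducible from S".
variable {Sentence : Type} (Entails Derives : Set Sentence → Sentence → Prop)

-- From completeness (Entails implies Derives, applied at `falsum`)
-- and consistency (`falsum` is not derivable from S), we conclude
-- that `falsum` is not a semantic consequence of S.  Unfolding the
-- definition of semantic consequence, some model satisfies S while
-- falsifying `falsum` -- in particular, S has a model.
theorem not_entails_falsum
    (S : Set Sentence) (falsum : Sentence)
    (completeness : Entails S falsum → Derives S falsum)
    (consistent : ¬ Derives S falsum) :
    ¬ Entails S falsum :=
  fun h => consistent (completeness h)
```

The proof term is just contraposition: if `falsum` were a semantic consequence, completeness would make it derivable, contradicting consistency.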