Conditional Introduction Rule


In the derivation (the image below) the author shows that given the premise $\neg S \land \neg J$, the conclusion $S \implies J$ follows. I find all these deductive maneuvers for concluding implications confusing. In this case you take the premise $\neg S \land \neg J$ (which implies that $S$ is false), then you make the assumption that $S$ is true (!). Then you ask: can $J$ be false? You answer: no, there is a contradiction (which doesn't even involve $J$), so $J$ must be true. Then you conclude: if $S$ is true, then $J$ is true.

As I understand it, if you assume both $S$ and $\neg S$, you can prove both $X$ and $\neg X$ for any $X$: suppose we have $X$; this leads to a contradiction, since we have $S$ and $\neg S$ at the same time, so we cannot have $X$ and must have $\neg X$. By the same method we can show that $\neg X$ also leads to a contradiction, so we must have $X$.
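For concreteness, this "explosion" effect is easy to check in a proof assistant; here is a minimal sketch in Lean 4 (the names `S`, `X`, `hS`, `hnS` are mine, not from the image):

```lean
-- From the contradictory assumptions S and ¬S, anything follows:
-- both X and ¬X are derivable, for any proposition X.
example (S X : Prop) (hS : S) (hnS : ¬S) : X  := absurd hS hnS
example (S X : Prop) (hS : S) (hnS : ¬S) : ¬X := absurd hS hnS
```

Here `absurd : a → ¬a → b` packages exactly the step "we have $S$ and $\neg S$, so conclude whatever you like".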

Can you clarify this Conditional Introduction Rule (image below), which allows you to make assumptions that contradict each other and from this conclude implications?



Best answer

There are three rules in play in the proof:

Conditional Introduction ($\rightarrow$-I; written $>$I in the top chart): if we have a derivation of $\psi$ from $\varphi$, then we can derive $(\varphi \rightarrow \psi)$

Negation Elimination ($\lnot$-E): if we have derived $\lnot \lnot \varphi$, then we can derive $\varphi$ (also called: Double Negation)

Negation Introduction ($\lnot$-I): if from $\varphi$ we have derived a contradiction, then we can derive $\lnot \varphi$.

Using Natural Deduction, we may state the above rules as follows [see Ian Chiswell & Wilfrid Hodges, Mathematical Logic (2007), pages 17 and 24]:

($\rightarrow$-I) $$\frac { \begin{array}{c} [\varphi] \\ \vdots \\ \psi \end{array} } {(\varphi \rightarrow \psi)}$$

($\lnot$-E) $$\frac { \lnot \lnot \varphi } { \varphi }$$

($\lnot$-I) $$\frac { \begin{array}{c} [\varphi] \\ \vdots \\ \bot \end{array} } { \lnot \varphi }$$

where $\bot$ stands for a contradiction.
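The shape of the three rules can also be rendered as Lean 4 terms; this is only a sketch of each rule's type, under the usual identification of $\lnot \varphi$ with $\varphi \rightarrow \bot$ (the classical axiom behind `Classical.byContradiction` is what licenses $\lnot$-E):

```lean
-- →-I: a derivation of ψ from the assumption φ becomes a proof of φ → ψ
example (φ ψ : Prop) (d : φ → ψ) : φ → ψ := fun h => d h

-- ¬-E (Double Negation): classical, via proof by contradiction
example (φ : Prop) (h : ¬¬φ) : φ := Classical.byContradiction h

-- ¬-I: if φ leads to a contradiction (False), conclude ¬φ
-- (in Lean, ¬φ is definitionally φ → False, so d itself is the proof)
example (φ : Prop) (d : φ → False) : ¬φ := d
```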

This is the proof :

(1) $\lnot S \land \lnot J$ --- premise

(2) $S$ --- assumption $A_1$

(3) $\lnot J$ --- assumption $A_2$

(4) $\lnot S$ --- from (1) by $\land$-E

(5) $\lnot \lnot J$ --- from (3) and the contradiction (2)-(4) by $\lnot$-I, "discharging" assumption $A_2$

(6) $J$ --- from (5) by $\lnot$-E

(7) $S \rightarrow J$ --- from (2) and (6) by $\rightarrow$-I, "discharging" assumption $A_1$ --- conclusion.
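The seven steps can be transcribed as a single Lean 4 term, a sketch in which `Classical.byContradiction` plays the combined role of $\lnot$-I followed by $\lnot$-E (steps (5)-(6)):

```lean
-- (1) the premise ¬S ∧ ¬J is h1; the goal is S → J.
example (S J : Prop) (h1 : ¬S ∧ ¬J) : S → J :=
  fun h2 : S =>                  -- (2) assumption A₁, discharged by →-I at the end
    Classical.byContradiction    -- (5)-(6) ¬¬J, then J, discharging A₂
      (fun h3 : ¬J =>            -- (3) assumption A₂
        absurd h2 h1.left)       -- (4) ¬S from (1) by ∧-E; contradiction with (2)
```

Note that `h3` is never used: exactly as in the question, the contradiction does not involve $J$ at all, yet the rules still let us discharge the assumption $\lnot J$.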

As you can see, it is not Conditional Intro that "manages" contradictions; as per the comment above, it licenses us to pass from a derivation of $\psi$ under the assumption $\varphi$ to the conclusion $\varphi \rightarrow \psi$, which no longer "depends" on the assumption $\varphi$.

Formally, the rule says that:

from the derivation $\Gamma \cup \{ \varphi \} \vdash \psi$, we may obtain a new derivation: $\Gamma \vdash \varphi \rightarrow \psi$.

Here $\Gamma$ is a (possibly empty) set of assumptions that stays unchanged, while the assumption $\varphi$ of the initial derivation has been "discharged" in the new derivation.

The contradiction emerges from the "joint assertion" of the initial premise $\lnot S \land \lnot J$ and the two (temporary) assumptions $S$ and $\lnot J$.

The contradiction tells us that the three cannot all be asserted simultaneously; the rule for the management of contradictions ($\lnot$-I) gives us the "liberty" to "blame" one of the three (we have to choose one), asserting its contradictory.

Thus, we decide to "blame" $\lnot J$, "suppress" it from the set of assumptions, and derive its contradictory: $J$.

Having done this, we use Conditional Intro with the remaining (temporary) assumption $S$, and we conclude with $S \rightarrow J$.

The premise $\lnot S \land \lnot J$ "stays unchanged"; it is the only member of the set $\Gamma$.

Thus the last step is exactly: from the derivation $\Gamma \cup \{ S \} \vdash J$, obtain the new derivation $\Gamma \vdash S \rightarrow J$, with $\Gamma = \{ \lnot S \land \lnot J \}$.