Multiple Conditioning on Event Probabilities


I am trying to understand what's wrong with the following logic related to "multiple conditioning." Why is the probability of [(A given B) given C] not the same as the probability of [A given (B and C)] ? I know it's not true, but only because numbers disagree. I am having a hard time parsing what's wrong with the logic.

There are 3 answers below.


By definition, $\Pr(X \mid Y) = \dfrac{\Pr(X \cap Y)}{\Pr(Y)}$, so
$$\Pr((A\mid B)\mid C) = \dfrac{\Pr(A \cap B\mid C)}{\Pr(B\mid C)} = \dfrac{\Pr(A \cap B\cap C)/\Pr(C)}{\Pr(B\cap C)/\Pr(C)} = \dfrac{\Pr(A \cap B\cap C)}{\Pr(B\cap C)} = \Pr(A \mid B \cap C),$$
so they are the same.
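As a quick numerical sanity check of this identity, here is a small Python sketch: it draws an arbitrary joint distribution over the eight atoms of $A$, $B$, $C$ (the weights are made-up numbers, not from the question) and compares both sides.

```python
import random

random.seed(0)
# Hypothetical joint distribution over the eight atoms of A, B, C.
w = [random.random() for _ in range(8)]
total = sum(w)
p = [x / total for x in w]

# Atom i has membership bits: bit 2 = "in A", bit 1 = "in B", bit 0 = "in C".
def prob(pred):
    """Sum the probabilities of all atoms satisfying pred(a, b, c)."""
    return sum(p[i] for i in range(8) if pred((i >> 2) & 1, (i >> 1) & 1, i & 1))

p_ABC = prob(lambda a, b, c: a and b and c)   # P(A ∩ B ∩ C)
p_BC  = prob(lambda a, b, c: b and c)         # P(B ∩ C)
p_C   = prob(lambda a, b, c: c)               # P(C)

lhs = (p_ABC / p_C) / (p_BC / p_C)  # P((A|B)|C) via the quotient definition
rhs = p_ABC / p_BC                  # P(A | B ∩ C)
assert abs(lhs - rhs) < 1e-12
```

Any non-degenerate choice of weights gives the same agreement, since the $\Pr(C)$ factors cancel algebraically.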

Here is a way to check your numbers. If you have

  • $\Pr(A \cap B \cap C) = d/s$
  • $\Pr(A \cap B \cap C^c) = e/s$
  • $\Pr(A \cap B^c \cap C) = f/s$
  • $\Pr(A \cap B^c \cap C^c) = g/s$
  • $\Pr(A^c \cap B \cap C) = h/s$
  • $\Pr(A^c \cap B \cap C^c) = i/s$
  • $\Pr(A^c \cap B^c \cap C) = j/s$
  • $\Pr(A^c \cap B^c \cap C^c) = k/s$

where $s=d+e+f+g+h+i+j+k$, then $\Pr(B \cap C) = \dfrac{d+h}{s}$ and $\Pr(A | B \cap C) = \dfrac{d}{d+h}$.

Given $C$, we just need to look at

  • $\Pr(A \cap B | C) = d/t$
  • $\Pr(A \cap B^c | C) = f/t$
  • $\Pr(A^c \cap B | C) = h/t$
  • $\Pr(A^c \cap B^c | C) = j/t$

where $t=d+f+h+j$ (and $\Pr(C)=t/s$), then $\Pr(B|C) = \dfrac{d+h}{t}$ and $\Pr((A|B)|C) = \dfrac{d}{d+h}$, the same as before.
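The counting argument above can also be run mechanically. This sketch plugs in arbitrary made-up counts for $d$ through $k$ (they play no role beyond being positive) and checks that both routes give $\dfrac{d}{d+h}$:

```python
# Hypothetical counts for the eight atoms, in the roles of d..k above.
d, e, f, g, h, i, j, k = 3, 5, 2, 7, 4, 1, 6, 8
s = d + e + f + g + h + i + j + k  # total weight (cancels throughout)

# Unconditional route: P(A | B ∩ C) = d / (d + h).
p_A_given_BC = d / (d + h)

# Conditioned-on-C route: restrict to the atoms inside C, with t = d+f+h+j,
# so P(A ∩ B | C) = d/t and P(B | C) = (d+h)/t; the quotient is again d/(d+h).
t = d + f + h + j
p_iterated = (d / t) / ((d + h) / t)

assert abs(p_A_given_BC - p_iterated) < 1e-12
print(p_A_given_BC)  # d/(d+h) = 3/7
```

Changing the counts changes the common value $d/(d+h)$, but never breaks the equality, which is the point of the answer.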


There's no such thing as [A given B].

It is NOT "the probability of {A given B}".

Rather, it is "{the probability, given B} of A".


Peter Milne's paper "Bruno de Finetti and the Logic of Conditional Events" is an account of attempts to base probability theory upon conditional events, with special regard to de Finetti's subjectivistic "coherent betting behavior" approach using a three-valued logic. The latter approach results in a conditional event algebra with the property that both $$(A\mid B)\mid C \ \equiv \ A\mid (B \And C)$$ and $$A\mid(B\mid C) \equiv A\mid(B \And C)$$ where $A, B, C$ are ordinary propositions, implying that the respective probabilities must be equal. (This is contrary to your expectation that they should differ. See p. 218 of the cited paper, where what you've called "multiple conditioning" is called "iterated conditioning".)

NB: In de Finetti's approach ...

To introduce the notion of conditional probability means extending the definition of $P(X)$ from the field of ordinary events $X$ to the field of conditional events.

This, of course, differs from standard presentations of probability theory, in which there are no such objects as "conditional events".

De Finetti's ideas were rediscovered and developed in the late 1980s (Goodman, I. R., Nguyen, H. T., and Walker, E. A., Conditional Inference and Logic for Intelligent Systems, North Holland, 1991). More recently, A. Mura presented a modified version of de Finetti's logic, equipped with a semantics that fits and generalizes Adams's probabilistic logic (see Adams, E. W., The Logic of Conditionals, Reidel, 1975; Mura, A., 'Probability and the Logic of de Finetti's Trievents', in Galavotti, M. C. (ed.), De Finetti Radical Probabilist, College Publications, 2008, pp. 201-42; Mura, A., 'Towards a New Logic of Indicative Conditionals', L&PS – Logic and Philosophy of Science, 9, 2011, pp. 17-31).