How many degrees of freedom are in a Bayes type scenario?


Given a real world condition (such as a medical condition) which may or may not be present, and an indicator (such as a medical test) predicting whether or not that condition is present, there are four unconditional probabilities to consider, $\Pr(C)$, $\Pr(N)$, $\Pr(+)$, and $\Pr(-)$, where $C$ is the condition being present, $N$ is no condition being present, $+$ is the indicator predicting that the condition is present, and $-$ is the indicator predicting that the condition is not present.

On top of these unconditional probabilities, there are eight "basic" conditional probabilities to consider, $\Pr(C\ |\ +)$, $\Pr(N\ |\ +)$, $\Pr(+\ |\ N)$, $\Pr(-\ |\ N)$, $\Pr(N\ |\ -)$, $\Pr(C\ |\ -)$, $\Pr(-\ |\ C)$, and $\Pr(+\ |\ C)$.

Considered in this order, a nice cyclical pattern emerges among the conditional probabilities, whereby each subsequent conditional probability can be found by alternating between Kolmogorov's Axiom of Normalization and Bayes' Theorem, with wrap-around from the end back to the beginning.
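Making the alternation explicit (this worked-out chain is my reading of the cycle described above, using the definitions already given), the first few links and the wrap-around are:

```latex
\begin{align*}
\Pr(N \mid +) &= 1 - \Pr(C \mid +) && \text{(normalization)} \\
\Pr(+ \mid N) &= \frac{\Pr(N \mid +)\,\Pr(+)}{\Pr(N)} && \text{(Bayes)} \\
\Pr(- \mid N) &= 1 - \Pr(+ \mid N) && \text{(normalization)} \\
\Pr(N \mid -) &= \frac{\Pr(- \mid N)\,\Pr(N)}{\Pr(-)} && \text{(Bayes)} \\
&\;\;\vdots \\
\Pr(C \mid +) &= \frac{\Pr(+ \mid C)\,\Pr(C)}{\Pr(+)} && \text{(Bayes, wrap-around)}
\end{align*}
```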

It is easy to see that there are two degrees of freedom among the four unconditional probabilities, with the remaining two being determined by Kolmogorov's Axiom of Normalization. It is also easy to see that the applications of Bayes' Theorem at least appear to pull one or both of the unconditional probabilities into the conditional probability cycle, potentially affecting the overall number of degrees of freedom. I also wonder whether the wrap-around in the conditional probability cycle removes a degree of freedom.

How many degrees of freedom exist in such a framework overall? How many probabilities must be given in order to solve for all of the rest, and does it matter which ones are given?

1 Answer


Consider the four probabilities of intersections: $\Pr ( C \cap + ) , \Pr ( N \cap + ), \Pr ( C \cap - ), \Pr ( N \cap - )$. By Kolmogorov's Axiom of Normalization, these four probabilities sum to $1$, so they have three degrees of freedom. You can find all eight conditional probabilities from these four probabilities by combining Kolmogorov's Axiom with Bayes' Theorem, for instance:

$\Pr (C | +) = \dfrac{ \Pr ( C \cap + )}{ \Pr( C \cap + ) + \Pr( N \cap + )}$

It follows that the conditional probabilities have three degrees of freedom.
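That recipe can be sketched numerically; this is my own illustration, and the four intersection values below are made up purely for the example:

```python
# Sketch: recover all marginal and conditional probabilities from the four
# intersection probabilities. The specific numbers are illustrative only.
p = {
    ("C", "+"): 0.08, ("N", "+"): 0.02,  # Pr(C ∩ +), Pr(N ∩ +)
    ("C", "-"): 0.10, ("N", "-"): 0.80,  # Pr(C ∩ -), Pr(N ∩ -)
}
assert abs(sum(p.values()) - 1.0) < 1e-12  # normalization: only three free values

# Marginals: Pr(+), Pr(-) and Pr(C), Pr(N).
pr_ind = {s: p[("C", s)] + p[("N", s)] for s in "+-"}
pr_cond = {c: p[(c, "+")] + p[(c, "-")] for c in "CN"}

# The eight conditional probabilities, e.g. Pr(C|+) = Pr(C ∩ +) / Pr(+).
given_ind = {(c, s): p[(c, s)] / pr_ind[s] for c in "CN" for s in "+-"}
given_cond = {(s, c): p[(c, s)] / pr_cond[c] for s in "+-" for c in "CN"}

print(round(given_ind[("C", "+")], 6))  # Pr(C|+)
```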

As an example, if you know $\Pr(C|+)$ and $\Pr(C|-)$, you can immediately calculate $\Pr(N|+) = 1 - \Pr(C|+)$ and $\Pr(N|-) = 1 - \Pr(C|-)$, but you would still need to know one more probability before you can calculate the rest. If you also know, say, $\Pr(+|N)$, then the identity $\Pr(+|N) = \dfrac{\Pr(N|+) \Pr(+)}{\Pr(N|+) \Pr(+) + \Pr(N|-) \left( 1 - \Pr(+) \right)}$ is a single equation in the one unknown $\Pr(+)$. Solving it gives $\Pr(+)$, and hence $\Pr(C \cap +) = \Pr(C|+) \Pr(+)$, $\Pr(N \cap +) = \Pr(N|+) \Pr(+)$, and so on. From then on, you can find the remaining probabilities.
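As a sanity check on that count, here is a short Python sketch (my own numbers, not part of the original answer) that starts from only $\Pr(C|+)$, $\Pr(C|-)$, and $\Pr(+|N)$, solves for $\Pr(+)$, and recovers the four intersection probabilities:

```python
# Ground truth, used only to generate the three given conditionals;
# the recovery below never looks at these four numbers directly.
true_cp, true_np, true_cm, true_nm = 0.08, 0.02, 0.10, 0.80

a = true_cp / (true_cp + true_np)  # given: Pr(C|+)
b = true_cm / (true_cm + true_nm)  # given: Pr(C|-)
t = true_np / (true_np + true_nm)  # given: Pr(+|N)

# Pr(+|N) = Pr(N|+)Pr(+) / Pr(N), with Pr(N|+) = 1 - a, Pr(N|-) = 1 - b,
# and Pr(N) = (1-a)p + (1-b)(1-p). Solving for p = Pr(+) gives:
p_plus = t * (1 - b) / ((1 - a) * (1 - t) + t * (1 - b))

# With Pr(+) in hand, all four intersection probabilities follow.
rec_cp = a * p_plus              # Pr(C ∩ +)
rec_np = (1 - a) * p_plus        # Pr(N ∩ +)
rec_cm = b * (1 - p_plus)        # Pr(C ∩ -)
rec_nm = (1 - b) * (1 - p_plus)  # Pr(N ∩ -)

print(round(p_plus, 6))  # recovered Pr(+)
```

Three given probabilities sufficed to pin down everything, matching the three degrees of freedom found above.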