How do Regular Conditional Probabilities assign values to $P(A|B)$ when $P(B) = 0$?


Yeah, like in the title: how do Regular Conditional Probabilities assign values to $P(A|B)$ when $P(B) = 0$? Because I am not trained in advanced probability, please explain it as simply as possible. It'll be great if you can demonstrate with an example. Let $X$ be a random variable taking values in $[0, 1]$. What is $P(X=\frac{2}{3}|X=\frac{2}{3})$?

Also, can regular conditional probabilities assign values to $P(A|B)$ in finite spaces? For example, imagine you throw a die and observe at 12 noon that it landed on an odd number. So, letting $O$ be the event that the die landed odd, $P(O)$ changes from $\frac{1}{2}$ to $1$ after noon. Hence, letting $E$ be the event that the die landed even, $P(E)$ changes to $0$ after noon. But even after 12 noon, I still intuitively want to say that $P(E|E)$ is $1$. It seems to me that for any non-empty set $A$ in the sigma-algebra, $P(A|A) = 1$. Can regular conditional probabilities assign $P(E|E)$ the value $1$? Is that possible? Or do I need to give that intuition up? Thanks for your help.

Edit: I think my question is different from this: Probability, conditional on a zero probability event

Because I'm asking about finite spaces too. Also, I'm asking for a demonstration of how regular conditional probabilities assign value to $P(X=\frac{2}{3}|X=\frac{2}{3})$.

Edit 2: Let the sample space in the dice example be {1, 2, 3, 4, 5, 6}.

Edit 3: The sample space for the dice example is {1, 2, 3, 4, 5, 6}. The initial probability function before noon is $P$. After noon, the probability function becomes $P'$. So $P(O) = P(E) = 0.5$. But $P'(O) = 1$ and $P'(E) = 0$. Still, I want to say that $P'(E|E) = 1$.


There are 2 best solutions below


The intuition behind $P(A|B)$ is "what is the probability of $A$ if I know that $B$ happened?"

When $P(B)=0$, the event $B$ can't happen, so the conditional probability of $A$ given $B$ is undefined.
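In symbols, the elementary definition is the ratio

$$P(A \mid B) = \frac{P(A \cap B)}{P(B)},$$

which is meaningless when $P(B) = 0$ because it would require dividing by zero.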

The same intuition explains why $P(E|E) = 1$.

In your example with a die and a clock, you have to be careful to specify the sample space. There are really twelve atomic events, not six: each possible value of the throw, paired with whether the time is before or after noon. If you make that explicit, no probabilities "change".

Edit in response to comments (and downvotes).

The question asks about intuition, particularly for a finite sample space. When throwing a die, there are no nonempty events with probability $0$. That is still true when the throw is before or after noon (as a binary choice). A typical one of the $12$ atomic events is ($3$, before noon).
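The twelve-event sample space can be sketched numerically. This is a minimal illustration (not from the original answer), assuming the throw and the before/after-noon observation are independent and uniform; the names `E` and `O` follow the question's notation.

```python
from fractions import Fraction

# Make the clock explicit: atomic events are pairs (face, time).
# All 12 are equally likely under the stated independence assumption.
faces = [1, 2, 3, 4, 5, 6]
times = ["before noon", "after noon"]
omega = [(f, t) for f in faces for t in times]
p = {w: Fraction(1, len(omega)) for w in omega}

def prob(event):
    return sum(p[w] for w in event)

def cond(a, b):
    # Elementary conditional probability P(A|B) = P(A ∩ B) / P(B),
    # defined only when P(B) > 0.
    pb = prob(b)
    if pb == 0:
        raise ValueError("P(B) = 0: conditional probability undefined")
    return prob([w for w in a if w in b]) / pb

E = [w for w in omega if w[0] % 2 == 0]  # die landed even
O = [w for w in omega if w[0] % 2 == 1]  # die landed odd

print(prob(E))     # 1/2
print(cond(E, E))  # 1: P(E|E) = 1, with no reference to the clock
```

Because no nonempty event in this finite space has probability zero, the elementary ratio definition already delivers $P(E|E) = 1$; no regular conditional probability machinery is needed.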

The situation is indeed subtler for continuous distributions. In the mathematical model a throw "exactly at noon" is a nonempty event with probability $0$. In the physical world no experiment can determine whether something happens exactly at noon, only approximately at noon up to some measurement uncertainty.


In general, for a single event $B$ of probability zero, there is no definition (not even "regular conditional probability") for $P(A|B)$.

However, when $X$ is a random variable, there is a definition for $P(A|X=t)$ that holds for "almost all $t$", even if $P(X=t) = 0$ for all $t$. In some cases, this is a continuous function of $t$, and then we can by convention use the "continuous version" to get $P(A|X=t)$ for all $t$.
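As a sketch of how this answers the question's example: when the event is defined by $X$ itself, the natural choice of regular conditional distribution is the point mass at $t$, i.e.

$$P(X \in S \mid X = t) = \mathbf{1}_S(t),$$

so taking $S = \{\tfrac{2}{3}\}$ and $t = \tfrac{2}{3}$ gives

$$P\left(X = \tfrac{2}{3} \,\middle|\, X = \tfrac{2}{3}\right) = \mathbf{1}_{\{2/3\}}\left(\tfrac{2}{3}\right) = 1.$$

Note this is one valid version: changing it on a Lebesgue-null set of $t$'s yields another version, which is why the value $1$ at the single point $t = \tfrac{2}{3}$ is a (natural) convention rather than something forced by the definition.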

More generally, when $\mathcal F$ is a sigma-algebra, there is a definition for $P(A|\mathcal F)$, which is a random variable and thus defined only up to sets of measure zero.

I say "more generally" because the case $P(A|X=t)$ is obtained when we use the sigma-algebra $\sigma(X)$ generated by $X$.
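For reference, the defining property behind this (in the notation above) is that $P(A \mid \mathcal F)$ is any $\mathcal F$-measurable random variable satisfying

$$\int_F P(A \mid \mathcal F)\, dP = P(A \cap F) \quad \text{for every } F \in \mathcal F,$$

and any two such random variables agree except on a set of probability zero, which is exactly why $P(A \mid \mathcal F)$ is only defined up to null sets.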