Question: Let $X$ be a random variable. Is the statement $\mathbb{P}(B|\{X=i\}) = 0.6$ (some fixed number) equivalent to saying: $X = i \implies \mathbb{P}(B) = 0.6$?
Also, suppose $A$ and $B$ are conditionally independent given the event $\{ X = i \}$, that is, $\mathbb{P}(A \cap B|\{ X = i \}) = \mathbb{P}(A|\{ X = i \})\,\mathbb{P}(B|\{ X = i\})$. Is this equivalent to: $X = i \implies \mathbb{P}(A\cap B) =\mathbb{P}(A)\mathbb{P}(B)$?
I don't know what conditional probability means from a measure-theoretic perspective, but I was taught in undergraduate probability that $\mathbb{P}(A|B)$, where $A$ and $B$ are events with $\mathbb{P}(B) > 0$, is defined as $\frac{\mathbb{P}(A \cap B)}{\mathbb{P}(B)}$. However, working directly from this definition is somewhat awkward when I write proofs about Markov chains, so can I just treat conditioning as the first-order-logic implication above?
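(As a concrete illustration of the ratio definition above — the fair die here is my own toy example, not part of the question — one can check $\mathbb{P}(A|B) = \frac{\mathbb{P}(A \cap B)}{\mathbb{P}(B)}$ by exact enumeration over a finite sample space:)

```python
from fractions import Fraction

# Finite sample space: one roll of a fair die; each outcome has probability 1/6.
omega = {1, 2, 3, 4, 5, 6}

def prob(event):
    """Exact probability of an event (a subset of omega) under the uniform measure."""
    return Fraction(len(event), len(omega))

def cond_prob(a, b):
    """P(A | B) = P(A ∩ B) / P(B); only defined when P(B) > 0."""
    return prob(a & b) / prob(b)

A = {2, 4, 6}   # event "the roll is even"
B = {4, 5, 6}   # event "the roll is at least 4"

print(cond_prob(A, B))  # (2/6) / (3/6) = 2/3
```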
If I can, I also wonder whether $\mathbb{P}(A|B)$ can be written in this first-order-logic form for events $B$ that do not arise as a preimage of a random variable.
If you had said something like:

"Based on the information that random variable $X$ has taken value $i$, we conclude that the probability of occurrence of event $B$ is now $0.6$"

then I would agree, so in my view the intuition here is fine.
But I disagree with using a logical implication for that. Firstly, its premise is false ($X$ is a function and $i$ is a number, so $X = i$ never holds literally), which makes the implication vacuously true no matter what is implied.
If you repair this by writing something like $X(\omega)=i$, things are still not okay, because $\mathbb P(B)$ is a fixed real number that does not depend on $\omega$ or $i$.
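(To make this last point concrete — again with my own toy model, not anything from the question — take $X$ to be a fair die roll and $B$ the event "the roll is even". Then $\mathbb P(B)$ is one fixed number, while $\mathbb P(B \mid X=i)$ genuinely varies with $i$, so no implication of the form "$X=i \implies \mathbb P(B) = \dots$" can capture it:)

```python
from fractions import Fraction

omega = {1, 2, 3, 4, 5, 6}   # fair die; X is the identity random variable
B = {2, 4, 6}                # event "the roll is even"

def prob(event):
    return Fraction(len(event), len(omega))

print(prob(B))               # P(B) = 1/2: a single fixed number

for i in sorted(omega):
    Xi = {i}                 # the event {X = i}, i.e. the preimage X^{-1}({i})
    print(i, prob(B & Xi) / prob(Xi))  # P(B | X=i): 1 if i is even, 0 if i is odd
```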
Let's just do it without that implication. Our intuition does not really need it.