$P, Q$ probability measures: show $P(B) \le P(A_c) \Rightarrow Q(B) \le Q(A_c)$ for all $B \in \mathcal A$, using the Lebesgue decomposition.


Let $P, Q$ be two probability measures on $(\Omega, \mathcal A)$. Let $Q = Q_a + Q_s$ be the Lebesgue decomposition of $Q$ with respect to $P$, so there exists a density $f$ with $Q_a(A) = \int_A f \, dP$. Furthermore, by convention, $f = \infty$ $Q_s$-almost everywhere. Define $A_c := \{\omega : f(\omega) > c\}$ for $c > 0$.
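For intuition, a simple example of this setup (my own illustration, not part of the problem): take $\Omega = [0,1]$ with the Borel $\sigma$-algebra, $P$ Lebesgue measure, and
$$Q = \tfrac12 P + \tfrac12 \delta_0, \qquad Q_a = \tfrac12 P, \quad Q_s = \tfrac12 \delta_0, \quad f \equiv \tfrac12 \text{ on } (0,1], \quad f(0) = \infty,$$
so that $A_c = [0,1]$ for $c < \tfrac12$ and $A_c = \{0\}$ for $c \ge \tfrac12$.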
Now I have to show: $$P(B) \le P(A_c) \;\Rightarrow\; Q(B) \le Q(A_c) \qquad \forall B \in \mathcal A.$$

I started with $$Q(A_c) - Q(B) = Q_a(A_c) + Q_s(A_c) - Q_a(B) - Q_s(B) \ge c\,P(A_c) - \int_B f \, dP + Q_s(\{f > c\}) - Q_s(B).$$
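The inequality in the second line uses $f > c$ on $A_c$, together with the exact identity for $Q_a(B)$:
$$Q_a(A_c) = \int_{A_c} f \, dP \;\ge\; \int_{A_c} c \, dP = c\,P(A_c), \qquad Q_a(B) = \int_B f \, dP.$$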

Then I say that the last two terms together are nonnegative, since $Q_s(\{f \gt c\})$ is as large as $Q_s$ of any set can be. But then I have $$Q(A_c) - Q(B) \ge c\,P(A_c) - \int_B f \, dP$$ and now I'm stuck.
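To spell out why those last two terms are nonnegative (this only uses the convention $f = \infty$ $Q_s$-almost everywhere from the setup):
$$Q_s(\{f \le c\}) = 0 \;\Longrightarrow\; Q_s(A_c) = Q_s(\{f > c\}) = Q_s(\Omega) \ge Q_s(B) \qquad \text{for all } B \in \mathcal A.$$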
Does anyone have any hints or ideas on how to show the inequality? Thanks in advance!