Conditional expectation under linear combination of equivalent measures


We start with two equivalent probability measures $P_0\sim P_1$ on $(\Omega, \mathcal{F})$, and we define the new measure $P_\alpha=\alpha P_0+(1-\alpha) P_1$ with $\alpha\in (0,1)$, which is equivalent to both.

Let $X$ be an integrable random variable mapping to $(\mathbb{R},\mathcal{B})$, and $\mathcal{G}\subset\mathcal{F}$ be a sub-$\sigma$-algebra. Does there exist a $\beta\in(0,1)$ for which $$\mathbb{E}_{P_\alpha}[X\mid\mathcal{G}]=\beta\mathbb{E}_{P_0}[X\mid \mathcal{G}]+(1-\beta)\mathbb{E}_{P_1}[X\mid\mathcal{G}]?$$

My guess would be $\beta=\alpha$, which would be nice, but I can't prove it. I'm about to present a finance-related task that uses such an identity; this is not homework for me.

I tried using the identity $$\mathbb{E}_P(X\mid\mathcal{G})\,\mathbb{E}_Q\!\left(\frac{dP}{dQ}\,\Big|\,\mathcal{G}\right)=\mathbb{E}_Q\!\left(\frac{dP}{dQ}X\,\Big|\,\mathcal{G}\right),$$ but I only got $$\beta=1-(1-\alpha)\,\mathbb{E}_{P_\alpha}\!\left(\frac{dP_1}{dP_\alpha}\,\Big|\,\mathcal{G}\right),$$ which is a $\mathcal{G}$-measurable random variable rather than a constant. It would actually be fine for my application, but it's ugly, and I'm afraid I made mistakes. It equals my guess if and only if $\mathbb{E}_{P_\alpha}\left(\frac{dP_1}{dP_\alpha}\mid\mathcal{G}\right)=1$, which I don't think is true in general ($\frac{dP_1}{dP_\alpha}$ being independent of $\mathcal{G}$ would imply it).
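For what it's worth, here is a quick numerical sanity check on a four-point space (a sketch in Python; the measures, $X$ and $\alpha$ are arbitrary choices of mine). The atom-wise weight solving the equation varies across the atoms of $\mathcal{G}$, so no constant $\beta$ works there, while the $\mathcal{G}$-measurable $\beta$ above does reproduce $\mathbb{E}_{P_\alpha}[X\mid\mathcal{G}]$:

```python
import numpy as np

# Four-point space; G is generated by the partition {0,1} | {2,3}.
# P0, P1, X and alpha are arbitrary choices for this sanity check.
P0 = np.array([0.1, 0.4, 0.2, 0.3])
P1 = np.array([0.2, 0.1, 0.3, 0.4])
X  = np.array([1.0, 2.0, 3.0, 4.0])
alpha = 0.5
Pa = alpha * P0 + (1 - alpha) * P1
atoms = [np.array([0, 1]), np.array([2, 3])]

def cond_exp(P, X, atoms):
    """E_P[X | G] as a vector: on each atom B it equals E_P[X 1_B] / P(B)."""
    out = np.empty_like(X)
    for B in atoms:
        out[B] = (P[B] * X[B]).sum() / P[B].sum()
    return out

E0 = cond_exp(P0, X, atoms)
E1 = cond_exp(P1, X, atoms)
Ea = cond_exp(Pa, X, atoms)

# Atom-wise weight solving Ea = beta*E0 + (1-beta)*E1: not constant.
print((Ea - E1) / (E0 - E1))        # approx [0.625 0.625 0.4167 0.4167]

# The G-measurable weight from the formula above does work:
beta = 1 - (1 - alpha) * cond_exp(Pa, P1 / Pa, atoms)
print(np.allclose(Ea, beta * E0 + (1 - beta) * E1))   # True
```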

Any help would be appreciated, thanks!


Best answer (by Did)

For the sake of notational coherence, let us rather define $P_a=aP_1+(1-a)P_0$ for every $a$ in $[0,1]$, and $E_a$ as the expectation with respect to $P_a$. You are then hypothesizing that, for some $a$ in $(0,1)$, there might exist some $b$ such that, for every integrable random variable $X$, $$E_a(X\mid\mathcal G)=bE_1(X\mid\mathcal G)+(1-b)E_0(X\mid\mathcal G)\tag{$\ast$}$$

To get a feeling for what $(\ast)$ entails, assume for simplicity that $\mathcal G$ is generated by some partition $(B_n)$ with $P_0(B_n)P_1(B_n)\ne0$ for every $n$. Then, by definition, $$E_a(X\mid\mathcal G)=\sum_nE_a(X\mid B_n)\mathbf 1_{B_n},$$ thus $(\ast)$ would imply $$E_a(X\mid B_n)=bE_1(X\mid B_n)+(1-b)E_0(X\mid B_n)$$ for every $n$, that is, $$\frac{aE_1(X\mathbf 1_{B_n})+(1-a)E_0(X\mathbf 1_{B_n})}{P_a(B_n)}=b\frac{E_1(X\mathbf 1_{B_n})}{P_1(B_n)}+(1-b)\frac{E_0(X\mathbf 1_{B_n})}{P_0(B_n)}.$$

This holds for every integrable $X$ if and only if $$\frac{a\mathbf 1_{B_n}P_1+(1-a)\mathbf 1_{B_n}P_0}{P_a(B_n)}=b\frac{\mathbf 1_{B_n}P_1}{P_1(B_n)}+(1-b)\frac{\mathbf 1_{B_n}P_0}{P_0(B_n)},$$ that is, if and only if, on each $B_n$, $P_1$ and $P_0$ are proportional, with $$\left(a-b\frac{P_a(B_n)}{P_1(B_n)}\right)P_1=\left((1-b)\frac{P_a(B_n)}{P_0(B_n)}-(1-a)\right)P_0.$$

In the general case (when $\mathcal G$ is not necessarily atomic), this corresponds to the hypothesis that $$dP_1=Y\cdot dP_0$$ for some $\mathcal G$-measurable random variable $Y$, with $P_0(Y>0)=1$ since one assumed that $P_0\sim P_1$. Conversely, if this holds, then $E_0(X\mid\mathcal G)=E_1(X\mid\mathcal G)$, hence $(\ast)$ holds for any $b$: indeed, by the abstract Bayes formula, $$E_1(X\mid\mathcal G)=\frac{E_0(YX\mid\mathcal G)}{E_0(Y\mid\mathcal G)}=\frac{Y\,E_0(X\mid\mathcal G)}{Y}=E_0(X\mid\mathcal G),$$ using that $Y$ is $\mathcal G$-measurable.

To sum up, the identity you are interested in does not hold in general, but it does hold when the Radon–Nikodym derivative of $P_1$ with respect to $P_0$ is $\mathcal G$-measurable.
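To illustrate the converse direction numerically (a sketch in Python; the finite setting and the density are arbitrary choices, with $dP_1/dP_0$ constant on the atoms of $\mathcal G$): the two conditional expectations coincide, so $(\ast)$ holds for every $b$.

```python
import numpy as np

# Four-point space, G generated by {0,1} | {2,3}; dP1/dP0 is constant
# on each atom of G, i.e. G-measurable, as in the answer's condition.
P0 = np.array([0.1, 0.4, 0.2, 0.3])
Y  = np.array([1.4, 1.4, 0.6, 0.6])    # dP1/dP0, constant on the atoms
P1 = Y * P0                            # sums to 1 by the choice of Y
X  = np.array([1.0, 2.0, 3.0, 4.0])
a  = 0.3
Pa = a * P1 + (1 - a) * P0             # the answer's convention
atoms = [np.array([0, 1]), np.array([2, 3])]

def cond_exp(P, X, atoms):
    """E_P[X | G] as a vector: on each atom B it equals E_P[X 1_B] / P(B)."""
    out = np.empty_like(X)
    for B in atoms:
        out[B] = (P[B] * X[B]).sum() / P[B].sum()
    return out

E0 = cond_exp(P0, X, atoms)
E1 = cond_exp(P1, X, atoms)
Ea = cond_exp(Pa, X, atoms)

print(np.allclose(E0, E1))                       # True
print(np.allclose(Ea, a * E1 + (1 - a) * E0))    # True, i.e. (*) holds
```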

Answer (by the asker)

To finish the thought: the consequence of Did's condition, $$E_0(X\mid\mathcal G)=E_1(X\mid\mathcal G),$$ is exactly what I have in my task, because I have a (discrete-time) process $(X_n)_{n=0}^N$ that is a martingale under both $P_0$ and $P_1$: substituting $X=X_n$ and $\mathcal G=\mathcal F_{n-1}$ gives $E_0(X_n\mid\mathcal F_{n-1})=X_{n-1}=E_1(X_n\mid\mathcal F_{n-1})$. And, as he said, in this case $(\ast)$ holds.

For completeness, I show that $(\ast)$ holds here. We start with our probability measures, $$P_a=aP_0+(1-a)P_1$$ (reverting to the question's convention; since the two conditional expectations coincide, the labeling does not matter), which means that $E_a(I_A)=aE_0(I_A)+(1-a)E_1(I_A)$ for all $A\in\mathcal F$. By linearity, $$E_a(Y)=aE_0(Y)+(1-a)E_1(Y)\tag{1}\label{1}$$ holds for all simple random variables $Y$; by monotone convergence it extends to all $\mathcal F$-measurable $Y\ge0$, and, by splitting into positive and negative parts, to all integrable $Y$.

Moving to conditional expectations, take a bounded, $\mathcal F_{n-1}$-measurable random variable $Z$; then \begin{align} E_a(X_nZ) & \stackrel{\eqref{1}}= aE_0(X_nZ)+(1-a)E_1(X_nZ) \\ & = aE_0(E_0(X_n\mid\mathcal F_{n-1})Z)+(1-a)E_1(E_1(X_n\mid\mathcal F_{n-1})Z) \\ & = aE_0(X_{n-1}Z)+(1-a)E_1(X_{n-1}Z) \\ & \stackrel{\eqref{1}}= E_a(X_{n-1}Z), \end{align} where the second line uses the defining property of conditional expectation (with $Z$ being $\mathcal F_{n-1}$-measurable) and the third the martingale property. Since $X_{n-1}$ is $\mathcal F_{n-1}$-measurable and $Z$ was arbitrary, this gives $$X_{n-1}=E_a(X_n\mid\mathcal F_{n-1})=E_0(X_n\mid\mathcal F_{n-1})=E_1(X_n\mid\mathcal F_{n-1}),$$ and hence $(\ast)$ obviously holds.
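To illustrate the martingale case numerically (a sketch in Python; the two-step walk and the two step laws are arbitrary choices of mine): with i.i.d. steps in $\{-1,0,+1\}$ under two different symmetric laws, $(X_n)$ is a martingale under both $P_0$ and $P_1$, and the check confirms $E_a(X_2\mid\mathcal F_1)=X_1$ under the mixture.

```python
import numpy as np
from itertools import product

# Two-step walk X_n = xi_1 + ... + xi_n with i.i.d. steps in {-1, 0, +1}.
# Both step laws are symmetric, so (X_n) is a martingale under P0 and P1.
steps = np.array([-1, 0, 1])
q0 = np.array([0.25, 0.5, 0.25])    # step law under P0
q1 = np.array([0.4, 0.2, 0.4])      # step law under P1
a = 0.7

omega = list(product(range(3), repeat=2))               # outcomes (i, j)
P0 = np.array([q0[i] * q0[j] for i, j in omega])
P1 = np.array([q1[i] * q1[j] for i, j in omega])
Pa = a * P0 + (1 - a) * P1   # no longer a product measure, but that's fine
X2 = np.array([steps[i] + steps[j] for i, j in omega])  # X_2

# Atoms of F_1: outcomes sharing the same first step s.
# E_a(X_2 | F_1) should equal X_1 = steps[s] on each atom.
for s in range(3):
    B = np.array([k for k, (i, j) in enumerate(omega) if i == s])
    print((Pa[B] * X2[B]).sum() / Pa[B].sum(), "vs", steps[s])
```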