Necessary and sufficient condition for random sum of independent RVs to be a martingale


Let $M$ be a Poisson random measure on $(0,\infty)$ with intensity $\lambda dt$, where $\lambda\in(0,\infty)$. Let $(Y_n)_{n\in\mathbb{N}}$ be a sequence of independent random variables, independent of $M$ and distributed uniformly on $[0,1]$. Given a measurable function $g$ on $[0,1]$, define $$X_t=X_t^g=\sum_{n=1}^{N_t}g(Y_n)$$ where $N_t=M(0,t]$.

I have shown, using partitioning of the expectation and Fubini, that in the case $g\geq0$ we have $$\mathbb{E}(X_t)=\lambda t\int_0^1g(y)dy.$$ I am now tasked to find a necessary and sufficient condition on $g$ for $(X_t)_{t\geq0}$ to be a martingale.
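This formula is easy to check numerically. Below is a small Monte Carlo sketch (not part of the proof), with assumed illustrative values $\lambda=2$, $t=3$ and $g(y)=y^2$, for which $\lambda t\int_0^1 g(y)\,dy = 6\cdot\tfrac13 = 2$:

```python
import random

def sample_X(lam, t, g, rng):
    """One draw of X_t = sum_{n=1}^{N_t} g(Y_n), with N_t ~ Poisson(lam * t)."""
    # Sample N_t by accumulating exponential inter-arrival times on (0, t].
    n, s = 0, rng.expovariate(lam)
    while s <= t:
        n += 1
        s += rng.expovariate(lam)
    # The marks Y_n are i.i.d. uniform on [0, 1], independent of the counts.
    return sum(g(rng.random()) for _ in range(n))

rng = random.Random(0)
lam, t = 2.0, 3.0
g = lambda y: y * y  # E[g(Y)] = 1/3, so E[X_t] = lam * t / 3 = 2
m = 200_000
est = sum(sample_X(lam, t, g, rng) for _ in range(m)) / m
print(est)  # close to 2
```

The empirical mean agrees with $\lambda t\int_0^1 g(y)\,dy$ up to Monte Carlo error.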

My thinking was that, by the optional stopping theorem (OST), a necessary condition is $\mathbb{E}(X_T)=\mathbb{E}(X_0)=0$ for all bounded stopping times $T$. Suppose that $g\geq0$. Then from the formula above we see that, for any $t>0$, we must have $g\equiv0$ a.e. on $[0,1]$, since otherwise $\mathbb{E}(X_t)>0$. So I feel that $g$ will have to take negative values somewhere (unless $g\equiv0$). I do not know how to proceed from here, though, so I would greatly appreciate advice.

Accepted answer:

Let $\mathcal{F}_t$ be the natural filtration of the process $\left(X_t\right)_{t\geq 0}$.

Claim. $\left(X_t\right)_{t\geq 0}$ is a martingale w.r.t. $\mathcal{F}_t$ if and only if $g$ is integrable on $[0,1]$ and $\mathbb{E}\left[g(Y)\right]=\int_0^1 g(y)\,dy=0$, where $Y$ denotes a generic $Y_n$. (Integrability of $g$ is needed so that $\mathbb{E}\left|X_t\right|<\infty$.)

Proof. For any $s<t$, we have $$\mathbb{E}\left[X_t\mid\mathcal{F}_s\right]=\mathbb{E}\Big[\sum_{n=1}^{N_s}g(Y_n)\,\Big|\,\mathcal{F}_s\Big]+\mathbb{E}\Big[\sum_{n=N_s+1}^{N_t}g(Y_n)\,\Big|\,\mathcal{F}_s\Big]=X_s+\mathbb{E}\Big[\sum_{n=N_s+1}^{N_t}g(Y_n)\Big],$$ where the last equality holds because the increment over $(s,t]$ depends only on $M$ restricted to $(s,t]$ and on marks $Y_n$ not used up to time $s$, both of which are independent of $\mathcal{F}_s$,

and

by conditioning on $N_t-N_s=M(s,t]$, which is Poisson with mean $\lambda(t-s)$ and independent of the $(Y_n)$, $$\mathbb{E}\Big[\sum_{n=N_s+1}^{N_t}g(Y_n)\Big]=\mathbb{E}\big[N_t-N_s\big]\,\mathbb{E}\left[g(Y)\right]=\lambda(t-s)\,\mathbb{E}\left[g(Y)\right],$$

which is zero if and only if $\mathbb{E}\left[g(Y)\right]=0$.

In particular, if $g\geq 0$, then $\mathbb{E}\left[g(Y)\right]=0$ forces $g=0$ almost everywhere on $\left[0,1\right]$, consistent with the observation in the question.
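The key display $\mathbb{E}\big[\sum_{n=N_s+1}^{N_t}g(Y_n)\big]=\lambda(t-s)\,\mathbb{E}[g(Y)]$ can also be sanity-checked by simulation. A sketch with assumed values $\lambda=2$, $t-s=2$ and the centred choice $g(y)=y-\tfrac12$ (so $\mathbb{E}[g(Y)]=0$, the martingale case):

```python
import random

def increment(lam, dt, g, rng):
    """One draw of X_t - X_s: sum of g over the Poisson(lam * dt) marks in (s, t]."""
    # Number of points in an interval of length dt, via exponential gaps.
    n, a = 0, rng.expovariate(lam)
    while a <= dt:
        n += 1
        a += rng.expovariate(lam)
    return sum(g(rng.random()) for _ in range(n))

rng = random.Random(1)
lam, dt = 2.0, 2.0
g = lambda y: y - 0.5  # E[g(Y)] = 0, so the mean increment should vanish
m = 200_000
mean_inc = sum(increment(lam, dt, g, rng) for _ in range(m)) / m
print(mean_inc)  # close to 0, matching lam * dt * E[g(Y)] = 0
```

Replacing $g$ by any function with nonzero mean shifts the empirical mean to $\lambda\,dt\,\mathbb{E}[g(Y)]$, in line with the claim.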