Show that this compound Poisson process is a Lévy process


Consider a Poisson process $(N_{t})_{t\geq 0}$ with parameter $\lambda > 0$ and a sequence of iid random variables $(U_{i})_{i \in \mathbb N}$ with distribution $Q$, independent of $N$. Define the process $X_{t}:= \sum\limits_{k=1}^{N_{t}}U_{k}$, which we shall call the compound Poisson process.
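As a sanity check on the definition (not part of the proof), here is a minimal simulation sketch of one path of $X$. The choice of $Q$ as the standard normal distribution is purely illustrative, and the function name is mine:

```python
import numpy as np

def simulate_compound_poisson(lam, T, jump_sampler, rng):
    """Simulate one path of X_t = sum_{k<=N_t} U_k on [0, T].

    Returns the jump times of the Poisson process N and the value of X
    immediately after each jump. X is 0 before the first jump and constant
    between jumps, so the path is a cadlag step function.
    """
    # Inter-arrival times of a Poisson(lam) process are iid Exponential(lam).
    arrivals = []
    t = rng.exponential(1.0 / lam)
    while t <= T:
        arrivals.append(t)
        t += rng.exponential(1.0 / lam)
    jumps = jump_sampler(rng, len(arrivals))   # iid U_k with distribution Q
    values = np.cumsum(jumps)                  # X at each jump time
    return np.array(arrivals), values

rng = np.random.default_rng(0)
# Illustrative choice: Q = N(0, 1) jump sizes.
times, values = simulate_compound_poisson(
    lam=2.0, T=10.0, jump_sampler=lambda r, n: r.standard_normal(n), rng=rng)
```

Plotting `values` against `times` as a step function makes the piecewise-constant, right-continuous structure of the paths visible.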

Show that this compound Poisson process is a Lévy process, i.e. show that it satisfies the following four conditions:

$(i) \; X_{0}=0$

$(ii) \; \text{there exists a family of measures }(\mu_{t})_{t\geq 0}\text{ such that for }t>s\geq 0\ \; X_{t}-X_{s}\sim \mu_{t-s}$

$(iii)$ for $n \in \mathbb N$ and $0=t_{0}< t_{1}<\dots<t_{n}$, the increments $X_{t_{i+1}}-X_{t_{i}}$, $i =0,\dots,n-1$, are independent.

$(iv)$ $(X_{t})_{t\geq 0}$ has a.s. càdlàg paths.

My attempt:

$(i):$ is trivial

$(ii):$ I have computed the characteristic function: conditioning on $N_{t}$ and using that the $U_{k}$ are iid and independent of $N$,

$$ E[\exp(iuX_{t})]=E[(\int_{\mathbb R}\exp(iux)Q(dx))^{N_{t}}]=\sum\limits_{n \in \mathbb N_{0}}(\int_{\mathbb R}\exp(iux)Q(dx))^{n}P(N_{t}=n)\\=\exp(-\lambda t)\sum\limits_{n \in \mathbb N_{0}}(\int_{\mathbb R}\exp(iux)Q(dx))^{n}\frac{(\lambda t)^{n}}{n!}=\exp(\lambda t(\int_{\mathbb R}\exp(iux)Q(dx)-1))$$

Following the same steps, $X_{t+h}-X_{h}$ has the same characteristic function as $X_{t}$, and thus $X_{t+h}-X_{h}=X_{t}$ in distribution. Therefore this should suffice for (ii), although I would be rather interested to know what form $\mu_{t}$ takes on.
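(Regarding the form of $\mu_{t}$: expanding the exponential in the characteristic function term by term, one can check that $\mu_{t}$ is a Poisson mixture of convolution powers of $Q$,

$$\mu_{t}=e^{-\lambda t}\sum_{n=0}^{\infty}\frac{(\lambda t)^{n}}{n!}\,Q^{*n},$$

where $Q^{*n}$ denotes the $n$-fold convolution of $Q$ with itself and $Q^{*0}=\delta_{0}$. This just says: conditionally on $N_{t}=n$, the increment is a sum of $n$ iid draws from $Q$.)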

For (iii), I do not know of a nice way to prove it.

For (iv): Let $(t_{n})_{n \in \mathbb N}$ be a sequence decreasing to $t \geq 0$. Since $N$ is a.s. càdlàg and integer-valued, a.s. $N_{t_{n}}=N_{t}$ for $n$ large enough, hence $\lim\limits_{n \to \infty} X_{t_{n}}=\sum\limits_{k=1}^{N_{t}}U_{k}=X_{t}$, which gives right-continuity. Approaching from the left, $N_{t_{n}}=N_{t-}$ eventually, so the left limit $X_{t-}=\sum\limits_{k=1}^{N_{t-}}U_{k}$ exists.

I would appreciate any help particularly on the points (ii) and (iii).

Accepted answer:

Stationary and independent increments would follow if we can show, for any $n\in\mathbb{N}$, $\alpha_{1},\ldots ,\alpha_{n}$ and $0 = t_{0} < \cdots < t_{n}$, that

$$\mathbb{E}\left[ \exp\left(\sum_{j=1}^{n}\text{i}\langle \alpha_{j}, X_{t_{j}}-X_{t_{j-1}}\rangle \right) \right] = \prod_{j=1}^{n}\exp\left( -\lambda (t_{j}-t_{j-1})(1-\varphi (\alpha_{j})) \right)$$ where $\varphi$ is the common characteristic function of the $U_{i}$. The characteristic function of $X_{t}$ itself is $$\mathbb{E}\left[ e^{\text{i}\langle \alpha , X_{t}\rangle} \right] = \exp \left( -\lambda t(1-\varphi (\alpha)) \right)$$ as you also calculated.

Below I have shown the desired formula in the case $n=2$; I suppose there is no trouble doing the same for a general $n\in\mathbb{N}$, just with more notation. Conditionally on $N_{t_1}=n_{1}$ and $N_{t_2}-N_{t_1}=n_{2}$, the two increments of $X$ are sums of $n_{1}$ and $n_{2}$ iid copies of $U_{1}$, independent of each other, so \begin{align} \mathbb{E}\left[ \exp\left(\text{i}(\langle \alpha_{1},X_{t_1} - X_{t_0}\rangle + \langle \alpha_{2},X_{t_2} - X_{t_1}\rangle)\right) \mid N_{t_1} = n_{1}, N_{t_2} - N_{t_1} = n_{2}\right] = \varphi (\alpha_{1})^{n_1}\varphi (\alpha_{2})^{n_2}. \end{align} So we can consider \begin{align} \mathbb{E}\left[ \varphi (\alpha_2)^{N_{t_2}-N_{t_1}} \right] &= \sum_{j=0}^{\infty}\varphi (\alpha_{2})^{j}\mathbb{P}(N_{t_2} - N_{t_1} = j)\\ &= \sum_{j=0}^{\infty}\frac{\varphi(\alpha_2)^{j}(\lambda(t_{2}-t_{1}))^{j}}{j!}e^{-\lambda (t_{2} - t_{1})} \\ &= \exp(-\lambda (t_{2} - t_{1})(1-\varphi (\alpha_{2}))). \end{align} You can then do the same for $\varphi (\alpha_{1})^{N_{t_1}}$ and use independence of the increments of $N$ to obtain \begin{align} \mathbb{E}\left[ \exp\left(\text{i}(\langle \alpha_{1},X_{t_1} - X_{t_0}\rangle + \langle \alpha_{2},X_{t_2} - X_{t_1}\rangle)\right) \right] &= \mathbb{E} \left[ \varphi (\alpha_{1})^{N_{t_1}}\varphi(\alpha_2)^{N_{t_2}-N_{t_1}} \right] \\ &= \mathbb{E} \left[ \varphi (\alpha_{1})^{N_{t_1}}\right]\mathbb{E}\left[\varphi(\alpha_2)^{N_{t_2}-N_{t_1}} \right] \\ &= \exp(-\lambda t_{1}(1-\varphi (\alpha_{1})))\exp(-\lambda (t_{2} - t_{1})(1-\varphi (\alpha_{2}))), \end{align} which proves the desired result for $n=2$.
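As an informal numerical sanity check of the one-dimensional formula $\mathbb{E}[e^{\text{i}uX_{t}}]=\exp(-\lambda t(1-\varphi(u)))$ (a Monte Carlo sketch, with the illustrative assumption that the jumps are standard normal, so $\varphi(u)=e^{-u^{2}/2}$):

```python
import numpy as np

rng = np.random.default_rng(42)
lam, t, u = 1.5, 2.0, 0.7
n_samples = 200_000

# Sample X_t = sum_{k=1}^{N_t} U_k with N_t ~ Poisson(lam*t), U_k ~ N(0,1).
N = rng.poisson(lam * t, size=n_samples)
# Conditionally on N, the sum of N iid N(0,1) variables is N(0, N),
# so we can sample it directly as sqrt(N) * Z with Z ~ N(0,1).
X = np.sqrt(N) * rng.standard_normal(n_samples)

empirical = np.mean(np.exp(1j * u * X))          # Monte Carlo estimate
phi_u = np.exp(-u**2 / 2)                        # char. function of N(0,1)
theoretical = np.exp(lam * t * (phi_u - 1))      # exp(-lam*t*(1 - phi(u)))
```

With this sample size the empirical characteristic function agrees with the closed form to roughly two decimal places, which is consistent with the derivation above.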