Convergence of Random Vector to Gaussian Process in Distribution


I am trying to show the following:

Let $\{N_{c}(t)\}$ be a Poisson process with rate $c > 0$. Define the process $X_{c}(t)=N_{c}(t)-ct$ for $t \ge 0.$

Show that for every $k \in \mathbb{N}$ and $(t_1,\ldots,t_k) \in (\mathbb{R}^+)^k$ the random vector $\frac{1}{c^{1/2}}(X_{c}(t_1),\ldots,X_{c}(t_k))$ converges in distribution, as $c \to \infty$, to $(Y(t_1),\ldots,Y(t_k))$, where $\{Y(t)\}$ is a Gaussian process. Describe this process.

My thoughts are that this can be shown via characteristic functions, similarly to how the CLT is proven: that is, $\phi_{c^{-1/2}(X_{c}(t_1),\ldots,X_{c}(t_k))}(u_1,\ldots,u_k)$ converges to the characteristic function of a multivariate normal for each $k$ and each choice of times. This would be sufficient to show the resulting limit is a Gaussian process. I am struggling to actually show this algebraically.

As far as describing this limit goes, is it interpreted similarly to the CLT? Since the Poisson process is a counting process, does the limiting Gaussian process mean that, for large $c$, the centered and rescaled counts $c^{-1/2}(N_{c}(t_j)-ct_j)$ are approximately multivariate normal?
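The one-dimensional version of that interpretation is easy to check empirically: for large $c$, the rescaled value $c^{-1/2}X_c(t)$ should be approximately $\mathcal{N}(0, t)$. A minimal simulation sketch (the intensity, time point, and sample size below are illustrative choices, not values from the problem):

```python
import numpy as np

rng = np.random.default_rng(0)

c, t, n = 10_000.0, 2.0, 200_000  # intensity, time point, sample size (illustrative)

# N_c(t) ~ Poisson(c t), so X_c(t) = N_c(t) - c t
N = rng.poisson(c * t, size=n)
Z = (N - c * t) / np.sqrt(c)

# For large c, Z should be approximately N(0, t):
print(Z.mean())  # close to 0
print(Z.var())   # close to t
```

The mean and variance match the claimed limit exactly here (both hold for every $c$); it is the shape of the distribution that becomes Gaussian only as $c \to \infty$.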


BEST ANSWER

WLOG we can relabel the times so that $t_1 < t_2 < \ldots < t_k$.

The joint moment generating function of $c^{-1/2}(X_c(t_1), X_c(t_2), \ldots, X_c(t_k))$ is

$$ \begin{align} &M_{c^{-1/2}(X_c(t_1), X_c(t_2), \ldots, X_c(t_k))}(s_1, s_2, \ldots, s_k) \\ =&~ E\left[\exp\left\{\frac {1} {\sqrt{c}} \sum_{j=1}^k s_jX_c(t_j)\right\}\right] \\ =&~ E\left[\exp\left\{\frac {1} {\sqrt{c}} \sum_{j=1}^k s_j(N_c(t_j) - ct_j)\right\}\right] \\ =&~ \exp\left\{-\sqrt{c}\sum_{j=1}^k s_jt_j\right\}E\left[\exp\left\{\frac {1} {\sqrt{c}} \sum_{j=1}^k \sum_{l=1}^j s_{k-l+1} [N_c(t_{k-j+1}) - N_c(t_{k-j})]\right\}\right] \\ =&~ \exp\left\{-\sqrt{c}\sum_{j=1}^k s_jt_j\right\}\prod_{j=1}^k E\left[\exp\left\{\frac {1} {\sqrt{c}} \sum_{l=1}^j s_{k-l+1} [N_c(t_{k-j+1}) - N_c(t_{k-j})]\right\}\right] \\ =&~ \exp\left\{-\sqrt{c}\sum_{j=1}^k s_jt_j\right\}\prod_{j=1}^k \exp\left\{c(t_{k-j+1}-t_{k-j}) \left(\exp\left\{\frac {1} {\sqrt{c}} \sum_{l=1}^j s_{k-l+1} \right\} - 1\right)\right\} \\ =&~ \exp\left\{-\sqrt{c}\sum_{j=1}^k s_jt_j + c\sum_{j=1}^k (t_{k-j+1}-t_{k-j}) \sum_{m=1}^{\infty} \frac {1} {m!}\left(\frac {1} {\sqrt{c}} \sum_{l=1}^j s_{k-l+1}\right)^m \right\}\\ \end{align} $$ where $t_0 = 0$ is introduced in the 4th line. Note that

$$ \begin{align} &~\sum_{j=1}^k (t_{k-j+1}-t_{k-j}) \sum_{l=1}^j s_{k-l+1} \\ =&~ \sum_{j=1}^k t_{k-j+1}\sum_{l=1}^j s_{k-l+1} - \sum_{j=1}^kt_{k-j} \sum_{l=1}^j s_{k-l+1} \\ =&~ \sum_{j=0}^{k-1} t_{k-j}\sum_{l=1}^{j+1} s_{k-l+1} - \sum_{j=1}^k t_{k-j} \sum_{l=1}^j s_{k-l+1} \\ =&~ t_ks_k + \sum_{j=1}^{k-1} t_{k-j}s_{k-j} - t_0\sum_{l=1}^k s_{k-l+1} \\ =&~ \sum_{j=1}^{k} t_{j}s_{j} \end{align}$$

So we have $$\begin{align} &~ \exp\left\{-\sqrt{c}\sum_{j=1}^k s_jt_j + c\sum_{j=1}^k (t_{k-j+1}-t_{k-j}) \sum_{m=1}^{\infty} \frac {1} {m!}\left(\frac {1} {\sqrt{c}} \sum_{l=1}^j s_{k-l+1}\right)^m \right\}\\ =&~ \exp\Bigg\{-\sqrt{c}\sum_{j=1}^k s_jt_j + \sqrt{c}\sum_{j=1}^k (t_{k-j+1}-t_{k-j}) \sum_{l=1}^j s_{k-l+1} \\ &~ + \frac {1} {2} \sum_{j=1}^k (t_{k-j+1}-t_{k-j}) \left(\sum_{l=1}^j s_{k-l+1}\right)^2 + \sum_{j=1}^k (t_{k-j+1}-t_{k-j}) \sum_{m=3}^{\infty} \frac {1} {c^{m/2-1}} \frac {1} {m!}\left(\sum_{l=1}^j s_{k-l+1}\right)^m \Bigg\}\\ \end{align}$$ The first two terms cancel by the identity above, and the last term vanishes as $c \to \infty$. Therefore, $$ \lim_{c\to\infty}M_{c^{-1/2}(X_c(t_1), X_c(t_2), \ldots, X_c(t_k))} (s_1, s_2, \ldots, s_k) = \exp\left\{\frac {1} {2} \sum_{j=1}^k (t_{k-j+1}-t_{k-j}) \left(\sum_{l=1}^j s_{k-l+1}\right)^2\right\}$$
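The $k=1$ case of this limit can be confirmed symbolically. The log-MGF is $-\sqrt{c}\,st + ct(e^{s/\sqrt{c}}-1)$, and substituting $u = c^{-1/2}$ (a convenience introduced here, not part of the original derivation) reduces the limit $c \to \infty$ to $u \to 0^+$:

```python
import sympy as sp

s, t, u = sp.symbols('s t u', positive=True)

# log-MGF of c^{-1/2} X_c(t) for k = 1, rewritten with u = 1/sqrt(c):
# -sqrt(c) s t + c t (e^{s/sqrt(c)} - 1)  ->  -s t / u + t (e^{s u} - 1) / u**2
log_mgf = -s*t/u + t*(sp.exp(s*u) - 1)/u**2

# As c -> oo (i.e. u -> 0+), this should tend to t s^2 / 2
print(sp.limit(log_mgf, u, 0, '+'))
```

This matches the displayed limit with $k=1$, namely $\exp\{\tfrac12 t_1 s_1^2\}$.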

Using similar tricks, $$ \begin{align} &~ \sum_{j=1}^k (t_{k-j+1}-t_{k-j}) \left(\sum_{l=1}^j s_{k-l+1}\right)^2 \\ =&~ \sum_{j=0}^{k-1} t_{k-j}\left(\sum_{l=1}^{j+1} s_{k-l+1}\right)^2 - \sum_{j=1}^kt_{k-j} \left(\sum_{l=1}^j s_{k-l+1}\right)^2 \\ =&~ t_ks_k^2 + \sum_{j=1}^{k-1} t_{k-j}\left(2\sum_{l=1}^{j} s_{k-l+1} + s_{k-j}\right)s_{k-j} - t_0 \left(\sum_{l=1}^k s_{k-l+1}\right)^2 \\ =&~ \sum_{j=1}^k t_js_j^2 + 2 \sum_{j=1}^{k-1} \sum_{l=1}^{j} t_{k-j}s_{k-j}s_{k-l+1} \\ =&~ \sum_{j=1}^k t_js_j^2 + 2 \sum_{j=1}^{k-1} \sum_{l=j+1}^{k} t_{j}s_{j}s_{l} \\ \end{align}$$

On the other hand, $$ \begin{align} &~ \sum_{j=1}^k\sum_{l=1}^k s_js_l\min\{t_j, t_l\} \\ =&~ \sum_{j=1}^k t_js_j^2 + \sum_{j=2}^k\sum_{l=1}^{j-1} s_js_lt_l + \sum_{j=1}^{k-1} \sum_{l=j+1}^k s_js_lt_j \\ =&~ \sum_{j=1}^k t_js_j^2 + \sum_{l=1}^{k-1}\sum_{j=l+1}^{k} s_js_lt_l + \sum_{j=1}^{k-1} \sum_{l=j+1}^k s_js_lt_j \\ =&~ \sum_{j=1}^k t_js_j^2 + 2\sum_{j=1}^{k-1} \sum_{l=j+1}^k s_js_lt_j \\ \end{align} $$

As a result we conclude that $$ \lim_{c\to\infty}M_{c^{-1/2}(X_c(t_1), X_c(t_2), \ldots, X_c(t_k))} (s_1, s_2, \ldots, s_k) = \exp\left\{\frac {1} {2} \mathbf{s}^T\Sigma\mathbf{s}\right\}$$ where $$ \mathbf{s} = \begin{bmatrix} s_1 \\ s_2 \\ \vdots \\ s_k \end{bmatrix}, \Sigma_{jl} = \min\{t_j, t_l\} = t_{\min\{j, l\}} $$ i.e. $Y(t)$ is a Gaussian process such that the joint distribution of $Y(t_1), Y(t_2), \ldots, Y(t_k)$ is $\mathcal{N}(\mathbf{0}, \Sigma)$. Since $Y$ is centered with covariance $\operatorname{Cov}(Y(s), Y(t)) = \min\{s, t\}$, the limiting process is standard Brownian motion.
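The covariance structure $\Sigma_{jl} = \min\{t_j, t_l\}$ can also be checked numerically by simulating the rescaled process at a few time points through its independent Poisson increments. A small sketch, where the intensity, times, and path count are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

c = 5_000.0                      # intensity (illustrative)
ts = np.array([0.5, 1.0, 2.0])   # time points t_1 < t_2 < t_3 (illustrative)
n = 100_000                      # number of simulated paths

# Build N_c(t_j) from independent Poisson increments over (t_{j-1}, t_j]
dts = np.diff(np.concatenate(([0.0], ts)))
incs = rng.poisson(c * dts, size=(n, len(ts)))
N = np.cumsum(incs, axis=1)

# Rescaled compensated process c^{-1/2} X_c(t_j)
Z = (N - c * ts) / np.sqrt(c)

# The sample covariance should be close to Sigma_{jl} = min(t_j, t_l)
Sigma_hat = np.cov(Z, rowvar=False)
Sigma = np.minimum.outer(ts, ts)
print(np.max(np.abs(Sigma_hat - Sigma)))  # small for large c and n
```

In fact $\operatorname{Cov}\big(c^{-1/2}X_c(t_j),\, c^{-1/2}X_c(t_l)\big) = \min\{t_j, t_l\}$ holds exactly for every $c$, so the only discrepancy here is Monte Carlo noise.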


This answer requires some background knowledge on Lévy processes.

Theorem Let $(L_t^{(n)})_{t \geq 0}$ be a sequence of Lévy processes with characteristic exponents $\psi_n$. If there exists a Lévy process $(L_t)_{t \geq 0}$ with characteristic exponent $\psi$ such that $$\lim_{n \to \infty} \psi_n(\xi) = \psi(\xi) \qquad \text{for all} \, \, \xi \in \mathbb{R}^d, \tag{1}$$ then $L^{(n)} \to L$ in finite-dimensional distribution.

Proof: We prove by induction that $$(L_{t_1}^{(n)},\ldots,L_{t_k}^{(n)}) \xrightarrow[d]{n \to \infty} (L_{t_1},\ldots,L_{t_k}) \tag{2}$$ for any $k \in \mathbb{N}$ and $0 \leq t_1 < \ldots <t_k$.

  • $k=1$: By the Lévy-Khintchine formula, we have $$\mathbb{E} e^{i \xi L^{(n)}_t} = e^{ -t \psi_n(\xi)} \xrightarrow[]{n \to \infty} e^{-t \psi(\xi)} = \mathbb{E} e^{i \xi L_t},$$ and therefore it follows from Lévy's continuity theorem that $L_t^{(n)} \to L_t$ in distribution.
  • induction hypothesis: $(2)$ holds
  • inductive step ($k \to k+1$): By the independence of the increments, we have $$\begin{align*} &\quad \mathbb{E} \exp \left( i \sum_{j=1}^{k+1} \xi_j L_{t_j}^{(n)} \right) \\ &= \mathbb{E} \left[ \exp \left( i \sum_{j=1}^{k} \xi_j L_{t_j}^{(n)} + i \xi_{k+1} L_{t_k}^{(n)} \right) \mathbb{E} \left( \exp \left( i \xi_{k+1} (L_{t_{k+1}}^{(n)}-L_{t_k}^{(n)}) \right) \mid \mathcal{F}_{t_k} \right) \right] \\ &= \mathbb{E}\exp\left(i \xi_{k+1} (L_{t_{k+1}}^{(n)}-L_{t_k}^{(n)})\right) \mathbb{E} \exp \left( i \sum_{j=1}^{k} \xi_j L_{t_j}^{(n)} + i \xi_{k+1} L_{t_k}^{(n)} \right). \tag{3} \end{align*}$$ Using the induction hypothesis (for the second term) and the Lévy-Khintchine formula (for the first one), we get $$\begin{align*} \lim_{n \to \infty} \mathbb{E} \exp \left( i \sum_{j=1}^{k+1} \xi_j L_{t_j}^{(n)} \right) &= e^{-(t_{k+1}-t_k) \psi(\xi_{k+1})} \mathbb{E} \exp \left( i \sum_{j=1}^{k} \xi_j L_{t_j} + i \xi_{k+1} L_{t_k} \right) \\ &= \mathbb{E}e^{i \xi_{k+1} (L_{t_{k+1}}-L_{t_k})} \mathbb{E} \exp \left( i \sum_{j=1}^{k} \xi_j L_{t_j} + i \xi_{k+1} L_{t_k} \right). \end{align*}$$ By performing exactly the same calculations as in $(3)$ (but now backwards), we find that the right-hand side equals $$\mathbb{E} \exp \left( i \sum_{j=1}^{k+1} \xi_j L_{t_j} \right).$$ Applying Lévy's continuity theorem, we conclude $(L_{t_1}^{(n)},\ldots,L_{t_{k+1}}^{(n)}) \to (L_{t_1},\ldots,L_{t_{k+1}})$ in distribution. This finishes the proof.

Remark: The above theorem holds, under slightly stronger assumptions, in a similar fashion for Markov processes. The reason is that the finite-dimensional distributions of a Markov process $(X_t)_{t \geq 0}$ are uniquely determined by the one-dimensional distributions $\mathbb{P}^x(X_t \in \cdot)$, $t \geq 0$, $x \in \mathbb{R}^d$.


In order to solve your problem, it just remains to verify that

  • $X_c(t) =N_c(t)-ct$ is a Lévy process
  • the characteristic exponent of the rescaled process $c^{-1/2}X_c$ converges to $\psi(\xi) = |\xi|^2/2$, the exponent of standard Brownian motion, as $c \to \infty$.
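A sketch of the second bullet, writing $\psi_c$ for the characteristic exponent of the rescaled process $c^{-1/2}X_c$: from the Poisson characteristic function,

$$\mathbb{E}\, e^{i \xi c^{-1/2} X_c(t)} = \exp\left\{ ct \left( e^{i\xi/\sqrt{c}} - 1 \right) - i \sqrt{c}\, t \xi \right\} = e^{-t \psi_c(\xi)},$$

so that

$$\psi_c(\xi) = i\sqrt{c}\,\xi - c\left(e^{i\xi/\sqrt{c}} - 1\right) = i\sqrt{c}\,\xi - c\left(\frac{i\xi}{\sqrt{c}} - \frac{\xi^2}{2c} + O(c^{-3/2})\right) = \frac{\xi^2}{2} + O(c^{-1/2}) \xrightarrow{c \to \infty} \frac{\xi^2}{2}.$$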