I am trying to show the following:
Let $\{N_{c}(t)\}$ be a Poisson process with rate $c > 0$. Define the process $X_{c}(t)=N_{c}(t)-ct$ for $t \ge 0.$
Show that for every $k \in \mathbb{N}$ and $(t_1,\ldots,t_k) \in (\mathbb{R}^+)^k$, the random vector $\frac{1}{c^{1/2}}(X_{c}(t_1),\ldots,X_{c}(t_k))$ converges in distribution, as $c \to \infty$, to $(Y(t_1),\ldots,Y(t_k))$, where $\{Y(t)\}$ is a Gaussian process. Describe this process.
My thought is that this can be shown via characteristic functions, similarly to how the CLT is proven: show that the characteristic function of $c^{-1/2}(X_{c}(t_1),\ldots,X_{c}(t_k))$ converges pointwise to the characteristic function of a multivariate normal, for each $k$ and each choice of $(t_1,\ldots,t_k)$. This would suffice to show the limiting finite-dimensional distributions are Gaussian. I am struggling to actually carry out the algebra.
As for describing this limit, is it interpreted similarly to the CLT? Since the Poisson process is a counting process, does the limiting Gaussian process mean that, after centering and rescaling, the fluctuations of the Poisson process across many time points follow a multivariate normal distribution?
WLOG we can relabel the $t_j$ so that $0 < t_1 < t_2 < \ldots < t_k$.
The joint moment generating function of $c^{-1/2}(X_c(t_1), X_c(t_2), \ldots, X_c(t_k))$ is
$$ \begin{align} &M_{c^{-1/2}(X_c(t_1), X_c(t_2), \ldots, X_c(t_k))}(s_1, s_2, \ldots, s_k) \\ =&~ E\left[\exp\left\{\frac {1} {\sqrt{c}} \sum_{j=1}^k s_jX_c(t_j)\right\}\right] \\ =&~ E\left[\exp\left\{\frac {1} {\sqrt{c}} \sum_{j=1}^k s_j(N_c(t_j) - ct_j)\right\}\right] \\ =&~ \exp\left\{-\sqrt{c}\sum_{j=1}^k s_jt_j\right\}E\left[\exp\left\{\frac {1} {\sqrt{c}} \sum_{j=1}^k \sum_{l=1}^j s_{k-l+1} [N_c(t_{k-j+1}) - N_c(t_{k-j})]\right\}\right] \\ =&~ \exp\left\{-\sqrt{c}\sum_{j=1}^k s_jt_j\right\}\prod_{j=1}^k E\left[\exp\left\{\frac {1} {\sqrt{c}} \sum_{l=1}^j s_{k-l+1} [N_c(t_{k-j+1}) - N_c(t_{k-j})]\right\}\right] \\ =&~ \exp\left\{-\sqrt{c}\sum_{j=1}^k s_jt_j\right\}\prod_{j=1}^k \exp\left\{c(t_{k-j+1}-t_{k-j}) \left(\exp\left\{\frac {1} {\sqrt{c}} \sum_{l=1}^j s_{k-l+1} \right\} - 1\right)\right\} \\ =&~ \exp\left\{-\sqrt{c}\sum_{j=1}^k s_jt_j + c\sum_{j=1}^k (t_{k-j+1}-t_{k-j}) \sum_{m=1}^{\infty} \frac {1} {m!}\left(\frac {1} {\sqrt{c}} \sum_{l=1}^j s_{k-l+1}\right)^m \right\}\\ \end{align} $$ where $t_0 = 0$ is introduced in the 4th line, the 5th line uses the independence of Poisson increments, and the 6th line uses the Poisson MGF $E[e^{uN}] = \exp\{\lambda(e^u - 1)\}$ for $N \sim \text{Poisson}(\lambda)$, here with $\lambda = c(t_{k-j+1}-t_{k-j})$. Note that
$$ \begin{align} &~\sum_{j=1}^k (t_{k-j+1}-t_{k-j}) \sum_{l=1}^j s_{k-l+1} \\ =&~ \sum_{j=1}^k t_{k-j+1}\sum_{l=1}^j s_{k-l+1} - \sum_{j=1}^kt_{k-j} \sum_{l=1}^j s_{k-l+1} \\ =&~ \sum_{j=0}^{k-1} t_{k-j}\sum_{l=1}^{j+1} s_{k-l+1} - \sum_{j=1}^k t_{k-j} \sum_{l=1}^j s_{k-l+1} \\ =&~ t_ks_k + \sum_{j=1}^{k-1} t_{k-j}s_{k-j} - t_0\sum_{l=1}^k s_{k-l+1} \\ =&~ \sum_{j=1}^{k} t_{j}s_{j} \end{align}$$
So we have $$\begin{align} &~ \exp\left\{-\sqrt{c}\sum_{j=1}^k s_jt_j + c\sum_{j=1}^k (t_{k-j+1}-t_{k-j}) \sum_{m=1}^{\infty} \frac {1} {m!}\left(\frac {1} {\sqrt{c}} \sum_{l=1}^j s_{k-l+1}\right)^m \right\}\\ =&~ \exp\Bigg\{-\sqrt{c}\sum_{j=1}^k s_jt_j + \sqrt{c}\sum_{j=1}^k (t_{k-j+1}-t_{k-j}) \sum_{l=1}^j s_{k-l+1} \\ &~ + \frac {1} {2} \sum_{j=1}^k (t_{k-j+1}-t_{k-j}) \left(\sum_{l=1}^j s_{k-l+1}\right)^2 + \sum_{j=1}^k (t_{k-j+1}-t_{k-j}) \sum_{m=3}^{\infty} \frac {1} {c^{m/2-1}} \frac {1} {m!}\left(\sum_{l=1}^j s_{k-l+1}\right)^m \Bigg\}\\ \end{align}$$ The first two terms cancel by the identity above, and the last term vanishes as $c \to \infty$. Therefore, $$ \lim_{c\to\infty}M_{c^{-1/2}(X_c(t_1), X_c(t_2), \ldots, X_c(t_k))} (s_1, s_2, \ldots, s_k) = \exp\left\{\frac {1} {2} \sum_{j=1}^k (t_{k-j+1}-t_{k-j}) \left(\sum_{l=1}^j s_{k-l+1}\right)^2\right\}$$
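As a numerical sanity check (not part of the proof), the exact log-MGF from the derivation above can be compared with its claimed limit for increasing $c$; the gap should shrink like $c^{-1/2}$. A minimal sketch in Python, with illustrative choices of $t_j$ and $s_j$:

```python
import numpy as np

def exact_log_mgf(s, t, c):
    """log-MGF of c^{-1/2}(X_c(t_1), ..., X_c(t_k)), per the derivation above."""
    k = len(t)
    te = np.concatenate(([0.0], t))               # te[j] = t_j with t_0 = 0
    out = -np.sqrt(c) * np.dot(s, t)
    for j in range(1, k + 1):
        partial = sum(s[k - l] for l in range(1, j + 1))   # sum_{l=1}^j s_{k-l+1}
        out += c * (te[k - j + 1] - te[k - j]) * (np.exp(partial / np.sqrt(c)) - 1.0)
    return out

def limit_log_mgf(s, t):
    """Claimed limit: (1/2) sum_j (t_{k-j+1}-t_{k-j}) (sum_{l<=j} s_{k-l+1})^2."""
    k = len(t)
    te = np.concatenate(([0.0], t))
    return 0.5 * sum((te[k - j + 1] - te[k - j])
                     * sum(s[k - l] for l in range(1, j + 1)) ** 2
                     for j in range(1, k + 1))

t = np.array([0.5, 1.0, 2.0])                     # illustrative ordered times
s = np.array([0.3, -0.2, 0.1])
for c in (1e2, 1e4, 1e6):
    print(c, exact_log_mgf(s, t, c) - limit_log_mgf(s, t))   # gap shrinks with c
```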
Using similar tricks, $$ \begin{align} &~ \sum_{j=1}^k (t_{k-j+1}-t_{k-j}) \left(\sum_{l=1}^j s_{k-l+1}\right)^2 \\ =&~ \sum_{j=0}^{k-1} t_{k-j}\left(\sum_{l=1}^{j+1} s_{k-l+1}\right)^2 - \sum_{j=1}^kt_{k-j} \left(\sum_{l=1}^j s_{k-l+1}\right)^2 \\ =&~ t_ks_k^2 + \sum_{j=1}^{k-1} t_{k-j}\left(2\sum_{l=1}^{j} s_{k-l+1} + s_{k-j}\right)s_{k-j} - t_0 \left(\sum_{l=1}^k s_{k-l+1}\right)^2 \\ =&~ \sum_{j=1}^k t_js_j^2 + 2 \sum_{j=1}^{k-1} \sum_{l=1}^{j} t_{k-j}s_{k-j}s_{k-l+1} \\ =&~ \sum_{j=1}^k t_js_j^2 + 2 \sum_{j=1}^{k-1} \sum_{l=j+1}^{k} t_{j}s_{j}s_{l} \\ \end{align}$$
On the other hand, $$ \begin{align} &~ \sum_{j=1}^k\sum_{l=1}^k s_js_l\min\{t_j, t_l\} \\ =&~ \sum_{j=1}^k t_js_j^2 + \sum_{j=2}^k\sum_{l=1}^{j-1} s_js_lt_l + \sum_{j=1}^{k-1} \sum_{l=j+1}^k s_js_lt_j \\ =&~ \sum_{j=1}^k t_js_j^2 + \sum_{l=1}^{k-1}\sum_{j=l+1}^{k} s_js_lt_l + \sum_{j=1}^{k-1} \sum_{l=j+1}^k s_js_lt_j \\ =&~ \sum_{j=1}^k t_js_j^2 + 2\sum_{j=1}^{k-1} \sum_{l=j+1}^k s_js_lt_j \\ \end{align} $$
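The equality of these two quadratic forms can also be confirmed numerically for random ordered $t_j$ and arbitrary $s_j$ (again a sanity check only, assuming nothing beyond the formulas above):

```python
import numpy as np

rng = np.random.default_rng(0)
k = 6
t = np.sort(rng.uniform(0.1, 3.0, size=k))    # ordered times t_1 < ... < t_k
s = rng.normal(size=k)
te = np.concatenate(([0.0], t))               # prepend t_0 = 0

# sum_j (t_{k-j+1} - t_{k-j}) (sum_{l<=j} s_{k-l+1})^2
lhs = sum((te[k - j + 1] - te[k - j])
          * sum(s[k - l] for l in range(1, j + 1)) ** 2
          for j in range(1, k + 1))

# sum_{j,l} s_j s_l min(t_j, t_l), i.e. s^T Sigma s
rhs = s @ np.minimum.outer(t, t) @ s

print(abs(lhs - rhs))    # zero up to floating-point round-off
```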
As a result we conclude that $$ \lim_{c\to\infty}M_{c^{-1/2}(X_c(t_1), X_c(t_2), \ldots, X_c(t_k))} (s_1, s_2, \ldots, s_k) = \exp\left\{\frac {1} {2} \mathbf{s}^T\Sigma\mathbf{s}\right\}$$ where $$ \mathbf{s} = \begin{bmatrix} s_1 \\ s_2 \\ \vdots \\ s_k \end{bmatrix}, \Sigma_{jl} = \min\{t_j, t_l\} = t_{\min\{j, l\}} $$ i.e. $Y(t)$ is a Gaussian process such that the joint distribution of $Y(t_1), Y(t_2), \ldots, Y(t_k)$ is $\mathcal{N}(\mathbf{0}, \Sigma)$. In other words, $Y$ is a centered Gaussian process with $\operatorname{Cov}(Y(s), Y(t)) = \min\{s, t\}$ — these are exactly the finite-dimensional distributions of standard Brownian motion.
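To see this covariance structure emerge in practice, one can simulate the centered, scaled Poisson process from independent increments and compare its empirical covariance matrix with $\Sigma_{jl} = \min\{t_j, t_l\}$. A Monte Carlo sketch, where the value of $c$, the sample size, and the time grid are all illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)
c, n = 400.0, 200_000
t = np.array([0.5, 1.0, 2.0])
gaps = np.diff(np.concatenate(([0.0], t)))     # increment lengths t_j - t_{j-1}

# N_c(t_j) built from independent Poisson increments, then centered and scaled
incr = rng.poisson(lam=c * gaps, size=(n, len(t)))
X = (np.cumsum(incr, axis=1) - c * t) / np.sqrt(c)

emp = np.cov(X, rowvar=False)
print(np.round(emp, 2))                        # approx. min(t_j, t_l)
print(np.minimum.outer(t, t))
```

Note that the covariance of $c^{-1/2}X_c$ equals $\min\{t_j, t_l\}$ exactly for every $c$ (centering and scaling preserve the Poisson covariance); only the Gaussianity of the marginals is asymptotic as $c \to \infty$.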