$E[N^{r}(t)]<\infty$ for all $t\geq0,\ r\geq0$.


Proposition 3.2.2 from Ross' Stochastic Processes says that $m(t)<\infty$ for all $0\leq t<\infty$, where $$m(t)=E[N(t)],\ N(t)=\sup\{n:S_{n}\leq t\},\ S_{n}=\sum_{i=1}^{n}X_{i},$$ and $\{X_{n},n=1,2,\cdots\}$ is a sequence of nonnegative independent random variables with a common distribution $F$. To avoid trivialities, suppose that $F(0)=P\{X_{n}=0\}<1$. Below is the proof:
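As a sanity check of the claim $m(t)<\infty$, here is a small Monte Carlo sketch. The distribution is my own choice, not from the book: with $X_i\sim\mathrm{Exp}(1)$ the renewal process is Poisson and $m(t)=t$ exactly, so the estimate should land near $t$.

```python
import random

# Monte Carlo estimate of m(t) = E[N(t)] for a renewal process.
# Assumption (not from the text): X_i ~ Exponential(1), for which
# N(t) is Poisson(t) and the renewal function is m(t) = t exactly.
random.seed(1)

def n_renewals(t):
    """N(t) = sup{n : S_n <= t} for i.i.d. Exp(1) interarrival times."""
    s, n = 0.0, 0
    while True:
        s += random.expovariate(1.0)
        if s > t:
            return n
        n += 1

t, trials = 5.0, 40000
m_est = sum(n_renewals(t) for _ in range(trials)) / trials
```

With these parameters the estimate should be close to $m(5)=5$, up to Monte Carlo noise.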

Since $P\{X_{n}=0\}<1$, it follows by the continuity property of probabilities that there exists an $\alpha>0$ such that $P\{X_{n}\geq\alpha\}>0$. Now define a related renewal process with interarrival times $\{\bar{X}_{n},n\geq1\}$ given by $$\bar{X}_{n}=\begin{cases} 0 & \text{if}\ X_{n}<\alpha,\\ \alpha & \text{if}\ X_{n}\geq\alpha,\\ \end{cases}$$ and let $\bar{N}(t)=\sup\{n:\bar{X}_{1}+\cdots+\bar{X}_{n}\leq t\}$. For this related process, renewals can take place only at the times $t=n\alpha,\ n=0,1,2,\cdots$, and the numbers of renewals at these times are independent geometric random variables with mean $\frac{1}{P\{X_{n}\geq\alpha\}}$. Thus $$E[\bar{N}(t)]\leq\dfrac{t/\alpha+1}{P\{X_{n}\geq\alpha\}}<\infty,$$ and the result follows since $\bar{X}_{n}\leq X_{n}$ implies that $\bar{N}(t)\geq N(t)$.
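The bound on $E[\bar{N}(t)]$ can also be checked numerically. The following sketch again assumes (my choice, not the book's) $X_i\sim\mathrm{Exp}(1)$, so that $P\{X_n\geq\alpha\}=e^{-\alpha}$, and simulates the coarsened process $\bar{X}_n$.

```python
import math
import random

# Monte Carlo check of the bound E[N_bar(t)] <= (t/alpha + 1)/P{X >= alpha}
# from the proof above. Assumption (not from the text): X_i ~ Exponential(1),
# so P{X_n >= alpha} = exp(-alpha).
random.seed(0)
alpha, t, trials = 0.5, 10.0, 20000
p = math.exp(-alpha)  # P{X_n >= alpha}

def n_bar(t, alpha):
    """N_bar(t) for the coarsened process: X_bar = alpha if X >= alpha, else 0."""
    s, n = 0.0, 0
    while True:
        x = random.expovariate(1.0)
        s += alpha if x >= alpha else 0.0
        if s > t:
            return n
        n += 1

est = sum(n_bar(t, alpha) for _ in range(trials)) / trials
bound = (t / alpha + 1) / p
```

For these parameters the estimate should come out below, but not far below, the bound: the inequality is close to tight because almost every lattice time $k\alpha\leq t$ carries a full geometric count.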

But I'm confused by the following remark, which says the above proof also shows that $E[N^{r}(t)]<\infty$ for all $t\geq0,\ r\geq0$. Any help would be appreciated.



Accepted answer:

The partial sums $\overline{S}_n=\sum_{1\leq k\leq n}\overline{X}_k$ are nondecreasing and take values in $\{0,\alpha,2\alpha,...\}$. The counting process $\overline{N}(t)=\sup\{n:\overline{S}_n\leq t\}$ therefore satisfies $$\overline{N}(t)=\overline{N}(0)+\sum_{1\leq k\leq n}\bigl(\overline{N}(k\alpha)-\overline{N}((k-1)\alpha)\bigr),\qquad t\in[n\alpha,(n+1)\alpha),$$ with the empty sum for $t\in[0,\alpha)$. Now, since $\overline{N}(0)=0$ whenever $\overline{X}_1=\alpha$, the count at time $0$ may vanish, and we get $$\overline{N}(0)\sim \textrm{Geom}_{\mathbb{N}_0}(P(X_1\geq \alpha))\implies E[\overline{N}(0)]=\frac{P(X_1<\alpha)}{P(X_1\geq \alpha)},$$ $$\overline{N}(k\alpha)-\overline{N}((k-1)\alpha)\sim \textrm{Geom}_{\mathbb{N}}(P(X_1\geq \alpha))\implies E[\overline{N}(k\alpha)-\overline{N}((k-1)\alpha)]=\frac{1}{P(X_1\geq \alpha)}.$$ Now we use the elementary inequality $|x_1+x_2+...+x_n|^p\leq n^{p}(|x_1|^p+|x_2|^p+...+|x_n|^p)$ together with the fact that geometric random variables have finite moments of every order: this bounds $E[\overline{N}(t)^r]$, and since $N(t)\leq\overline{N}(t)$, we conclude that $E[N(t)^r]<\infty$.
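Spelling out the final step as a worked chain of inequalities (a sketch, with $n=\lfloor t/\alpha\rfloor$; for non-integer $r$ one can first use $N^r\leq 1+N^{\lceil r\rceil}$):

```latex
E\bigl[N(t)^{r}\bigr]
  \;\le\; E\bigl[\overline{N}(t)^{r}\bigr]
  \;\le\; (n+1)^{r}\Bigl( E\bigl[\overline{N}(0)^{r}\bigr]
      + \sum_{k=1}^{n}
        E\Bigl[\bigl(\overline{N}(k\alpha)-\overline{N}((k-1)\alpha)\bigr)^{r}\Bigr]
      \Bigr)
  \;<\;\infty,
```

where the first inequality is $N(t)\leq\overline{N}(t)$, the second is the decomposition of $\overline{N}(t)$ into $n+1$ summands combined with $|x_1+\cdots+x_{n+1}|^r\leq(n+1)^r\sum_i|x_i|^r$, and the last holds because each summand is the $r$-th moment of a geometric random variable.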