Integrating probabilities


The following problem is of a general nature; here is an example to illustrate it.

For example, let $\left(\xi_i\right)_{i \geq 1}$ be independent and identically $\operatorname{Exp}(1)$-distributed random variables. We define for $t\geq 0$, $$ X\left(t\right) = \max \left\{n\geq 0: \sum_{k=1}^n \xi_k \leq t \right\}. $$ Of course $S_n:=\sum_{k=1}^n \xi_k$ is Gamma-distributed with shape $n$ and rate $1$. One wants to show that $X\left(t\right)$ is a Poisson process and computes $P\left(X\left(t\right)=n\right)= P\left(S_n \leq t, S_{n+1} > t \right) = P\left(S_n \leq t, S_n + \xi_{n+1} > t \right).$ Now comes the kind of transformation I have never understood (because it is not ordinary $\sigma$-additivity):

$$ P\left(S_n \leq t, S_n + \xi_{n+1} > t \right) = \int_0^t f_{S_n}\left(y\right) P\left(\xi_{n+1} > t-y \right) \text{d}y. $$

(Here $f_{S_n}$ is of course the density of $S_n$.) Could someone explain to me why these expressions are equal?
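For intuition (not a proof), the claimed Poisson law can be checked by simulation; the choices $t = 2$ and the number of trials below are arbitrary:

```python
import math
import random

random.seed(0)

def sample_X(t):
    """Sample X(t) = max{n >= 0 : S_n <= t} for i.i.d. xi_i ~ Exp(1)."""
    s, n = 0.0, 0
    while True:
        s += random.expovariate(1.0)  # add the next xi_{n+1}
        if s > t:
            return n
        n += 1

t, trials = 2.0, 200_000
counts = {}
for _ in range(trials):
    k = sample_X(t)
    counts[k] = counts.get(k, 0) + 1

# Compare empirical frequencies with the Poisson(t) pmf e^{-t} t^n / n!.
for n in range(6):
    emp = counts.get(n, 0) / trials
    pmf = math.exp(-t) * t**n / math.factorial(n)
    print(f"n={n}: empirical {emp:.4f}  vs  Poisson pmf {pmf:.4f}")
```

The empirical frequencies should match the Poisson$(t)$ probabilities up to Monte Carlo noise.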

A similar problem is the following. Suppose $\left(X\left(t\right)\right)_{t \geq 0}$ is a continuous-time Markov process with values in a countable set $E$ and right-continuous sample paths. I already know that the time spent in each state $i \in E$ (before transitioning to some other state) is exponentially distributed with some parameter $q\left(i\right) \geq 0.$ With the definition $\omega\left(j\right) = \inf \left\{t > 0: X\left(t\right)=j,\ X\left(t-0\right) \neq j \right\}$, I want to show for $i \neq j$ that $ P_i \left(X\left(t\right) = j \right) >0 $ if and only if $ P_i\left(\omega\left(j\right) \leq t \right) > 0. $ Now why is one allowed to do a calculation like the following?

\begin{align*} P_i\left(X\left(t\right) = j \right) &= \int_0^t P_i\left(X\left(t\right) = j, \omega\left(j\right) = \text{d}s\right) \\ &\geq P_i\left(X\left(u\right) = j \text{ for all } u \in \left[ds,t\right] | X\left(ds\right) = j \right) \cdot P_i\left(\omega\left(j\right) = \text{d}s \right) \\ &\geq e^{-q\left(j\right)t} \cdot P_i\left(\omega\left(j\right) \leq t \right) > 0. \end{align*}

Particularly in the second case, it is not clear to me how, going back to the definition of the Lebesgue integral, one can apply the Markov property in a clean way.

I would be very grateful for any answers.


Re your first question, note that for any independent random variables $X$ and $Y$, where $X$ has density $f$, and any events $A$ and $B$, $$P(X\in A,X+Y\in B)=\int_Af(x)P(Y+x\in B)\mathrm dx.$$ To show this, note that, in full generality, $$P(X\in A,X+Y\in B)=\iint \mathbf 1_{x\in A}\mathbf 1_{x+y\in B}\mathrm dP_{(X,Y)}(x,y),$$ hence, by the independence of $X$ and $Y$, $$P(X\in A,X+Y\in B)=\int \mathbf 1_{x\in A}\left(\int\mathbf 1_{x+y\in B}\mathrm dP_Y(y)\right)\mathrm dP_X(x).$$ The inner integral is $P(x+Y\in B)$, and $P_X$ has density $f$, hence the proof is complete.
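As a numerical sanity check of this identity in the Poisson setting (a sketch, not part of the proof; the parameters $n=3$, $t=2$ and the midpoint-rule discretization are my own choices), one can compare a Monte Carlo estimate of the left side with the integral on the right side, and with the Poisson pmf $e^{-t}t^n/n!$ the computation is heading for:

```python
import math
import random

random.seed(1)

n, t = 3, 2.0

# Left side: Monte Carlo estimate of P(S_n <= t, S_n + xi_{n+1} > t).
trials = 200_000
hits = 0
for _ in range(trials):
    s_n = sum(random.expovariate(1.0) for _ in range(n))
    if s_n <= t and s_n + random.expovariate(1.0) > t:
        hits += 1
mc = hits / trials

# Right side: midpoint-rule approximation of
#   int_0^t f_{S_n}(y) * P(xi > t - y) dy,
# with f_{S_n}(y) = y^{n-1} e^{-y} / (n-1)!  (Gamma(n, 1) density)
# and P(xi > t - y) = e^{-(t-y)}.
steps = 10_000
h = t / steps
integral = 0.0
for k in range(steps):
    y = (k + 0.5) * h
    f = y ** (n - 1) * math.exp(-y) / math.factorial(n - 1)
    integral += f * math.exp(-(t - y)) * h

exact = math.exp(-t) * t**n / math.factorial(n)
print(f"Monte Carlo {mc:.4f}, integral {integral:.4f}, Poisson pmf {exact:.4f}")
```

All three numbers agree up to simulation noise, which is exactly what the identity predicts.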

Re your second question, there are some serious misprints in your computations, but, considering the hitting time $T_j$ of $j$ (your $\omega(j)$), one has $$P_i(X_t=j)=P_i(X_t=j,T_j\leqslant t)=\int_0^tP_i(X_t=j\mid T_j=s)\mathrm dP_{T_j}(s).$$ For every $s\leqslant t$, $$P_i(X_t=j\mid T_j=s)\geqslant P_i(\forall u\in[s,t],X_u=j\mid T_j=s).$$ The Markov property at time $s$ indicates that the RHS is equal to $$P_j(\forall u\in[0,t-s],X_u=j)=\mathrm e^{-q(j)(t-s)}\geqslant\mathrm e^{-q(j)t},$$ hence $$P_i(X_t=j)\geqslant\int_0^t\mathrm e^{-q(j)t}\mathrm dP_{T_j}(s)=\mathrm e^{-q(j)t}P_i(T_j\leqslant t).$$
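To illustrate the final inequality, here is a small simulation sketch with a hypothetical two-state chain (states $0$ and $1$, leaving rates $q(0)=1$ and $q(1)=0.5$, chosen arbitrarily); it estimates $P_0(X_t=1)$ and $\mathrm e^{-q(1)t}P_0(T_1\leqslant t)$ and checks that the first dominates the second:

```python
import math
import random

random.seed(2)

# Hypothetical two-state chain: rate q[i] of leaving state i,
# each jump going to the other state.
q = {0: 1.0, 1: 0.5}
t = 1.5
trials = 200_000

at_j = 0   # paths with X_t = 1
hit_j = 0  # paths with T_1 <= t
for _ in range(trials):
    state, clock, hit = 0, 0.0, False
    while True:
        clock += random.expovariate(q[state])  # exponential holding time
        if clock > t:
            break  # `state` is the state occupied at time t
        state = 1 - state
        if state == 1:
            hit = True
    at_j += (state == 1)
    hit_j += hit

lhs = at_j / trials                          # P_0(X_t = 1)
rhs = math.exp(-q[1] * t) * hit_j / trials   # e^{-q(1) t} P_0(T_1 <= t)
print(f"P_0(X_t=1) = {lhs:.4f}  >=  e^(-q(1)t) P_0(T_1<=t) = {rhs:.4f}")
```

For a two-state chain the left side is also known in closed form, $P_{01}(t)=\frac{q(0)}{q(0)+q(1)}\left(1-\mathrm e^{-(q(0)+q(1))t}\right)$, so the estimate can be cross-checked as well.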