What is the distribution of the difference of two random variables?


Definition) A stochastic process $\{X(t), t \geqslant 0\}$ is said to be Brownian motion process with drift coefficient $\mu$ and variance parameter $\sigma^2$, if it satisfies that

  1. $X(0)=0$.
  2. $\{X(t), t \geqslant 0\}$ has independent and stationary increments.
  3. For every $t \gt 0$, $X(t) \sim N(\mu t, \sigma^2 t)$.

Let $\{X(t), t \geqslant 0\}$ be a Brownian motion process with drift coefficient $0$ and variance parameter $1$. Then, for every $t$, $X(t)$ is normally distributed with mean $0$ and variance $t$.

According to the text, for $s \lt t$, $X(t)-X(s)$ is normally distributed with mean $0$ and variance $t-s$.

Can someone explain why this is?


My attempt:

First, since $X(t)$ and $X(s)$ are jointly normally distributed, any linear combination of them is also normally distributed, even though they are dependent on each other.

By using the fact that $E[X(t)]=0$ and $Var[X(t)]=E[X^2(t)]=t$, \begin{align} E\left[X(t)-X(s)\right] &= E\left[X(t)\right] - E\left[X(s)\right] = 0 \\ Var\left[X(t)-X(s)\right] &= E\left[ \left\{X(t)-X(s)\right\}^2 \right] \\ &= E\left[ X^2(t) - 2X(t)X(s) + X^2(s) \right] \\ &= E\left[ X^2(t)\right] + E\left[X^2(s) \right] - 2E\left[X(t)X(s)\right] \\ &= t+s - 2E\left[X(t)X(s)\right] \\ &= t+s - 2\iint_{\Bbb{R}^2} xy f_{X(t), X(s)}(x, y) \, dx \, dy \end{align}

Since $X(t)$ is not independent of $X(s)$, I cannot write $f_{X(t), X(s)}(x, y)$ as $$f_{X(t)}(x) f_{X(s)}(y)=\frac{1}{2\pi\sqrt{ts}}\exp{\left\{ -\frac{x^2}{2t} -\frac{y^2}{2s} \right\}}$$

Please let me know why $$X(t)-X(s) \sim N(0, t-s)$$
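(As a numerical sanity check, the claimed variance can be verified by simulation. This is only a sketch: it assumes a standard Brownian motion built as cumulative sums of independent $N(0, \Delta t)$ increments, and the grid size, sample count, and time points $s=0.3$, $t=0.8$ are arbitrary choices.)

```python
import numpy as np

rng = np.random.default_rng(0)

n_paths, n_steps, T = 100_000, 1_000, 1.0
dt = T / n_steps

# Standard Brownian motion: cumulative sums of independent N(0, dt) increments.
steps = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
X = np.cumsum(steps, axis=1)

s, t = 0.3, 0.8
i_s, i_t = int(s / dt) - 1, int(t / dt) - 1
diff = X[:, i_t] - X[:, i_s]

print(diff.mean())  # close to 0
print(diff.var())   # close to t - s = 0.5
```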


There are 2 answers below.

On BEST ANSWER

To get $E[X(s)X(t)]$, you could do \begin{align*} E[X(s)X(t)]& = E[X(s)\{X(t)-X(s)+X(s)\}]\\ &=E[X(s)\{X(t)-X(s)\}]+E[\{X(s)\}^2]\\ &=E[X(s)]E[X(t)-X(s)]+\text{Var}(X(s))+\{E[X(s)]\}^2\tag 1\\ &=0\cdot 0+ s+0^2\\ &=s \end{align*} where $(1)$ is true since $X(s)$ and $X(t)-X(s)$ are independent.
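(A quick Monte Carlo check of this covariance. The time points $s=0.4$, $t=1.0$ and the construction of $X(t)$ from independent pieces are illustrative choices, not part of the answer itself:)

```python
import numpy as np

rng = np.random.default_rng(1)

n_paths = 200_000
s, t = 0.4, 1.0

# Build X(s) and X(t) from independent increments:
# X(t) = X(s) + [X(t) - X(s)], with the two terms independent.
Xs = rng.normal(0.0, np.sqrt(s), size=n_paths)        # X(s) ~ N(0, s)
incr = rng.normal(0.0, np.sqrt(t - s), size=n_paths)  # X(t)-X(s) ~ N(0, t-s)
Xt = Xs + incr

print(np.mean(Xs * Xt))  # close to E[X(s)X(t)] = s = 0.4
```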


Hint: The moment generating function of a sum of independent random variables has an interesting property.

$$\begin{align}\mathsf M_X(z) :=& ~ \mathsf E(\mathsf e^{zX})\\[2ex] \mathsf M_{X+Y}(z) =&~ \mathsf E(\mathsf e^{z(X+Y)}) \\[1ex] \mathop{=}^{\small X\perp Y}&~ \mathsf E(\mathsf e^{zX})~\mathsf E(\mathsf e^{zY}) \\[1ex]=&~ \mathsf M_X(z)~\mathsf M_Y(z)\end{align}$$

Further, the moment generating function of a normally distributed random variable, $X\sim\mathcal N(\mu_X, \sigma_X^2)$, is known: $$\mathsf M_X(z) = \mathsf e^{z\mu_X+z^2\sigma_X^2/2}$$


Now $X(s)$ and $X(t)$ are not independent; however, $X(s)$ is independent of $[X(t)-X(s)]$ by property (2), "$\{X(t): t\geqslant 0\}$ has independent and stationary increments": they are increments over disjoint time intervals.

Also $X(s)\sim \mathcal N(s\mu, s\sigma^2)$ and $X(t)\sim \mathcal N(t\mu, t\sigma^2)$ due to property (3).

The moment generating function of a sum of independent random variables is the product of their moment generating functions, so:

$$\mathsf M_{X(t)}(z) = \mathsf M_{[X(t)-X(s)]}(z)~\mathsf M_{X(s)}(z)$$

Find and identify $\mathsf M_{[X(t)-X(s)]}(z)$ .
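(One way to carry the hint through symbolically — a sketch using SymPy, assuming the zero-drift, unit-variance case from the question, so $\mu=0$ and $\sigma^2=1$:)

```python
import sympy as sp

z, s, t = sp.symbols('z s t', positive=True)

# MGF of N(mu, sigma^2) is exp(z*mu + z**2*sigma**2/2); here mu = 0, sigma^2 = 1.
M_Xt = sp.exp(z**2 * t / 2)  # X(t) ~ N(0, t)
M_Xs = sp.exp(z**2 * s / 2)  # X(s) ~ N(0, s)

# Since X(t) = X(s) + [X(t)-X(s)] with the two terms independent,
# M_{X(t)}(z) = M_{X(s)}(z) * M_{X(t)-X(s)}(z), so divide:
M_diff = sp.simplify(M_Xt / M_Xs)

# M_diff equals exp(z**2*(t - s)/2), which is the MGF of N(0, t - s).
print(sp.simplify(M_diff - sp.exp(z**2 * (t - s) / 2)))  # 0
```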