Probabilistic exponential growth model

I have a real-valued number $y_t$. At each time step $t$, $y_t$ is multiplied by $(1 + \epsilon)$ with probability $p$ and multiplied by $(1 - \epsilon)$ with probability $1 - p$. What is the expected value of $y_{t+n}$? What is the variance?

I know there must be a type of model for this, maybe some sort of random walk?

There are 3 best solutions below

BEST ANSWER

It is quite simple if you use independence in a more direct way, and the method works for any distribution. Assume that $y_n=\prod_{1\leq k\leq n} X_k$, where the $X_k$ are i.i.d. factors. By independence: $$ {\Bbb E} y_n = {\Bbb E} \prod_k X_k = \prod_k {\Bbb E} X_k=({\Bbb E} X_k)^{n},$$ valid for any distribution. In the specific (Bernoulli) example: ${\Bbb E}X_k = (1+\epsilon)p + (1-\epsilon) (1-p)$. Similarly, $${\Bbb E} y_n^2 = {\Bbb E} \prod_k X_k^2 = \prod_k {\Bbb E} X_k^2=({\Bbb E} X_k^2)^{n},$$ again valid for any distribution. In our case: ${\Bbb E} X_k^2 = (1+\epsilon)^2 p + (1-\epsilon)^2 (1-p)$. In particular, ${\rm var}\, y_n = ({\Bbb E} X_k^2)^n - ({\Bbb E} X_k)^{2n}$, and you may carry on from there to calculate limits etc. (e.g. $n\rightarrow \infty$, $\epsilon n\rightarrow \lambda$ gives a nice limit). The actual distribution of $y_n$ is in general quite complicated.
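As a numeric sanity check (my addition, not part of the answer; the parameter values are arbitrary), one can sum exactly over the binomial distribution of the number of up-moves and confirm both closed forms $({\Bbb E}X_k)^n$ and $({\Bbb E}X_k^2)^n$, plus the resulting variance:

```python
# Sanity check of the closed forms above (arbitrary example parameters).
from math import comb

p, eps, n = 0.3, 0.2, 12
EX  = (1 + eps) * p + (1 - eps) * (1 - p)        # E[X_k]
EX2 = (1 + eps)**2 * p + (1 - eps)**2 * (1 - p)  # E[X_k^2]

# Exact expectations: sum over k up-moves, weighted by binomial probabilities.
Ey  = sum(comb(n, k) * p**k * (1 - p)**(n - k)
          * (1 + eps)**k * (1 - eps)**(n - k) for k in range(n + 1))
Ey2 = sum(comb(n, k) * p**k * (1 - p)**(n - k)
          * (1 + eps)**(2 * k) * (1 - eps)**(2 * (n - k)) for k in range(n + 1))

assert abs(Ey - EX**n) < 1e-12    # E[y_n] = (E[X_k])^n
assert abs(Ey2 - EX2**n) < 1e-12  # E[y_n^2] = (E[X_k^2])^n
var = EX2**n - EX**(2 * n)        # var y_n
print(Ey, Ey2, var)
```

The agreement is exact (up to floating-point rounding), since the sum over up-moves is just the binomial expansion of the product formula.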

ANSWER

Assuming independence and regarding the expectation:

For the sake of simplicity, let $y_0=1$. The value of our variable after the $n$th step is

$$(1+\varepsilon)^k(1-\varepsilon)^{n-k}$$

with probability $${n \choose k}p^k(1-p)^{n-k}. $$

(Where $k$ denotes the number of multiplications by $1+\varepsilon$.)

The expected value is then

$$\sum_{k=0}^n(1+\varepsilon)^k(1-\varepsilon)^{n-k}{n \choose k}p^k(1-p)^{n-k}=$$ $$=\sum_{k=0}^n{n \choose k}\left[(1+\varepsilon)p\right]^k\left[(1-\varepsilon)(1-p)\right]^{n-k}=$$ $$=\left[(1+\varepsilon)p+(1-\varepsilon)(1-p)\right]^n=\left[1+2\varepsilon p-\varepsilon\right]^n,$$

because of the binomial theorem.
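This closed form can also be checked by direct simulation; the sketch below (my addition, with arbitrarily chosen parameters) compares an empirical mean of simulated paths against $[1+2\varepsilon p-\varepsilon]^n$:

```python
# Monte Carlo check of E[y_n] = (1 + 2*eps*p - eps)^n (arbitrary parameters).
import random

random.seed(0)
p, eps, n, trials = 0.5, 0.1, 20, 200_000

total = 0.0
for _ in range(trials):
    y = 1.0
    for _ in range(n):
        # Multiply by (1 + eps) with probability p, else by (1 - eps).
        y *= (1 + eps) if random.random() < p else (1 - eps)
    total += y

empirical = total / trials
theory = (1 + 2 * eps * p - eps) ** n  # equals 1 for p = 1/2
print(empirical, theory)
assert abs(empirical - theory) < 0.02
```

With these parameters the standard error of the empirical mean is on the order of $10^{-3}$, so the tolerance is comfortably wide.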

Unfortunately I could not find the trick for the calculation of the expectation of the square of the same random variable...

ANSWER

Following up zoli's answer, I think $\mathbb{E}(y_{n}^{2})$ can be found the same way, with the binomial theorem yielding (assuming $y_0=1$)

$$\mathbb{E}(y_{n}^{2})=\sum_{k=0}^n{n \choose k}\left[(1+\epsilon)^2p\right]^k\left[(1-\epsilon)^2(1-p)\right]^{n-k}=[(1+\epsilon)^2p+(1-\epsilon)^{2}(1-p)]^{n}=[(1-\epsilon)^2+4\epsilon p]^{n}.$$

So the variance is $\sigma_{n}^2=[(1-\epsilon)^2+4\epsilon p]^{n}-[1+2\epsilon p-\epsilon]^{2n}$. I don't know if it can be simplified any further.

Quick check: as $\epsilon\rightarrow 0$, $\sigma_{n}\rightarrow 0$.
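That quick check can be illustrated numerically; the sketch below (my addition, with arbitrary $p$ and $n$) evaluates $\sigma_n^2$ for shrinking $\epsilon$ and confirms it stays positive while tending to $0$:

```python
# Numeric illustration: sigma_n^2 > 0 and sigma_n^2 -> 0 as eps -> 0
# (arbitrary example parameters, assuming y_0 = 1).
p, n = 0.3, 10

def sigma2(eps):
    """Variance of y_n: E[y_n^2] - (E[y_n])^2 in closed form."""
    return ((1 - eps)**2 + 4 * eps * p)**n - (1 + 2 * eps * p - eps)**(2 * n)

vals = [sigma2(e) for e in (0.2, 0.02, 0.002)]
print(vals)
assert vals[0] > vals[1] > vals[2] > 0  # positive and shrinking with eps
```

Positivity is expected in general, since $\mathbb{E}(y_n^2)\geq(\mathbb{E}(y_n))^2$ for any random variable.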