Given a random walk with shifted exponential increments, how to calculate the expected sum of distances to the origin?


Suppose $\{X_i, i=1,2,\ldots\}$ are i.i.d. random variables with exponential distribution and mean $\mu$.

Consider a random walk as follows.

$$S_1=0$$

$$S_{i+1}= \begin{cases} S_i+X_i-k, & \text{if $S_i \ge 0$,} \\ X_i-k, & \text{if $S_i < 0$,} \end{cases}$$ where $k$ is a given positive constant.

How can one calculate the value of $\mu$ that minimizes the expected sum of distances from $S_i$ to the point $0$ over the first $R$ steps, i.e., $\arg \min_\mu E[\sum_{i=1}^{R}|S_i|]$?

Thank you.

So far, I only have approximate solutions for the two extreme cases, i.e., when $\mu \gg k$ and when $\mu \ll k$. However, I still have no idea how to calculate the expected sum of distances when the values of $\mu$ and $k$ are comparable, e.g., when $\mu=60$ and $k=100$.
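For concreteness, the objective is easy to estimate by simulation. Below is a Monte Carlo sketch in Python (not part of the question; the helper name `expected_sum_distance` is mine), using the recursion $S_{i+1} = \max\{S_i, 0\} + X_i - k$:

```python
import random

def expected_sum_distance(mu, k, R, n_paths=20000, seed=0):
    """Monte Carlo estimate of E[sum_{i=1}^R |S_i|] for the walk
    S_1 = 0, S_{i+1} = max(S_i, 0) + X_i - k, with X_i ~ Exp(mean mu)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_paths):
        s = 0.0
        for _ in range(R):
            total += abs(s)                          # accumulates |S_1|, ..., |S_R|
            s = max(s, 0.0) + rng.expovariate(1.0 / mu) - k
    return total / n_paths

# Example values from the question: mu = 60, k = 100.
estimate = expected_sum_distance(60.0, 100.0, R=50, n_paths=5000)
```

Scanning `expected_sum_distance` over a grid of $\mu$ values gives a rough numerical answer against which any closed form can be checked.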

The hitting time of a similar random walk has been asked before here.


BEST ANSWER

Let me instead consider the problem in the limit $R\to\infty$, which is essentially the problem of minimizing $\lim_{n\to\infty} \mathbf{E}[|S_n|]$ as a function of $\mu$.


When $\mu \geq k$, $(S_n)$ behaves like a reflected random walk, hence $\lim_{n\to\infty} \mathbf{E}[|S_n|] = \infty$. In light of this, let us consider the case $\mu < k$.

Then for arbitrary initial distribution, $(S_n)$ converges in distribution to some $S_{\infty}$ whose law does not depend on the initial distribution. (This has to do with the fact that $S_n < 0$ for some $n$ with probability one, and when this happens the tail behaves exactly the same as $(S_n)$ started at $0$.)

Moreover, this $S_{\infty}$ solves the distributional identity

$$ S_{\infty} \stackrel{d}= \max\{0, S_{\infty}\} + X - k, \tag{*}$$

where $X$ is an exponential random variable with mean $\mu$ independent of $S_{\infty}$. We claim:

Claim. $S_{\infty} \stackrel{d}= -k + Y$, where $Y$ has an exponential distribution with mean $\beta$ solving the equation

$$ \mu = (1 - e^{-k/\beta})\beta. $$

(Note that the above equation has a unique solution $\beta \in (0, \infty)$ for each $\mu \in (0, k)$.)
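Uniqueness follows because $\beta \mapsto (1 - e^{-k/\beta})\beta$ increases strictly from $0$ to $k$, so the root can be found by bisection. A short Python sketch (not part of the original answer; the helper name `beta_for_mu` is mine):

```python
import math

def beta_for_mu(mu, k, tol=1e-12):
    """Solve mu = (1 - exp(-k/beta)) * beta for beta > 0 by bisection.
    h(beta) = (1 - e^{-k/beta}) * beta increases from 0 to k, so a
    unique root exists for every mu in (0, k)."""
    h = lambda b: (1.0 - math.exp(-k / b)) * b
    lo, hi = 1e-12, 1.0
    while h(hi) < mu:                # grow the upper bracket until h(hi) >= mu
        hi *= 2.0
    while hi - lo > tol * hi:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if h(mid) < mu else (lo, mid)
    return 0.5 * (lo + hi)
```

For the question's example $\mu = 60$, $k = 100$, this gives $\beta \approx 88.8$.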

We defer the proof to the end. Assuming this,

$$ \mathbf{E}[|S_{\infty}|] = k - \beta + 2\beta e^{-k/\beta} $$

and this is minimized when $\beta = \beta_1 k$ where $\beta_1 \approx 0.595824$ is the unique positive solution of the equation $2(1 + \beta_1) = \beta_1 e^{1/\beta_1}$. The corresponding value of $\mu$ is therefore

$$ \mu = k \beta_1 (1 - e^{-1/\beta_1}) \approx (0.484594\ldots) k $$
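As a numerical cross-check (Python, not part of the original answer), the constant can be recovered by solving $2(1+\beta_1) = \beta_1 e^{1/\beta_1}$ with bisection:

```python
import math

# phi(b) = b*e^(1/b) - 2*(1+b) is strictly decreasing on (0, 1)
# (phi'(b) = e^(1/b)*(1 - 1/b) - 2 < 0 there) and changes sign
# between 0.1 and 1.0, so bisection pins down beta_1.
phi = lambda b: b * math.exp(1.0 / b) - 2.0 * (1.0 + b)

lo, hi = 0.1, 1.0                    # phi(0.1) > 0 > phi(1.0)
for _ in range(100):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if phi(mid) > 0 else (lo, mid)
beta1 = 0.5 * (lo + hi)

k = 100.0                            # example scale; mu_opt is linear in k
mu_opt = k * beta1 * (1.0 - math.exp(-1.0 / beta1))
```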


Proof of Claim. It suffices to show that the equation $\text{(*)}$ is satisfied by the proposed distribution. To this end, we compute the Laplace transform of both sides of $\text{(*)}$. A simple computation tells that

$$ \mathbf{E}[e^{-t S_{\infty}}] = \frac{e^{kt}}{1 + \beta t}, $$

whereas

$$ \mathbf{E}[e^{-t(\max\{0, S_{\infty}\} + X - k)}] = \frac{e^{kt}}{1 + \mu t}\left( 1 - e^{-k/\beta} + \frac{e^{-k/\beta}}{1 + \beta t} \right). $$

It is now easy to check that these two Laplace transforms coincide when $\beta$ is related to $\mu$ as in the claim, completing the proof. $\square$
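As a floating-point aside (not in the original), the coincidence of the two transforms can be checked directly; `transform_gap` (name mine) evaluates both sides on a small grid of $t$ values, with $\mu$ tied to $\beta$ as in the claim:

```python
import math

def transform_gap(beta, k, ts=(0.001, 0.01, 0.1, 1.0)):
    """Max relative gap between the Laplace transforms of the two sides
    of (*), with mu tied to beta via mu = (1 - exp(-k/beta)) * beta."""
    mu = (1.0 - math.exp(-k / beta)) * beta
    q = math.exp(-k / beta)          # P(S_inf > 0) = P(Y > k)
    gap = 0.0
    for t in ts:
        lhs = math.exp(k * t) / (1.0 + beta * t)
        rhs = math.exp(k * t) / (1.0 + mu * t) * ((1.0 - q) + q / (1.0 + beta * t))
        gap = max(gap, abs(lhs / rhs - 1.0))
    return gap
```

The gap is at rounding-error level for any $\beta > 0$, as the algebra predicts.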


We have solved this problem in the limit $R \rightarrow \infty$, i.e., $\arg \min_\mu \lim_{R\rightarrow\infty}E[\sum_{i=1}^{R}|S_i|]$. I post it here in case someone needs it in the future.

Since $S_i$ converges in distribution (for $\mu < k$, see below), $E[|S_i|]$ has a limit, and the Cesàro averages satisfy $\lim_{R\rightarrow\infty}\frac{1}{R}\sum_{i=1}^{R}E[|S_i|]=\lim_{i\rightarrow\infty} E[|S_i|]$. Thus the sum grows like $R \cdot \lim_{i\rightarrow\infty} E[|S_i|]$, and we get the following equivalence.

$$\arg \min_\mu \lim_{R\rightarrow\infty}\frac{1}{R}\,E\Big[\sum_{i=1}^{R}|S_i|\Big]=\arg \min_\mu \lim_{i\rightarrow\infty}E[|S_i|]$$

Thus, we try to find the value of $\mu$ that can minimize $\lim_{i\rightarrow\infty}E[|S_i|]$.

  1. When $\mu \ge k$, it is clear that $\lim_{i\rightarrow\infty}E[|S_i|] = \infty$. Thus $\mu \ge k$ cannot minimize the expected sum of distances, and in the rest of this answer we only consider $\mu < k$.

  2. When $\mu < k$, in order to calculate $\lim_{i\rightarrow\infty}E[|S_i|]$, we try to calculate the limiting distribution of $S_i$, which is also the stationary distribution of the chain.

Denote the probability density function of $S_i$ by $f_i(x)$ and the cumulative distribution function of $S_i$ by $F_i(x)$. In what follows, we look for a stationary density $f(x)$ satisfying $f(x)=f_N(x)=f_{N+1}(x)$.

Denote the joint probability density function of $(S_N,X_N)$ by $p(t,s)$; note that these densities are supported on $t \ge -k$. Since $S_N$ and $X_N$ are independent, $p(t,s)=f_N(t) \cdot \frac{1}{\mu}e^{-\frac{s}{\mu}}$. Then,

\begin{align} & F_{N+1} (x) = p(S_{N+1} \le x) \\ =& p(S_N \le 0,\, X_N - k \le x) + p(S_N > 0,\, S_N + X_N - k \le x) \\ =& p(S_N \le 0) \cdot p(X_N \le x+k) + p(0 < S_N \le x+k,\, 0 \le X_N \le x + k - S_N) \\ =& F_N(0) \cdot \int_0^{x+k} \frac{1}{\mu} e^{-\frac{t}{\mu}}\, dt + \int_0^{x+k} \left(\int_0^{x+k-t} p(t,s)\, ds\right) dt \\ =& F_N(0) \cdot \left(1 - e^{-\frac{1}{\mu} (x+k)}\right) + \int_0^{x+k} f_N(t)\left(\int_0^{x+k-t} \frac{1}{\mu}e^{-\frac{s}{\mu}}\, ds\right) dt\\ =& F_N(0) \cdot \left(1 - e^{-\frac{1}{\mu}(x+k)}\right) + \int_0^{x+k}f_N(t)\left(1-e^{-\frac{1}{\mu}(x+k-t)}\right) dt\\ \end{align}

Thus, we have

$$F_{N+1}(x)=F_N(0) \cdot \left(1 - e^{-\frac{1}{\mu}(x+k)}\right) + \int_0^{x+k}f_N(t)\left(1-e^{-\frac{1}{\mu}(x+k-t)}\right) dt$$

Differentiating both sides with respect to $x$,

$$f_{N+1}(x)=\frac{1}{\mu}e^{-\frac{1}{\mu}(x+k)}\left[F_N(0) + \int_0^{x+k}f_N(t)e^{\frac{t}{\mu}}\, dt\right]$$

$$\mu \cdot e^{\frac{1}{\mu}(x+k)} f_{N+1}(x)= F_N(0) + \int_0^{x+k}f_N(t)e^{\frac{t}{\mu}}\, dt $$

Differentiating with respect to $x$ again,

$$e^{\frac{1}{\mu}(x+k)}f_{N+1}(x) + \mu \cdot e^{\frac{1}{\mu}(x+k)} f'_{N+1} (x) = f_N(x+k) \cdot e^{\frac{1}{\mu}(x+k)}$$

$$\mu f'_{N+1}(x) + f_{N+1}(x) = f_N(x+k)$$

Since $f(x)=f_N(x)=f_{N+1}(x)$,

$$ f(x+k) - f(x) = \mu f'(x)$$

From this equation, trying the ansatz $f(x)=\lambda e^{-\lambda(x+k)}$ for $x \ge -k$ and substituting gives $e^{-k\lambda}+\mu \lambda=1$, which has a unique root $\lambda > 0$ whenever $\mu < k$.
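A quick numerical check in Python (not in the original answer) that this ansatz solves the stationary equation; $k=100$, $\mu=60$ are the example values from the question, and the nontrivial root of $e^{-k\lambda}+\mu\lambda=1$ is found by bisection:

```python
import math

k, mu = 100.0, 60.0                  # example values from the question

# Nontrivial root lam > 0 of exp(-k*lam) + mu*lam = 1, by bisection:
# g is negative just above 0 (since mu < k) and positive at lam = 1/mu.
g = lambda lam: math.exp(-k * lam) + mu * lam - 1.0
lo, hi = 1e-9, 1.0 / mu
for _ in range(200):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if g(mid) < 0 else (lo, mid)
lam = 0.5 * (lo + hi)

# Residual of f(x+k) - f(x) = mu * f'(x) for f(x) = lam*exp(-lam*(x+k)):
f = lambda x: lam * math.exp(-lam * (x + k))
fp = lambda x: -lam * f(x)
max_resid = max(abs(f(x + k) - f(x) - mu * fp(x)) for x in (-50.0, 0.0, 37.5, 200.0))
```

The residual vanishes up to rounding, and $1/\lambda$ agrees with the $\beta$ of the accepted answer (about $88.8$ for these values).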

Thus,

\begin{align} \lim_{i \rightarrow \infty} E[|S_i|] &= \int_{-k}^0 (-x) \cdot \lambda e^{-\lambda(x+k)}\, dx + \int_0^{+\infty} x \cdot \lambda e^{-\lambda (x+k)}\, dx \\ &=\left(k-\frac{1}{\lambda}+\frac{1}{\lambda}e^{-k\lambda}\right) + \frac{1}{\lambda} e^{-k\lambda} \\ &= \frac{2e^{-k\lambda}-1}{\lambda} + k \end{align}
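This closed form can be verified against direct numerical integration of $|x| \cdot \lambda e^{-\lambda(x+k)}$ over $[-k,\infty)$ (Python sketch, not part of the original; function names are mine):

```python
import math

def e_abs_closed(lam, k):
    """Closed form: E|S_inf| = k - 1/lam + 2*exp(-k*lam)/lam."""
    return k - 1.0 / lam + 2.0 * math.exp(-k * lam) / lam

def e_abs_numeric(lam, k, n=200000):
    """Trapezoidal rule for the integral of |x|*lam*exp(-lam*(x+k)) on [-k, b],
    with b chosen so the neglected tail is of order exp(-40)."""
    b = -k + 40.0 / lam
    h = (b + k) / n
    total = 0.0
    for i in range(n + 1):
        x = -k + i * h
        w = 0.5 if i in (0, n) else 1.0
        total += w * abs(x) * lam * math.exp(-lam * (x + k))
    return total * h
```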

For any $k$, we aim to minimize $g(\lambda) = \frac{2e^{-k\lambda} - 1}{\lambda}$. Combining $g'(\lambda) = \frac{1 - 2(1 + k \lambda)e^{- k \lambda}}{\lambda^2}=0$ and $e^{-k\lambda} + \mu \lambda=1$, we can get $\mu \approx (0.4845\ldots) \cdot k$.