Suppose $\{X_i, i=1,2,\ldots\}$ are i.i.d. random variables with exponential distribution and mean $\mu$.
Consider a random walk as follows.
$$S_1=0$$
$$S_{i+1}= \begin{cases} S_i+X_i-k, & \text{if $S_i \ge 0$,} \\ X_i-k, & \text{if $S_i < 0$,} \end{cases}$$
where $k$ is a given constant greater than $0$.
How can I calculate the value of $\mu$ that minimizes the expected total distance of $S_i$ from $0$ over the first $R$ steps, i.e., $\arg \min_{\mu} E\left[\sum_{i=1}^{R}|S_i|\right]$?
Thank you.
So far, I only have approximate solutions for two extreme cases, namely $\mu \gg k$ and $\mu \ll k$. However, I still have no idea how to calculate the expected sum when the values of $\mu$ and $k$ are comparable, e.g., when $\mu=60$ and $k=100$.
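For concreteness, here is a rough Monte Carlo sketch in Python (the function name and default parameters are just illustrative) that estimates the expectation numerically for given $\mu$, $k$, $R$:

```python
import random

def expected_sum_distance(mu, k, R, trials=2000, seed=0):
    """Monte Carlo estimate of E[sum_{i=1}^R |S_i|] for the walk above."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        s = 0.0                            # S_1 = 0
        acc = abs(s)                       # |S_1|
        for _ in range(R - 1):
            x = rng.expovariate(1.0 / mu)  # X_i, exponential with mean mu
            s = (s if s >= 0 else 0.0) + x - k
            acc += abs(s)
        total += acc
    return total / trials
```

For example, `expected_sum_distance(60, 100, 1000)` estimates the target quantity for the case above, but this of course gives no closed form.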
The hitting time of a similar random walk has been asked before here.
Let me instead consider the problem in the limit $R\to\infty$, which is essentially the problem of minimizing $\lim_{n\to\infty} \mathbf{E}[|S_n|]$ as a function of $\mu$.
When $\mu \geq k$, $(S_n)$ behaves like a reflected random walk, hence $\lim_{n\to\infty} \mathbf{E}[|S_n|] = \infty$. In light of this, let us consider the case $\mu < k$.
Then for arbitrary initial distribution, $(S_n)$ converges in distribution to some $S_{\infty}$ whose law does not depend on the initial distribution. (This has to do with the fact that $S_n < 0$ for some $n$ with probability one, and when this happens the tail behaves exactly the same as $(S_n)$ started at $0$.)
Moreover, this $S_{\infty}$ solves the distributional identity
$$ S_{\infty} \stackrel{d}= \max\{0, S_{\infty}\} + X - k, \tag{*}$$
where $X$ is an exponential random variable with mean $\mu$, independent of $S_{\infty}$. We claim:

> **Claim.** Let $\beta$ be the unique positive solution of $\mu = \beta(1 - e^{-k/\beta})$. (Such a $\beta$ exists and is unique because $\beta \mapsto \beta(1 - e^{-k/\beta})$ increases from $0$ to $k$ and $\mu < k$.) Then $S_{\infty} + k$ is exponentially distributed with mean $\beta$.

We defer the proof to the end. Assuming this,
$$ \mathbf{E}[|S_{\infty}|] = k - \beta + 2\beta e^{-k/\beta} $$
and this is minimized when $\beta = \beta_1 k$, where $\beta_1 \approx 0.595824$ is the unique positive solution of the equation $2(1 + \beta_1) = \beta_1 e^{1/\beta_1}$. The corresponding value of $\mu$ is therefore
$$ \mu = k \beta_1 (1 - e^{-1/\beta_1}) \approx (0.484594\ldots) k $$
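As a numerical check, the first-order condition for minimizing $k - \beta + 2\beta e^{-k/\beta}$ in $\beta = \beta_1 k$ is $2(1+\beta_1) = \beta_1 e^{1/\beta_1}$, and both constants can be recovered by a simple bisection in Python (a sketch; the bracket $[0.5, 0.7]$ was found by inspecting the sign change):

```python
import math

def foc(b):
    # first-order condition for minimizing k - beta + 2*beta*exp(-k/beta)
    # in beta = b*k: it reads 2*(1 + b) - b*exp(1/b) = 0
    return 2.0 * (1.0 + b) - b * math.exp(1.0 / b)

# bisection on [0.5, 0.7], where foc changes sign
lo, hi = 0.5, 0.7
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if foc(lo) * foc(mid) <= 0.0:
        hi = mid
    else:
        lo = mid

beta1 = 0.5 * (lo + hi)                              # ~ 0.595824
mu_over_k = beta1 * (1.0 - math.exp(-1.0 / beta1))   # ~ 0.484594
```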
Proof of Claim. It suffices to show that the equation $\text{(*)}$ is satisfied by the proposed distribution. To this end, we compute the Laplace transforms of both sides of $\text{(*)}$. A simple computation shows that
$$ \mathbf{E}[e^{-t S_{\infty}}] = \frac{e^{kt}}{1 + \beta t}, $$
whereas
$$ \mathbf{E}[e^{-t(\max\{0, S_{\infty}\} + X - k)}] = \frac{e^{kt}}{1 + \mu t}\left( 1 - e^{-k/\beta} + \frac{e^{-k/\beta}}{1 + \beta t} \right). $$
It is now easy to check that these two Laplace transforms coincide when $\beta$ is related to $\mu$ as in the claim, completing the proof. $\square$
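For what it's worth, the claim and the resulting formula for $\mathbf{E}[|S_{\infty}|]$ can also be sanity-checked by simulation. The Python sketch below (the helper `solve_beta` and the trajectory length are mine) takes the asker's example $\mu = 60$, $k = 100$, solves $\mu = \beta(1-e^{-k/\beta})$ for $\beta$ by bisection, and compares the long-run (ergodic) average of $|S_n|$ with $k - \beta + 2\beta e^{-k/\beta}$:

```python
import math
import random

def solve_beta(mu, k):
    """Solve mu = beta*(1 - exp(-k/beta)) for beta by bisection (needs mu < k)."""
    g = lambda b: -b * math.expm1(-k / b)  # = b*(1 - exp(-k/b)), increasing in b
    lo, hi = mu, 1e6 * k                   # g(lo) < mu < g(hi) since g(b) < min(b, k)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if g(mid) < mu:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

mu, k = 60.0, 100.0
beta = solve_beta(mu, k)
exact = k - beta + 2.0 * beta * math.exp(-k / beta)  # claimed E|S_infinity|

# long-run average of |S_n| along one long trajectory of the walk
rng = random.Random(1)
s, total, n = 0.0, 0.0, 200_000
for _ in range(n):
    s = max(s, 0.0) + rng.expovariate(1.0 / mu) - k
    total += abs(s)
print(beta, exact, total / n)  # the last two numbers should roughly agree
```

Since the walk regenerates every time it goes negative (which happens often for $\mu < k$), the time average converges quickly to $\mathbf{E}[|S_{\infty}|]$.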