Consider the problem: Let $X_t = \mu t + \sigma B_t$ with $\mu, \sigma > 0$, where $B_t$ is a standard Brownian motion. Fix $\beta > 0$ and let $\tau = \inf\{t : X_t = \beta\}$. Calculate $$\mathbb{E}\int_0^\tau(\beta - X_t)\,dt.$$ Here's my thought: since $\beta - X_t$ is continuous, $$\mathbb{E}\int_0^\tau(\beta - X_t)\,dt = \int_0^\tau\mathbb{E}(\beta - X_t)\,dt = \int_0^\tau(\beta - \mu t - \sigma\mathbb{E}(B_t))\,dt = \beta \tau - \frac{1}{2}\mu \tau^2.$$ The problem is, I did not use the properties of the stopping time $\tau$, which makes me feel weird. Can someone please let me know where I made a mistake?
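Before reading the answer, one can already see by simulation that the interchange cannot be right: $\tau$ is random, so $\int_0^\tau \mathbb{E}(\cdots)\,dt$ is itself a random variable, not a number. Below is a hedged Monte Carlo sketch (illustrative parameters $\mu=\sigma=\beta=1$, not taken from the question; Euler discretization, so the hitting times carry a small positive bias) comparing $\mathbb{E}\int_0^\tau(\beta-X_t)\,dt$ with the mean of the naive expression $\beta\tau-\frac12\mu\tau^2$ — the two estimates come out very different:

```python
import numpy as np

# Monte Carlo sketch with illustrative parameters (mu = sigma = beta = 1,
# not taken from the post): estimate E[int_0^tau (beta - X_t) dt] and the
# mean of the naive expression beta*tau - mu*tau^2/2 from the interchange.
# Euler scheme, so hitting times carry a small positive discretization bias.
mu, sigma, beta = 1.0, 1.0, 1.0
dt, n_paths = 1e-3, 20000
rng = np.random.default_rng(0)

X = np.zeros(n_paths)
tau = np.zeros(n_paths)
integral = np.zeros(n_paths)           # running int_0^t (beta - X_s) ds
alive = np.ones(n_paths, dtype=bool)   # paths that have not hit beta yet

step = 0
while alive.any() and step < 100_000:
    step += 1
    idx = np.flatnonzero(alive)
    integral[idx] += (beta - X[idx]) * dt
    X[idx] += mu * dt + sigma * np.sqrt(dt) * rng.standard_normal(idx.size)
    hit = idx[X[idx] >= beta]          # first passage of the barrier beta
    tau[hit] = step * dt
    alive[hit] = False

naive = beta * tau - 0.5 * mu * tau**2
print(integral.mean(), naive.mean())   # the two estimates differ markedly
```

The gap between the two means is the term $\sigma \mathbb{E}\int_0^\tau B_t\,dt$ that the invalid interchange silently drops.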
Expectation of an integral of Brownian motion at a stopping time
391 views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
There is 1 best solution below.
Way 1: begin with
$$\mathbb{E} \left [ \int_0^\tau (\beta-X_t)\, dt \right ] = \beta \mathbb{E}[\tau]-\frac{\mu}{2} \mathbb{E}[\tau^2]-\sigma \mathbb{E} \left [ \int_0^\tau B_t \, dt \right ]$$
assuming all these expectations are finite.
We want to simplify the last term. To do that, consider the differential of $Y_t=B_t^3/3$. It is
$$dY_t=B_t^2 dB_t + B_t dt$$
by the Ito formula. Thus
$$\mathbb{E} \left [ \int_0^\tau B_t dt \right ] = \mathbb{E}[Y_\tau]-\mathbb{E} \left [ \int_0^\tau B_t^2 dB_t \right ]$$
The LHS is the quantity we want, so we need to evaluate the two terms on the RHS. For the first one, the definition of $\tau$ gives $X_\tau = \beta$, hence $B_\tau = \frac{\beta - \mu\tau}{\sigma}$ and
$$Y_\tau=\frac{1}{3} \left ( \frac{\beta-\mu \tau}{\sigma} \right )^3.$$
The second term can be shown to be zero, provided that we can justify applying optional stopping to the martingale $\int_0^t B_s^2 \, dB_s$. So plugging everything in:
$$\mathbb{E} \left [ \int_0^\tau (\beta-X_t)\, dt \right ] = \beta \mathbb{E}[\tau] - \frac{\mu}{2} \mathbb{E}[\tau^2] - \frac{\sigma}{3} \mathbb{E} \left [ \left ( \frac{\beta-\mu \tau}{\sigma} \right )^3 \right ].$$
Thus you need the first three moments of $\tau$ and nothing else in order to finish the problem.
The MGF of $\tau$ can be obtained from another martingale technique (optional stopping applied to the exponential martingale $e^{\lambda B_t - \lambda^2 t/2}$), which supplies all the moments and lets you finish the problem; in fact $\tau$ has an inverse Gaussian distribution. However, $\mathbb{E}[\tau]$ alone is more easily obtained, as we will see below.
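Carrying Way 1 to completion numerically: it is a standard fact (not derived in this answer) that $\tau$ is inverse Gaussian with mean $\beta/\mu$ and shape $\lambda = \beta^2/\sigma^2$, which supplies the three raw moments. The sketch below, with illustrative parameters, plugs them into the displayed formula above; assuming the optional stopping step is justified, the result agrees with $\frac{\beta^2}{2\mu}+\frac{\beta\sigma^2}{2\mu^2}$:

```python
# Way 1 carried to completion. Assumes the standard fact (not derived in
# the answer) that tau is inverse Gaussian with mean m = beta/mu and
# shape lam = beta^2/sigma^2; the parameters below are illustrative.
mu, sigma, beta = 0.7, 1.3, 2.0

m, lam = beta / mu, beta**2 / sigma**2
E1 = m                                             # E[tau]
E2 = m**2 * (1 + m / lam)                          # E[tau^2]
E3 = m**3 * (1 + 3 * m / lam + 3 * (m / lam)**2)   # E[tau^3]

# E[((beta - mu*tau)/sigma)^3], expanded in the raw moments above
E_cubed = (beta**3 - 3 * beta**2 * mu * E1
           + 3 * beta * mu**2 * E2 - mu**3 * E3) / sigma**3

way1 = beta * E1 - 0.5 * mu * E2 - (sigma / 3) * E_cubed
closed_form = beta**2 / (2 * mu) + beta * sigma**2 / (2 * mu**2)
print(way1, closed_form)   # the two agree
```

The agreement is exact (up to floating point), and one can check algebraically that it holds for all $\mu, \sigma, \beta > 0$.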
Way 2: begin with
Way 2: it is tempting to begin with
$$\mathbb{E} \left [ \int_0^\tau (\beta-X_s)\, ds \right ] = \int_0^\infty \mathbb{E} \left [ \int_0^t (\beta-X_s)\, ds \mid \tau=t \right ] d\mathbb{P}(\tau=t).$$
If we condition only on the endpoint value, $B$ goes from $0$ to $\frac{\beta-\mu t}{\sigma}$ in time $t$, and its conditional expected value at time $s$ is $\frac{s}{t} \frac{\beta-\mu t}{\sigma}$ by the properties of the Brownian bridge (the drift drops out of the bridge). The conditional expected value of $X_s$ would then be $\mu s + \frac{s}{t}(\beta-\mu t)=\frac{s}{t} \beta$, giving
$$\mathbb{E} \left [ \int_0^\tau (\beta-X_s)\, ds \right ] = \int_0^\infty \int_0^t (1-s/t) \beta \, ds \, d \mathbb{P}(\tau=t) = \int_0^\infty \frac{\beta}{2} t \, d \mathbb{P}(\tau=t) = \frac{\beta}{2} \mathbb{E}[\tau].$$
However, this step is flawed: conditioning on $\{\tau = t\}$ is strictly stronger than conditioning on $B_t = \frac{\beta-\mu t}{\sigma}$, because the path must also stay below the barrier on $[0,t)$. The true conditional mean of $B_s$ lies below the bridge line, so $\beta - X_s$ is larger on average and $\frac{\beta}{2}\mathbb{E}[\tau]$ is only a lower bound; indeed Way 1 gives a strictly larger value. Still, $\mathbb{E}[\tau]$ is worth computing.
The first moment can be obtained directly: $\mathbb{E}[\tau]$ is $\lim_{a \to -\infty} u(a,0)$, where $u(a,x)$ is the expected exit time of the interval $(a,\beta)$ starting from $x$, which solves
$$\mu \partial_x u(a,x) + \frac{\sigma^2}{2} \partial^2_x u(a,x) = -1 \\ u(a,a)=0 \\ u(a,\beta)=0.$$
The intuition for the derivation of this equation is to write the equation $u(a,x)=h+\int_{\mathbb{R}} u(a,y) d\mathbb{P}(X_h=y \mid X_0=x)$ and then get an estimate of the integral for small $h$, then rearrange and send $h \to 0$.
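Spelling that expansion out: write $X_h = x + \mu h + \sigma \sqrt{h}\, Z$ with $Z$ standard normal, and Taylor expand $u$ in its second argument. Since $\mathbb{E}[\sigma\sqrt{h}\, Z] = 0$ and $\mathbb{E}[(\mu h + \sigma\sqrt{h}\, Z)^2] = \sigma^2 h + o(h)$,
$$u(a,x) = h + \mathbb{E}[u(a, X_h)] = h + u(a,x) + \mu h \, \partial_x u(a,x) + \frac{\sigma^2 h}{2} \partial^2_x u(a,x) + o(h).$$
Cancelling $u(a,x)$ on both sides, dividing by $h$, and sending $h \to 0$ gives $\mu \partial_x u + \frac{\sigma^2}{2} \partial^2_x u = -1$.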
Now for $\mu \neq 0$ we have
$$u(a,x)=c_1 + c_2 e^{-kx} - x/\mu$$
where $k=2\mu/\sigma^2$. So
$$c_1 + c_2 e^{-ka} = a/\mu \\ c_1 + c_2 e^{-k\beta} = \beta/\mu \\ u(a,0)=c_1+c_2.$$
If $\mu>0$ then we recover $\lim_{a \to -\infty} u(a,0)=\frac{\beta}{\mu}$. If $\mu<0$ then we recover $\lim_{a \to -\infty} u(a,0)=+\infty$.
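This limit is easy to check numerically: the sketch below (illustrative parameters, not from the post) solves the two boundary equations for $c_1, c_2$ at increasingly negative $a$ and watches $u(a,0) = c_1 + c_2$ approach $\beta/\mu$:

```python
import math

# Check numerically that u(a, 0) -> beta/mu as a -> -infinity when mu > 0.
# Illustrative parameters, not taken from the post.
mu, sigma, beta = 0.7, 1.3, 2.0
k = 2 * mu / sigma**2

def u_at_zero(a):
    """Solve c1 + c2*e^{-k a} = a/mu and c1 + c2*e^{-k beta} = beta/mu."""
    ea, eb = math.exp(-k * a), math.exp(-k * beta)
    c2 = (a / mu - beta / mu) / (ea - eb)
    c1 = beta / mu - c2 * eb
    return c1 + c2

for a in (-2.0, -10.0, -40.0):
    print(a, u_at_zero(a))
print("beta/mu =", beta / mu)
```

Because $e^{-ka}$ grows exponentially as $a \to -\infty$, $c_2 \to 0$ very fast and the convergence to $\beta/\mu$ is visible already for moderately negative $a$.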
In the $\mu=0$ case the situation is simpler:
$$u(a,x)=-\frac{1}{\sigma^2} (x-a)(x-\beta)$$
and so $\mathbb{E}[\tau]$ is again $+\infty$.
Plugging $\mathbb{E}[\tau] = \frac{\beta}{\mu}$ into Way 2's lower bound gives $\frac{\beta^2}{2\mu}$. The exact value comes from Way 1: using the inverse Gaussian law of $\tau$ (mean $\beta/\mu$, variance $\beta\sigma^2/\mu^3$, third central moment $3\beta\sigma^4/\mu^5$), the formula there evaluates to
$$\mathbb{E} \left [ \int_0^\tau (\beta-X_t)\, dt \right ] = \frac{\beta^2}{2\mu} + \frac{\beta\sigma^2}{2\mu^2}$$
when $\mu>0$, and the expectation is $+\infty$ otherwise. Note that $\sigma$ does enter the answer, strictly exceeding the lower bound whenever $\sigma > 0$.