I need to prove the following:
$$\text{Cov}[\left(W(t_{i+1})-W(t_i)\right),\left(W(t_{j+1})-W(t_j)\right)|Z]=
\begin{cases}
\left(t_{i+1}-t_i\right)-\frac{\displaystyle\left(t_{i+1}-t_i\right)^2}{\displaystyle T} &\text{if } i=j\\
-\frac{\displaystyle\left(t_{i+1}-t_i\right)\left(t_{j+1}-t_j\right)}{\displaystyle T} &\text{otherwise}
\end{cases}\\
$$ where $ W(t)$ is a standard Brownian motion and $Z=\frac{W_T}{\sqrt{T}}$ is an $\mathcal{F}_T$-measurable standard Gaussian random variable.
My guess is that this can be done using some results about the Brownian bridge.
I have also searched this site and found some hints, but I cannot quite get to the final result.
I would appreciate any help.
Covariance of Brownian bridge increments
1.2k views · Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
There is 1 solution below.
Let $\widetilde{W}(t)=W(t)-\frac{t}{T}W(T)$. Since $\{\widetilde{W}(t),\,0\le t\le T\}$ is a Gaussian process uncorrelated with $Z=\frac{W(T)}{\sqrt{T}}$, it is also independent of $Z$, and \begin{gather} \mathsf{E}[\widetilde{W}(t)|Z]=0,\quad \mathsf{E}[\Delta\widetilde{W}(t_i)|Z]\stackrel{\text{def}}=\mathsf{E}[\widetilde{W}(t_{i+1})-\widetilde{W}(t_i)|Z]=0, \quad \mathsf{E}[\Delta W(t_i)|Z]=\frac{\Delta t_i}{\sqrt{T}}Z.\\ \mathsf{E}[\widetilde{W}(s)\widetilde{W}(t)]=\mathsf{E}[W(s)W(t)]-\frac{t}{T}\mathsf{E}[W(s)W(T)]-\frac{s}{T}\mathsf{E}[W(t)W(T)]+\frac{st}{T^2}\mathsf{E}[W^2(T)]=s\wedge t-\frac{st}T.\\ \begin{aligned} \mathsf{cov}[\Delta W(t_i),\Delta W(t_j)|Z] &=\mathsf{cov}[\Delta\widetilde{W}(t_i),\Delta\widetilde{W}(t_j)|Z] =\mathsf{E}[\Delta\widetilde{W}(t_i)\Delta\widetilde{W}(t_j)]\\ &=\begin{cases} \Delta t_i-\dfrac{(\Delta t_i)^2}{T}, & i=j,\\ -\dfrac{\Delta t_i\Delta t_j}{T}, &i\ne j. \end{cases} \end{aligned} \end{gather}
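Since $\Delta\widetilde{W}$ is independent of $Z$, the conditional covariance matrix equals the unconditional covariance matrix of the bridge increments, which makes the formula easy to sanity-check by Monte Carlo. A minimal NumPy sketch (the horizon $T$, partition, and sample size below are arbitrary illustrative choices, not from the original post):

```python
import numpy as np

rng = np.random.default_rng(0)
T = 2.0
t = np.array([0.0, 0.5, 1.2, 2.0])   # partition 0 = t_0 < t_1 < t_2 < t_3 = T
n_paths = 200_000

# simulate the increments of W on the partition, and W(T) as their sum
dt = np.diff(t)
dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, len(dt)))
W_T = dW.sum(axis=1)

# bridge increments: dW~_i = dW_i - (dt_i / T) * W(T)
dWb = dW - np.outer(W_T, dt / T)

emp = np.cov(dWb, rowvar=False)              # empirical covariance matrix
theo = np.diag(dt) - np.outer(dt, dt) / T    # dt_i*delta_ij - dt_i*dt_j/T
print(np.max(np.abs(emp - theo)))            # should be small
```

The printed maximum deviation between the empirical and theoretical matrices shrinks like $1/\sqrt{n_{\text{paths}}}$.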
Some details for the expressions above: 1. Since $\mathsf{E}[W(T)]=0$ and $$\mathsf{E}[\widetilde{W}(t)]=\mathsf{E}\Bigl[W(t)-\frac{t}TW(T)\Bigr]=\mathsf{E}[W(t)]-\frac{t}T\mathsf{E}[W(T)]=0,$$ we get $$\mathsf{cov}[\widetilde{W}(t),Z]=\mathsf{E}[\widetilde{W}(t)Z]-\mathsf{E}[\widetilde{W}(t)]\mathsf{E}[Z]=\mathsf{E}[W(t)Z]-\frac{t}{\sqrt{T}}\mathsf{E}[Z^2] =\frac{t}{\sqrt{T}}-\frac{t}{\sqrt{T}}=0.$$ Therefore, $\{\widetilde{W}(t),\,0\le t\le T\}$ and $Z$ are uncorrelated. Since $\{\widetilde{W}(t),\,0\le t\le T;\,Z=\frac{W(T)}{\sqrt{T}}\}$ is jointly Gaussian, $\{\widetilde{W}(t),\,0\le t\le T\}$ and $Z$ are also independent.
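The uncorrelatedness in step 1 can likewise be checked by simulation. A small sketch, with $T$, $t$, and the sample size chosen arbitrarily for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
T, t, n = 2.0, 0.7, 500_000

# W(t), then W(T) = W(t) + independent increment on [t, T]
W_t = rng.normal(0.0, np.sqrt(t), n)
W_T = W_t + rng.normal(0.0, np.sqrt(T - t), n)

Wb_t = W_t - (t / T) * W_T        # bridge value W~(t)
Z = W_T / np.sqrt(T)              # standard Gaussian built from W(T)
print(np.cov(Wb_t, Z)[0, 1])      # close to 0
```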
4. Since $\Delta W(t_i)-\mathsf{E}[\Delta W(t_i)|Z]=\Delta W(t_i)-\frac{\Delta t_i}{\sqrt{T}}Z=\Delta\widetilde{W}(t_i)$, we have \begin{align} \mathsf{cov}[\Delta W(t_i),\Delta W(t_j)|Z] &=\mathsf{E}[(\Delta W(t_i)-\mathsf{E}[\Delta W(t_i)|Z])(\Delta W(t_j)-\mathsf{E}[\Delta W(t_j)|Z])|Z]\\ &=\mathsf{E}\Bigl[\Bigl(\Delta W(t_i)-\frac{\Delta t_i}{\sqrt{T}}Z\Bigr)\Bigl(\Delta W(t_j)-\frac{\Delta t_j}{\sqrt{T}}Z\Bigr)\Bigm|Z\Bigr] \\ &=\mathsf{E}[\Delta\widetilde{W}(t_i)\Delta\widetilde{W}(t_j)|Z]=\mathsf{E}[\Delta\widetilde{W}(t_i)\Delta\widetilde{W}(t_j)], \end{align} where the last equality uses the independence of $\{\widetilde{W}(t)\}$ and $Z$. Similarly, since $\mathsf{E}[\Delta\widetilde{W}(t_i)|Z]=0$, $$ \mathsf{cov}[\Delta\widetilde W(t_i),\Delta\widetilde W(t_j)|Z]=\mathsf{E}[\Delta\widetilde W(t_i) \Delta\widetilde W(t_j)|Z]=\mathsf{E}[\Delta\widetilde W(t_i) \Delta\widetilde W(t_j)]. $$ 5. For $i<j$ (i.e. $i+1\le j$), \begin{align} \mathsf{E}[&\Delta\widetilde{W}(t_i)\Delta\widetilde{W}(t_j)] =\mathsf{E}[(\widetilde{W}(t_{i+1})-\widetilde{W}(t_i))(\widetilde{W}(t_{j+1})-\widetilde{W}(t_j))]\\ &=(t_{i+1}\wedge t_{j+1}-t_i\wedge t_{j+1})-\frac{(t_{i+1}-t_i)t_{j+1}}{T} -(t_{i+1}\wedge t_j -t_i\wedge t_j)+\frac{(t_{i+1}-t_i)t_{j}}{T}\\ &=(t_{i+1}-t_i)-(t_{i+1}-t_i)-\frac{(t_{i+1}-t_i)(t_{j+1}-t_j)}{T}\\ &=-\frac{\Delta t_i\Delta t_j}T. \end{align}
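The case computation in step 5 can also be verified symbolically, e.g. with SymPy: expand $\mathsf{E}[\Delta\widetilde W(t_i)\Delta\widetilde W(t_j)]$ by bilinearity using the bridge covariance $C(s,u)=s\wedge u-su/T$, then impose the ordering $t_i<t_{i+1}\le t_j<t_{j+1}$ (a sketch; the symbol names are chosen for this check only):

```python
import sympy as sp

t_i, t_i1, t_j, t_j1, T = sp.symbols('t_i t_i1 t_j t_j1 T', positive=True)

# covariance function of the Brownian bridge: C(s, u) = min(s, u) - s*u/T
def C(s, u):
    return sp.Min(s, u) - s * u / T

# E[dW~_i dW~_j] expanded by bilinearity
expr = C(t_i1, t_j1) - C(t_i, t_j1) - C(t_i1, t_j) + C(t_i, t_j)

# resolve the minima under the ordering t_i < t_i1 <= t_j < t_j1
ordered = expr.subs({sp.Min(t_i1, t_j1): t_i1, sp.Min(t_i, t_j1): t_i,
                     sp.Min(t_i1, t_j): t_i1, sp.Min(t_i, t_j): t_i})

# difference from the claimed value -(dt_i * dt_j)/T expands to 0
print(sp.expand(ordered + (t_i1 - t_i) * (t_j1 - t_j) / T))
```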