Suppose that I divide the time interval $[0,T]$ into $n$ subintervals with endpoints $t_k = \frac{k}{n}T$ for $k=0,1,\dots,n$, and consider the process defined by $X^n_t = X^n_{t_k}$ for $t_k \le t < t_{k+1}$, where \begin{equation} X^n_{t_{k+1}} = \begin{cases} X^n_{t_k} + \delta X^n_{t_k}(1-X^n_{t_k}), &\qquad \text{with probability $\frac{1}{2}$,}\\ X^n_{t_k} - \delta X^n_{t_k}(1-X^n_{t_k}), &\qquad \text{with probability $\frac{1}{2}$,} \end{cases} \end{equation} for some constant $\delta > 0$ and an initial condition $X^n_0 \in [0,1]$. If I set $\delta = \sqrt{\frac{T}{n}}$ and send $n\rightarrow +\infty$, does the process $\{X^n_t\}$ converge (in distribution, or in any other sense) to the continuous-time process \begin{equation} X_T = X_0 + \int_0^T X_t(1-X_t)\,dW_t, \end{equation} where $W_t$ denotes standard Brownian motion? From my understanding, if instead $X^n_{t_{k+1}} = X^n_{t_k} + \delta$ or $X^n_{t_{k+1}} = X^n_{t_k} - \delta$ with probability $\frac{1}{2}$ each, then the limit would be standard Brownian motion itself. But I'm not sure how to proceed in the case where the limit is an SDE.
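For concreteness, the scheme is easy to simulate directly. Below is a minimal Python sketch of one path; the function name and parameter values are my own choices, not part of the question.

```python
import random

def simulate_discrete(x0, T, n, seed=None):
    """Simulate one path of the discrete process X^n with delta = sqrt(T/n).

    At each step the increment +/- delta * x * (1 - x) is taken with
    probability 1/2 each, as in the scheme above.
    """
    rng = random.Random(seed)
    delta = (T / n) ** 0.5
    x = x0
    path = [x]
    for _ in range(n):
        sign = 1 if rng.random() < 0.5 else -1
        x += sign * delta * x * (1.0 - x)
        path.append(x)
    return path

path = simulate_discrete(x0=0.3, T=1.0, n=10_000, seed=42)
# For delta <= 1 the maps x -> x +/- delta * x * (1 - x) leave [0, 1]
# invariant (both maps are monotone and fix 0 and 1), so the path
# never escapes the unit interval.
assert all(0.0 <= x <= 1.0 for x in path)
```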
Continuous limit of a discrete stochastic process
361 views, asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
If $\delta = \delta^n := \sqrt{\frac{T}{n}}$, then $\left\{\left(X^n_t\right)_{t\in\left[0,T\right]}\right\}_{n\in \mathbb{N}}$ converges in law (also known as convergence in distribution) to a continuous-time process satisfying $$ X_t = X_0 + \int_0^t X_s \left(1 - X_s\right)dW_s.$$
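Before going into the proof, the claim can be sanity-checked numerically: both the discrete scheme and an Euler-Maruyama discretisation of the limiting SDE are martingales started at $X_0$, so their sample means at time $T$ should both sit near $X_0$. A Python sketch under these assumptions (function names and parameter values are illustrative):

```python
import random

def terminal_discrete(x0, T, n, rng):
    """One draw of X^n_T under the +/- delta * x * (1 - x) scheme."""
    delta = (T / n) ** 0.5
    x = x0
    for _ in range(n):
        x += (1 if rng.random() < 0.5 else -1) * delta * x * (1.0 - x)
    return x

def terminal_euler(x0, T, n, rng):
    """One draw of an Euler-Maruyama approximation of
    dX_t = X_t (1 - X_t) dW_t at time T."""
    dt = T / n
    x = x0
    for _ in range(n):
        x += x * (1.0 - x) * rng.gauss(0.0, dt ** 0.5)
    return x

rng = random.Random(0)
m, x0, T, n = 10_000, 0.3, 1.0, 200
mean_d = sum(terminal_discrete(x0, T, n, rng) for _ in range(m)) / m
mean_e = sum(terminal_euler(x0, T, n, rng) for _ in range(m)) / m
# Both processes are martingales started at x0, so both sample means
# should be close to x0 = 0.3 (up to Monte Carlo error).
assert abs(mean_d - x0) < 0.02
assert abs(mean_e - x0) < 0.02
```

Matching first moments is of course far weaker than convergence in law, but it is a cheap consistency check on the claim.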
We can show this by a classic tightness and identification-of-the-limit argument; for the details of this style of argument, see Billingsley's Convergence of Probability Measures. The idea is that when a sequence of random variables is tight, it admits convergent subsequences. If we can show that any convergent subsequence must converge to the same, uniquely identified limit, we may conclude that the whole sequence converges. This last step follows from a simple result from analysis: if a sequence in a metric space is such that every subsequence admits a convergent subsubsequence, and each of these subsubsequences converges to the same point, then the sequence converges to that point.
Let $\mathbb{D}_T := \mathbb{D}\left(\left[0,T\right], \mathbb{R}\right)$ be the Skorohod space of right-continuous real-valued functions on $\left[0, T\right]$ with left limits; see Section 12 of Billingsley. In what follows, we write $\Delta t^n := \frac{T}{n}$, $\delta^n := \sqrt{\frac{T}{n}}$ and $t^n_k := \frac{k}{n}T$.
For the identification of the limit, note that the $X^n$'s and $X$ are Markov processes and that $X$ is characterised by its generator $\mathcal{L} = \frac{1}{2}\left(x\left(1-x\right)\right)^2 \partial_x^2$, with domain $C^2\left(\left[0,1\right]\right)$. The fact that $X$ is characterised by its generator can be deduced either from general results on the martingale problem, see for example Theorem 4.1 in Ethier and Kurtz's Markov Processes, or from the fact that the SDE satisfied by $X$ has Lipschitz coefficients on $\left[0,1\right]$ and therefore admits a unique strong solution. We denote by $P_n$ the transition kernel of $X^n$, given by $$P_nf(x) = \mathbb{E}\left[f\left(X_{t_1}^n\right)\left| X_0^n = x\right.\right], $$ for functions $f$ of a suitable class (we take that class to be $C^2\left(\left[0,1\right]\right)$). We will show that the discrete generators of the $X^n$'s, $$\mathcal{L}_n = \frac{P_n - I}{\Delta t^n},$$ where $I$ is the identity operator (i.e. $If = f$ for any function $f$), converge to $\mathcal{L}$. Let $f \in C^2\left(\left[0,1\right]\right)$. Then, by a second-order Taylor expansion of $f$ around $x$, \begin{gather*} P_nf(x) = \mathbb{E}\left[f\left(X_{t_1}^n\right)\left| X_0^n = x \right.\right] = \frac{1}{2} f\left(x + \delta^n x\left(1-x\right)\right) + \frac{1}{2}f\left(x - \delta^n x\left(1-x\right)\right)\\ =\frac{1}{2} \left[f(x) + f'\left(x\right)\left(\delta^n x(1-x)\right) + \frac{1}{2}f''(x)\left(\delta^n x(1-x)\right)^2 + f(x) - f'(x) \left(\delta^nx(1-x)\right) + \frac{1}{2}f''(x) \left(\delta^nx(1-x)\right)^2\right] + o\left(\Delta t^n\right)\\ = f(x) + \Delta t^n\frac{1}{2}f''(x) \left(x(1-x)\right)^2 + o\left(\Delta t^n\right). \end{gather*} This entails that for $f\in C^2\left(\left[0,1\right]\right)$, $$ \mathcal{L}_nf(x) = \frac{1}{2}\left(x(1-x)\right)^2f''(x) + o\left(1\right),$$ so $\mathcal{L}_n f \to \mathcal{L}f$ as $n \to \infty$, which is what we wanted to show. (If $f \in C^3$, the Taylor remainder improves the error to $O\left(\left(\Delta t^n\right)^{1/2}\right)$.)
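The Taylor computation above is easy to check numerically. The following Python sketch (all names are my own) compares $\mathcal{L}_n f$ with $\mathcal{L}f$ at a fixed point for a smooth test function:

```python
def discrete_generator(f, x, T, n):
    """Apply L_n = (P_n - I) / Delta t^n to f at x, with delta^n = sqrt(T/n)."""
    dt = T / n
    delta = dt ** 0.5
    a = x * (1.0 - x)
    # P_n f(x) averages f over the two equally likely next positions.
    return (0.5 * f(x + delta * a) + 0.5 * f(x - delta * a) - f(x)) / dt

def limit_generator(f2, x):
    """The limiting generator: L f = (1/2) (x(1-x))^2 f''(x)."""
    return 0.5 * (x * (1.0 - x)) ** 2 * f2(x)

f = lambda x: x ** 4          # smooth test function on [0, 1]
f2 = lambda x: 12.0 * x ** 2  # its second derivative

x, T = 0.3, 1.0
errors = [abs(discrete_generator(f, x, T, n) - limit_generator(f2, x))
          for n in (10, 100, 1000)]
# The error shrinks as n grows, consistent with L_n f -> L f.
assert errors[0] > errors[1] > errors[2]
```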
For tightness, we will use a standard criterion, see Section 13 of Billingsley. For $x\in \mathbb{D}_T$ and $A \subset \left[0,T\right]$, define $\omega\left(x, A\right) = \sup_{s,t \in A}\left|x(s) - x(t)\right|$ and, for $\eta > 0$, $$\omega'\left(x, \eta\right) = \inf\left\{\max_{0\leq i \leq r-1}\omega\left(x, \left[t_i, t_{i+1}\right[\right): r \in \mathbb{N}^*, \; 0=t_0 < t_1< \dots < t_r = T, \; t_i - t_{i-1} \geq \eta \; \forall i\right\}.$$ A sequence of $\mathbb{D}_T$-valued random variables $\left(Y_n\right)_{n\geq 1}$ is tight iff:

1. the sequence $\left\{\sup_{t \in \left[0,T\right]}\left|Y_n(t)\right|\right\}_{n\geq 1}$ is tight in $\mathbb{R}$;
2. for all $\varepsilon_1, \varepsilon_2 > 0$, there exist $\eta > 0$ and $n_0 \in \mathbb{N}$ such that $\mathbb{P}\left(\omega'\left(Y_n, \eta\right) \geq \varepsilon_1\right) \leq \varepsilon_2$ for all $n \geq n_0$.
For the first point, since $X_0^n \in \left[0,1\right]$ implies that $X_{t_k^n}^n \in \left[0,1\right]$ for all $k = 0, \dots, n$ (the maps $x \mapsto x \pm \delta^n x\left(1-x\right)$ leave $\left[0,1\right]$ invariant once $\delta^n \leq 1$), the sequence $\left\{\sup_{t \in \left[0, T\right]}\left|X^n_t\right|\right\}_{n\in \mathbb{N}^*}$ is uniformly bounded and therefore tight.
For the second, we will control $\limsup_{n\to\infty}\mathbb{P}\left(\omega'\left(X^n, \Delta t^m\right) \geq \varepsilon_1\right)$, where $\Delta t^m = \frac{T}{m}$ is the mesh of the coarser partition $\left(t_i^m\right)_{0\leq i \leq m}$. Fix an interval $\left[t_i^m, t_{i+1}^m\right[$, let $n > m$ and let $j$ be the smallest index such that $t_j^n \in \left[t_i^m, t_{i+1}^m\right[$. Then for every $k$ such that $t_{j+k}^n \in \left[t_i^m, t_{i+1}^m\right[$, $$X_{t_{j + k}^n}^n = X_{t_j^n}^n + \sum_{l = 1}^k Y_l\, \delta^n X^n_{t_{j + l - 1}^n}\left(1 - X^n_{t_{j + l - 1}^n}\right),$$ where $\left(Y_l\right)$ is an iid sequence with $\mathbb{P}\left(Y_1 = 1\right) = \mathbb{P}\left(Y_1 = -1\right) = \frac{1}{2}$. Since the maximum of $x \mapsto x(1-x)$ over $\left[0,1\right]$ is $\frac{1}{4}$, and since at most $\left\lceil \frac{n}{m}\right\rceil$ of the points $t_{j+k}^n$ fall in the interval, we have $$ \left|X_{t_{j + k}^n}^n - X_{t_j^n}^n\right| \leq \frac{1}{4}\delta^n\max_{1 \leq k' \leq \left\lceil \frac{n}{m}\right\rceil}\left|\sum_{l = 1}^{k'} Y_l\right|,$$ and hence, by the triangle inequality, twice the right-hand side uniformly bounds $\left|X_t^n - X_s^n\right|$ for any $s, t \in \left[t_i^m, t_{i+1}^m\right[$. By Donsker's invariance principle, as $n \to \infty$, $$ \frac{1}{2}\delta^n\max_{1 \leq k' \leq \left\lceil \frac{n}{m}\right\rceil}\left|\sum_{l = 1}^{k'} Y_l\right| = \frac{\sqrt{T}}{2\sqrt{m}}\max_{1 \leq k' \leq \left\lceil \frac{n}{m}\right\rceil}\left|\sum_{l = 1}^{k'} \frac{Y_l}{\sqrt{\frac{n}{m}}}\right| \xrightarrow{d} \frac{\sqrt{T}}{2 \sqrt{m}} \sup_{0 \leq s \leq 1}\left|B_s\right|,$$ where $B$ is a standard Brownian motion. Thus, with a union bound over the $m$ intervals, $$\limsup_{n \to \infty}\mathbb{P}\left(\omega'\left(X^n, \Delta t^m\right) \geq \varepsilon_1\right) \leq \limsup_{n \to \infty}\mathbb{P}\left(\max_{0 \leq i \leq m - 1}\omega\left(X^n, \left[t^m_i, t^m_{i+1}\right[\right) \geq \varepsilon_1 \right) \leq m\, \mathbb{P}\left(\sup_{0 \leq s \leq 1}\left|B_s\right| \geq \frac{2\sqrt{m}}{\sqrt{T}}\varepsilon_1\right).$$ By the reflection principle, $\mathbb{P}\left(\sup_{0\leq s \leq 1}\left|B_s\right| \geq a\right) \leq 4\,\mathbb{P}\left(G \geq a\right)$ with $G \sim \mathcal{N}\left(0,1\right)$, so the right-hand side is at most $4m\,\mathbb{P}\left(G \geq \frac{2\sqrt{m}}{\sqrt{T}}\varepsilon_1\right)$, which can be made as small as we want by taking $m$ large enough, thanks to the Gaussian tail decay. This shows the second point. With that we have proved that $X^n$ converges in law to $X$, the solution of $$X_t = X_0 + \int_0^t X_s \left(1- X_s\right)dW_s. $$
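The last step relies on the Gaussian tail beating the linear factor $m$: a bound of the form $C\, m\, \mathbb{P}\left(\left|G\right| \geq c\sqrt{m}\,\varepsilon_1\right)$ vanishes as $m \to \infty$ because the Gaussian tail decays faster than any polynomial. A quick Python illustration of this decay, using `math.erfc` for the Gaussian tail (the constant `c` below stands in for $c\,\varepsilon_1$ and its value is arbitrary):

```python
import math

def gaussian_tail(a):
    """P(|G| >= a) for G ~ N(0, 1), via the complementary error function."""
    return math.erfc(a / math.sqrt(2.0))

# Check that m * P(|G| >= c * sqrt(m)) -> 0 as m grows.
c = 0.5
bounds = [m * gaussian_tail(c * math.sqrt(m)) for m in (10, 100, 1000)]
assert bounds[0] > bounds[1] > bounds[2]
# At m = 1000 the Gaussian tail has crushed the linear factor entirely.
assert bounds[2] < 1e-50
```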