I have been trying hard to understand this topic, but keep failing. I have been reading through my lecture notes and online videos about stochastic integration, but I just can't wrap my head around it. The main reason is the notation/terminology. I find it very confusing and vague; some terms seem to be used interchangeably, some don't. For example, "Itô integration" and "stochastic integration": are they the same thing? Different?
I understand my question is somewhat long, so if it gets tedious, please skip the part about making me understand the definitions and just answer, in detail, the extracted worked example I am having trouble with below.
My lecture notes basically just throw things at me out of the blue, saying "here's the definition, stick with it, and here's a worked example," which I don't follow at all, even with the definition.
First, here is what my notes define as a "simple process"
A stochastic process $(g_t)_{t \geq 0}$ is said to be simple if it has the following form: $g_t(\omega)=\sum_{i=0}^\infty g_i(\omega)1_{[t_i,t_{i+1})}(t)$, where the $g_i$ are random variables adapted up to time $t_i$.
And "the stochastic integral"
$X_t = \int_0^t g_s\,dW_s := \sum_{i=0}^\infty g_i \,(W_{t_{i+1}\cdot t}-W_{t_i\cdot t})$, where $a\cdot b$ denotes $\min\{a,b\}$.
Here already, I get slightly confused; $X_t$ is denoted as the stochastic integral, but $X_t=X(t)$ as I understand it, so this is ALSO a stochastic process itself? And what is $dW_s$? What is this variable $s$? Is it a time variable? Then doesn't that make $W_s$ a function (of $s$)? Unlike with $dx$ and $dy$, I just feel very uneasy with $dW_s$. I can't express this definition in words sensibly; "$X_t$ is a stochastic process which is the integral of a simple process $g_t$ with respect to Brownian motion $W_s$ over $[0,t]$..."? But then what is $\omega$ in $g_t(\omega)$ in the first definition? $g_t$ isn't a function of $W_s$, it's a function of $\omega$ (whatever that is).
So these are the things that just keep going in my brain when I see this. Now here's a worked example which I am very lost with.
Find $\int_0^T W_sdW_s$ where $W_s$ is assumed to be a Brownian motion.
Here's the worked solution, interrupted with my voiceover.
The Itô-integrable process $g_s$ can be approximated by taking a partition of $[0,T]$ given by $0=t_0<t_1<...<t_n=T$, letting $t_i-t_{i-1}=\frac{T}{n}$, and defining a simple process $g_t^n= \sum_{i=0}^{n-1} g_{t_i}1_{[t_i,t_{i+1})}(t)$.
Q. I'll accept that it "can be approximated" as a fact. But what does $g_t^n$ mean? $g_t=g(t)$ raised to the power $n$? How is this in any way equal to the simple process defined above? And why does the sum run up to $n-1$ and not $n$ or $\infty$?
Where $t_i=i\frac{T}{n}$. We will always sum from $0$ to $n-1$.
Q. Again for what reason?
Let $\Delta W_i=W_{t_{i+1}}-W_{t_i}$. Then the integral is the limit as $n \rightarrow \infty$ of $\sum_i W_{t_i} \Delta W_i$.
Q. I am trying to check here; According to the definition of stochastic process integration, in this case $g_i$ is replaced with $W_{t_i}$...? Though why $t_i$ and not $i$ as in the definition?
We begin by observing that $W_T^2=\sum_i(W_{t_{i+1}}^2-W_{t_i}^2)$
Q. Why $W_T^2$ in particular? Where did this "square" come from? And is this $T$ actually $\infty$? Because that is where we're tending $n$ to, yes? And why does this equal the RHS? I am having trouble showing the equality algebraically.
Using the identity $(a-b)b=\frac{1}{2}(a^2-b^2-(a-b)^2)$, we obtain $$W_T^2 = \sum_i(W_{t_{i+1}}-W_{t_i})^2+2 \sum_i W_{t_i}(W_{t_{i+1}}-W_{t_i}).$$
And the rest follows if I understand what is above. This question is getting too long, so I will stop here. Can someone please help me understand? Thank you very much.
Since you have lots of questions, my answer will be even longer than yours :). (Throughout my answer I'll use $a \wedge b := \min\{a,b\}$ to denote the minimum of $a$ and $b$.)
Actually, Itô integration is a particular form of stochastic integration; there are also other ways to define stochastic integrals. However, in some sense the Itô integral is THE stochastic integral (in the sense that it is (one of) the most important one(s)).
Yes, that's correct. For each fixed $t \geq 0$,
$$X_t(\omega) = \left( \int_0^t g_s \, dW_s \right)(\omega)$$
is a random variable and the family of random variables $(X_t)_{t \geq 0}$ is a stochastic process.
Well, first of all it's simply a notation (yeah, I know that you don't like to hear that). We define $$\int_0^t g_s \, dW_s := \sum_{i} g_i (W_{t_{i+1} \wedge t}-W_{t_i \wedge t}), \tag{1}$$ so the left-hand side is just a notation we have introduced to shorten things.
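If a concrete computation helps, here is a small numerical sketch of definition $(1)$ (Python with NumPy; the partition, the values $g_i$, and all names are my own invention, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# A partition 0 = t_0 < t_1 < ... < t_n of [0, 1] and one sampled Brownian path:
t = np.linspace(0.0, 1.0, 6)
dW = rng.normal(0.0, np.sqrt(np.diff(t)))      # increments W_{t_{i+1}} - W_{t_i}
W = np.concatenate(([0.0], np.cumsum(dW)))     # W_{t_0} = 0

# Values g_i of a simple process on [t_i, t_{i+1}); arbitrary numbers here.
g = np.array([1.0, -2.0, 0.5, 3.0, -1.0])

# Definition (1) evaluated at a partition point t_k: increments of intervals
# beyond t_k are clipped to zero by the minimum, so
#   X_{t_k} = sum_{i < k} g_i * (W_{t_{i+1}} - W_{t_i}).
def X(k):
    return float(np.sum(g[:k] * dW[:k]))

path = [X(k) for k in range(len(t))]  # X at t_0, ..., t_n: itself a random path
```

Running this with different seeds gives different paths $t \mapsto X_t(\omega)$; that is exactly the sense in which the integral is again a stochastic process.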
Yes. If you want to get a little bit more comfortable with this, then have a look at Riemann-Stieltjes integrals; these are integrals of the form
$$\int_0^t h(s) \, df(s)$$
for "nice" functions $h$ where $f:[0,\infty) \to \mathbb{R}$ is a function of bounded variation (this is all deterministic, no dependence on $\omega$!); in particular for step functions of the form
$$h(t) = \sum_{i=1}^n c_i 1_{[t_i,t_{i+1})}(t)$$
this integral is (defined as)
$$\int_0^t h(s) \, df(s) = \sum_{i=1}^n c_i (f(t_{i+1} \wedge t) - f(t_i \wedge t)).$$
For $f(t) = t$ this yields the standard Riemann integral. On the other hand, if we formally plug in $f(t) := W_t(\omega)$ and $h(t) := g(t,\omega)$ for fixed $\omega$, this gives $(1)$. (Note: this is just a motivation for why we define the integral $(1)$ the way we do. The Itô integral is not a Riemann-Stieltjes integral.)
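Here is a purely deterministic sanity check of the step-function formula (a Python sketch; the function name and the numbers are made up):

```python
def stieltjes_step(c, t_pts, f, t):
    """Integral over [0, t] of the step function sum_i c_i 1_[t_i, t_{i+1})
    with respect to f, following the formula above."""
    return sum(
        ci * (f(min(t_pts[i + 1], t)) - f(min(t_pts[i], t)))
        for i, ci in enumerate(c)
    )

# Step function with values 2, -1, 4, 0.5 on [0,0.5), [0.5,1), [1,1.5), [1.5,2):
t_pts = [0.0, 0.5, 1.0, 1.5, 2.0]
c = [2.0, -1.0, 4.0, 0.5]

# With f(t) = t we recover the ordinary Riemann integral of the step function:
riemann = stieltjes_step(c, t_pts, lambda s: s, 2.0)   # 2*0.5 - 1*0.5 + 4*0.5 + 0.5*0.5

# The minimum clips the intervals beyond the upper limit, e.g. over [0, 0.75]:
partial = stieltjes_step(c, t_pts, lambda s: s, 0.75)  # 2*0.5 - 1*0.25
```

The same routine with a sampled Brownian path plugged in as `f` would reproduce formula $(1)$, interval by interval.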
Well, hopefully you do know the basics of probability theory...? $\Omega$ denotes the underlying probability space and, as usual, $\omega \in \Omega$ is an element of this space. It models the randomness. (Note that $g_t$ itself is again a stochastic process.)
No, this is not a power; it's just notation. As written there, we define
$$g_t^n(\omega) := \sum_{i=0}^{n-1} g_{t_i} 1_{[t_i,t_{i+1})}(t), \tag{2}$$
i.e. we use the notation $g_t^n$ to denote the simple process defined by $(2)$. If you are confused by this, then always use the notation $g(t)$ instead of $g_t$, because then we can define
$$g_n(t,\omega) := \sum_{i=0}^{n-1} g_{t_i} 1_{[t_i,t_{i+1})}(t). \tag{3} $$
(I hope you are not even more confused. Basically, the trouble is that we have to put the index $n$ somewhere and if we use the lower index for the time, then the only remaining possibility is to use the upper index.)
We are interested in the stochastic integral $\int_0^T W_s \,dW_s$, right? So we want to define an approximation of the function $g(t,\omega) := W_t(\omega)$ on $[0,T]$. If you consider the intervals $[t_i,t_{i+1})$, $i=0,\ldots,n-1$, then you see that they cover the interval $[0,T]$. That's why it is chosen in this way.
You are confusing two things: the $g$ you have in this example and the $g$ from your definition of the Itô integral. Let me just rewrite the definition and then you'll understand. Our definition states that if $h$ is a simple process of the form
$$h(t,\omega) = \sum_{i \geq 0} h_i(\omega) 1_{[t_i,t_{i+1})}(t) \tag{4}$$
then
$$\int_0^t h(s) \, dW(s) := \sum_{i \geq 0} h_i (W_{t_{i+1} \wedge t}-W_{t_i \wedge t}). \tag{5}$$
Now, for fixed $n \in \mathbb{N}$, our approximating simple process $g_t^n$ (see $(2)$) is of the form $(4)$ with $h_i := g_{t_i} = W_{t_i}$. By our definition $(5)$ this gives the claimed result.
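To see that the approximating sum really is an instance of definition $(5)$, here is a small Python sketch (my own code, with a toy partition): the generic simple-process formula with $h_i = W_{t_i}$ gives exactly $\sum_i W_{t_i}\,\Delta W_i$.

```python
import numpy as np

rng = np.random.default_rng(1)

T, n = 1.0, 8
t = np.linspace(0.0, T, n + 1)                 # t_i = i*T/n
dW = rng.normal(0.0, np.sqrt(T / n), size=n)   # Delta W_i = W_{t_{i+1}} - W_{t_i}
W = np.concatenate(([0.0], np.cumsum(dW)))     # one sampled path, W_0 = 0

# Definition (5) over the full interval [0, T], for any simple process
# taking the value h_i on [t_i, t_{i+1}):
def simple_ito(h, dW):
    return float(np.sum(h * dW))

# Plugging in h_i = W_{t_i} (the value at the LEFT endpoint of each interval):
S_n = simple_ito(W[:-1], dW)    # = sum_i W_{t_i} * Delta W_i
```

That the integrand is evaluated at the left endpoint $t_i$ is essential for the Itô integral; evaluating at the right endpoint or the midpoint changes the limit (the midpoint choice gives the Stratonovich integral instead).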
No, $T$ is not $\infty$! Right at the beginning of your example you stated your problem: find $\int_0^T W_s \, dW_s$, and here $T>0$ is some fixed real number. This is the same fixed number which keeps popping up throughout the whole proof.
That's just a telescoping sum.
$$\begin{align*} \sum_{i=0}^{n-1} (W_{t_{i+1}}^2-W_{t_i}^2) &= (W_{t_1}^2-W_{t_0}^2) + (W_{t_2}^2-W_{t_1}^2) + \ldots+ (W_{t_n}^2-W_{t_{n-1}}^2) \\ &= W_{t_{n}}^2 - W_{t_0}^2 = W_T^2 \end{align*}$$
since $t_n = T$ and $W_{t_0} = W_0 = 0$ (because $(W_t)_t$ is a Brownian motion).
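Putting the pieces together numerically (a Python sketch of my own, not from the notes): the telescoping sum holds exactly for every path; the identity from the worked example gives, pathwise, $\sum_i W_{t_i}\Delta W_i = \tfrac{1}{2}\big(W_T^2 - \sum_i (\Delta W_i)^2\big)$; and since the quadratic variation $\sum_i (\Delta W_i)^2$ tends to $T$, the sums approach $\tfrac{1}{2}(W_T^2 - T)$, which is the value of $\int_0^T W_s\,dW_s$.

```python
import numpy as np

rng = np.random.default_rng(42)

T, n = 1.0, 200_000
dW = rng.normal(0.0, np.sqrt(T / n), size=n)   # Brownian increments, fine grid
W = np.concatenate(([0.0], np.cumsum(dW)))     # W at t_i = i*T/n, with W_0 = 0

# Telescoping sum: sum_i (W_{t_{i+1}}^2 - W_{t_i}^2) = W_T^2 - W_0^2 = W_T^2.
telescoped = float(np.sum(W[1:] ** 2 - W[:-1] ** 2))

# Identity (a-b)b = (a^2 - b^2 - (a-b)^2)/2 summed over i, exact per path:
#   sum_i W_{t_i} dW_i = (W_T^2 - sum_i dW_i^2) / 2
S_n = float(np.sum(W[:-1] * dW))
Q_n = float(np.sum(dW ** 2))                   # quadratic variation, close to T

limit = (W[-1] ** 2 - T) / 2                   # value of the Ito integral at T
```

With $n$ this large, `Q_n` sits within a few hundredths of $T$, so `S_n` lands correspondingly close to `limit`; increasing `n` tightens both gaps.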