I am reading some papers about the Realized Variance (RV) estimator. I read an example where the following setting is considered: $$\Delta X(t) \equiv X(t)-X(t-1) = \int_{t-1}^{t} \sigma(s) \, dW(s) = \sigma_t \left[W(t)-W(t-1) \right],$$ i.e. $\sigma(s)$ is equal to a constant $\sigma_t$ on each interval $(t-1,t].$ Note that $\{\sigma\}$ is a predictable, strictly positive and square-integrable process.
The author says that $\sigma_t$ signifies the conditional standard deviation of $X(t)-X(t-1)$, and I am not understanding why.
Let me try to explain the source of my confusion more clearly. To me it seems that $\sigma^2_t$ should be the unconditional variance of $\Delta X(t)$, since: $$var[\Delta X(t)] = E\left[\sigma^2_t \left[W(t)-W(t-1) \right]^2 \right] = \sigma^2_t.$$ On the other hand, if $\sigma^2_t$ is the conditional variance, i.e. $var(\Delta X(t) \mid \mathcal{F}_{t-1}) = \sigma^2_t$, then at time $t-1$ we would be able to predict exactly the value of $\sigma_t$, which is the future volatility over the interval $(t-1,t]$, since $t-1$ is excluded. The two statements below seem to contradict each other:
- $\{\sigma\}$ is a stochastic process which takes a constant random value for each interval $(t-1,t]$;
- the value that $\{\sigma\}$ takes on $(t-1,t]$ is known with certainty at time $t-1$.
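To make the setting above concrete, here is a small Monte Carlo sketch of one period of the model. The lognormal law for $\sigma_t$ is my own illustrative assumption (it is not in the source); the point is only to show the difference between the variance of $\Delta X(t)$ given $\sigma_t$ (which is $\sigma^2_t$) and the unconditional variance $E[\sigma^2_t]$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy version of the setting: sigma_t is random but constant over
# (t-1, t], and Delta X(t) = sigma_t * (W(t) - W(t-1)).
# Assumption (not from the source): sigma_t is lognormal.
n = 1_000_000
sigma = np.exp(0.5 * rng.standard_normal(n))  # one draw of sigma_t per interval
z = rng.standard_normal(n)                    # W(t) - W(t-1) ~ N(0, 1)
dX = sigma * z

# Given sigma_t (i.e. given F_{t-1} when sigma_t is F_{t-1}-measurable),
# var(Delta X(t) | sigma_t) = sigma_t^2, a different number on each path.
# Unconditionally, sigma_t is averaged out: var(Delta X(t)) = E[sigma_t^2].
print(np.var(dX))         # close to the empirical E[sigma_t^2] below
print(np.mean(sigma**2))  # empirical E[sigma_t^2]
```

Running this, the two printed numbers agree (up to Monte Carlo error), illustrating that the unconditional variance is $E[\sigma^2_t]$ rather than the random quantity $\sigma^2_t$ itself.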
I know I am getting something wrong, but I cannot figure out what it is.