I am trying to get a more intuitive understanding of what I call the $\textbf{covariation}$ between 2 stochastic processes. Let $(X_t)_t$ and $(Y_t)_t$ be 2 stochastic processes and $\Delta = (t_k)_{k \in \mathbb{N}}$ a subdivision of $\mathbb{R}^+$. The covariation between $X$ and $Y$ is: $$ \langle X, Y \rangle _t = \lim _{|\Delta| \to 0} \sum _{k \in \mathbb{N}} (X_{t_{k+1} \land t} - X_{t_k \land t}) (Y_{t_{k+1} \land t} - Y_{t_k \land t}) $$ where $t_k \land t = \min(t_k, t)$, $|\Delta| = \sup_{k \in \mathbb{N}}(t_{k+1} - t_k)$, and $\lim$ refers to convergence in probability. In my mind, $X, Y$ can be any processes (or we can restrict to Itô processes). I know that if $(B_t^{(1)})_t$ and $(B_t^{(2)})_t$ are 2 independent Brownian motions, then $\langle B^{(1)}, B^{(2)} \rangle _t = 0$. My first question is simple:
1) Is that true for any 2 processes? I.e., if $X$ and $Y$ are independent, can I deduce that:
$\forall t \geq 0, \langle X, Y \rangle _t =0$ ?
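For intuition, the Brownian case above can be checked numerically. This is a minimal sketch (not part of the question proper) that approximates the covariation sum on a uniform grid of $[0, 1]$ for two independent Brownian motions, alongside the quadratic variation $\langle B^{(1)}, B^{(1)} \rangle_1 = 1$ for comparison; the grid size `n` is an arbitrary choice.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 1_000_000      # grid points on [0, 1] (arbitrary choice)
dt = 1.0 / n

# Increments of two independent Brownian motions on the grid:
# B_{t_{k+1}} - B_{t_k} ~ N(0, dt), independent across k.
dB1 = rng.normal(0.0, np.sqrt(dt), n)
dB2 = rng.normal(0.0, np.sqrt(dt), n)

# Discrete covariation sum: sum_k (increment of B1) * (increment of B2)
cov = np.sum(dB1 * dB2)

# Quadratic variation of B1 at t = 1, for comparison (should be near 1)
qv = np.sum(dB1 ** 2)

print(cov)  # close to 0
print(qv)   # close to 1
```

The sum of products has standard deviation of order $\sqrt{dt}$ here, so refining the grid drives it to $0$, consistent with $\langle B^{(1)}, B^{(2)} \rangle_t = 0$.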
My second question is:
2) Assume that I know the process $\langle X, Y \rangle _t$. Can I say something about the process:
$\int_0^t (X_s - \mathbb{E}(X_s)) (Y_s - \mathbb{E}(Y_s))\,ds$, or maybe just $\int_0^{+\infty} (X_s - \mathbb{E}(X_s)) (Y_s - \mathbb{E}(Y_s))\,ds$ ?
Thanks a lot!