I am working on an exercise as follows:
Let $v_{1}, v_{2}:\mathbb{R}^{2}\longrightarrow\mathbb{R}^{2}$ be $C^{\infty}$ vector fields on $\mathbb{R}^{2}$. Let $(X_{n})_{n\in\mathbb{N}}$ be a sequence of i.i.d. positive random variables, and set $S_{n}:=X_{1}+\cdots+X_{n}$. Define $N_{t}:=\sup\{n:S_{n}\leq t\}$.
Let $i(n)=1$ if $n$ is odd and $i(n)=2$ if $n$ is even. Let $Z_{t}$ denote the solution of the ODE $$\dfrac{d}{dt}Z_{t}=v_{i(N_{t})}(Z_{t}),\ Z_{0}=x.$$ Show that $(Z_{t})_{t\geq 0}$ is a stochastic process.
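To get a feel for the object being constructed, here is a minimal numerical sketch of one sample path of $(Z_t)$: draw renewal times $S_n$, read off $N_t$, and integrate the currently active vector field with an Euler scheme. The specific choices here (Exp(1) waiting times, a rotation field $v_1$ and a contraction field $v_2$) are my own illustrative assumptions, not part of the exercise.

```python
import numpy as np

rng = np.random.default_rng(0)

def v1(z):
    # illustrative smooth field: rotation about the origin
    return np.array([-z[1], z[0]])

def v2(z):
    # illustrative smooth field: contraction toward the origin
    return -z

def simulate_Z(x0, T=5.0, dt=1e-3):
    """Euler scheme for dZ/dt = v_{i(N_t)}(Z_t), Z_0 = x0."""
    # renewal times S_1 < S_2 < ...; waiting times X_n are i.i.d. Exp(1) here
    S, jumps = 0.0, []
    while S <= T:
        S += rng.exponential(1.0)
        jumps.append(S)
    jumps = np.array(jumps)

    z = np.array(x0, dtype=float)
    path = [z.copy()]
    for k in range(int(round(T / dt))):
        t = k * dt
        N_t = int(np.searchsorted(jumps, t, side="right"))  # #{n : S_n <= t}
        field = v1 if N_t % 2 == 1 else v2  # i(n) = 1 if n odd, 2 if n even
        z = z + dt * field(z)
        path.append(z.copy())
    return np.array(path)

path = simulate_Z([1.0, 0.0])
```

For each fixed $\omega$ (here, each seed) this is an ordinary, deterministic ODE with a piecewise-chosen right-hand side; the randomness enters only through the switching times.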
To show $(Z_{t})_{t\geq 0}$ is a stochastic process, we need to show that $Z_{t}$ is a random variable for each $t$. That is, if the probability space is $(\Omega,\mathcal{F},\mathbb{P})$, we need to show $$\{\omega:Z_{t}(\omega)\leq r\}\in\mathcal{F}\ \text{for all}\ r\in\mathbb{R}.$$
But how can I connect this with the ODE?
Thank you!
Edit 1:
Okay, I figured it out. The proof is not entirely straightforward, and the exercise is rather loosely stated. Below I discuss how to reformulate the exercise and what we need to assume.
I also want to point out that probabilistic arguments are not really useful here, since we do not yet know whether $Z_{t}$ is a random variable. So everything below is purely functional analysis or ODE analysis. The object resembling a random walk is not actually a random walk, since $Z_{t}$ may not be random at all, so the problem cannot be attacked with the SLLN or similar tools.
In addition, the differential equation in the exercise is not a stochastic differential equation; I believe it is called a random ODE, or something like that.
I will answer my own question to post my proof.
I would also really appreciate it if anyone could post their own proof, since the bounty will otherwise be wasted...
I believe there must be a better proof than mine :)
I believe that, to make the notation less confusing, $Z_{t}$ should be described as the solution of the initial value problem $$\dfrac{dx}{dt}=v_{i(N_{t})}(x),\ \ x(t_{0})=x_{0}.$$
The RHS of the above equation can be written as $g(x, N_{t}(\omega))$.
Note that since the counting process is right-continuous, any continuity argument in this proof can be carried out over the rationals, yielding an almost-sure result.
Therefore, let us simply assume here that $N_{t}(\omega)$ has continuous sample paths. This helps a lot, since by hypothesis $v_{1}, v_{2}$ are both $C^{\infty}$ on $\mathbb{R}^{2}$, and therefore we may assume that $g$ is continuous in both $x$ and $t$.
Fix a sample path, i.e. set $G(t,x):=g(x,N_{t}(\omega))$, and consider the initial value problem (IVP) $$\dfrac{dx}{dt}=G(t,x),\ x(t_{0})=x_{0},$$ where $G$ is continuous.
In fact, the vector field $G$ in the IVP also depends on $\omega$, so we should rewrite the IVP as $$\dfrac{dx}{dt}=G(\omega,t,x),\ x(t_{0})=x_{0}.$$ In particular, $G$ is measurable in $\omega$ and continuous in $(t,x)$: indeed, $G(\omega, t,x):=g(x,N_{t}(\omega))$, where $g$ is continuous in both variables and $N_{t}(\omega)$ is measurable in $\omega$ with continuous sample paths.
Proof:
Indeed, choose and fix a continuous function $x_{0}(t):[t_{0}, T]\longrightarrow\mathbb{R}^{d}$ with $x_{0}(t_{0})=x_{0}$. Define a sequence of functions $x_{n}:[t_{0}, T]\times\Omega\longrightarrow\mathbb{R}^{d}$ by $$x_{n+1}(t,\omega):=x_{0}+\int_{t_{0}}^{t}G(\omega, s, x_{n}(s,\omega))\,ds.$$
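For a frozen $\omega$ this iteration can be checked numerically. A minimal sketch, using the stand-in field $G(s,x)=-x$ (my choice for illustration, so the limit is $x_{0}e^{-(t-t_{0})}$) and a trapezoidal rule for the integral:

```python
import numpy as np

def picard(G, x0, t0, T, n_iter=30, m=2001):
    """Iterate x_{n+1}(t) = x0 + int_{t0}^{t} G(s, x_n(s)) ds on a grid."""
    ts = np.linspace(t0, T, m)
    x = np.full(m, x0, dtype=float)  # x_0(t) ≡ x0: a continuous initial guess
    for _ in range(n_iter):
        g = G(ts, x)
        # cumulative trapezoidal rule for t ↦ ∫_{t0}^{t} G(s, x_n(s)) ds
        integral = np.concatenate(
            ([0.0], np.cumsum(0.5 * (g[1:] + g[:-1]) * np.diff(ts))))
        x = x0 + integral
    return ts, x

ts, x = picard(lambda s, y: -y, x0=1.0, t0=0.0, T=1.0)
err = np.max(np.abs(x - np.exp(-ts)))  # distance to the true solution e^{-t}
```

This is of course only the Lipschitz (Picard) picture; for a merely continuous $G$, as in the proof above, one has convergence only along a subsequence via compactness.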
Then it follows from the Arzelà–Ascoli theorem (together with equicontinuity) that $$x(t,\omega)=\lim_{n\rightarrow\infty}x_{n}(t,\omega)\ \text{for all}\ \omega\in\Omega,$$ possibly along a subsequence. (See any proof of Peano's existence theorem; the argument is similar.)
Hence it suffices to prove that the mappings $x_{n}(t,\cdot):\Omega\longrightarrow\mathbb{R}^{d}$ are measurable for all $t\in [t_{0},T]$ and $n\in\mathbb{N}$, which can be done by induction.
Firstly, it is clear that the statement holds for $n=0$. Next, suppose that for some $n\in\mathbb{N}$ and all $t\in[t_{0}, T]$, the function $x_{n}(t,\cdot):\Omega\longrightarrow\mathbb{R}^{d}$ is measurable. Define $x_{n}^{(k)}:[t_{0}, t)\times\Omega\longrightarrow\mathbb{R}^{d}$ by $$x_{n}^{(k)}(s,\omega)=\sum_{i=0}^{k-1}\mathbb{1}_{[t_{i},t_{i+1})}(s)\cdot x_{n}(t_{i},\omega),\ \text{where}\ t_{i}:=t_{0}+\dfrac{i(t-t_{0})}{k},\ \text{for all}\ (s,\omega)\in [t_{0}, t)\times\Omega.$$
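The role of these simple functions can be seen numerically: freezing $x_{n}$ at the grid points gives a piecewise-constant approximation, and the corresponding integrals converge as the partition is refined. A minimal sketch with $t_{0}=0$ and stand-in functions of my own choosing ($G(s,x)=\sin x + s$ for the field, $\cos s$ for $x_{n}$):

```python
import numpy as np

def integral_with_step_approx(G, x, t, k, m=20000):
    """Left Riemann sum of s ↦ G(s, x^{(k)}(s)) over [0, t), where x^{(k)}
    freezes x at the grid points i*t/k (mirroring the simple functions
    x_n^{(k)} in the measurability argument)."""
    s = np.linspace(0.0, t, m, endpoint=False)
    xk = x(np.floor(s * k / t) * (t / k))  # piecewise-constant approximation
    return np.mean(G(s, xk)) * t

# continuous stand-ins for G(ω, s, x) and x_n(·, ω)
G = lambda s, y: np.sin(y) + s
x = lambda s: np.cos(s)

exact = integral_with_step_approx(G, x, 1.0, k=10**6)  # very fine partition
errs = [abs(integral_with_step_approx(G, x, 1.0, k=k) - exact)
        for k in (2, 8, 32, 128)]
```

Each approximating integral is a finite sum of terms involving $x_{n}$ at finitely many grid points, which is what makes measurability in $\omega$ easy to check at every stage.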
Using the fact that $G(\omega,\cdot,\cdot)$ is continuous in $(s,x)$, we obtain $$\lim_{k\rightarrow\infty}\int_{t_{0}}^{t}G(\omega, s, x_{n}^{(k)}(s,\omega))\,ds=\int_{t_{0}}^{t}G(\omega, s, x_{n}(s,\omega))\,ds.$$
For each fixed $k$, the left-hand integral is a finite sum of terms that are measurable in $\omega$, and a pointwise limit of measurable functions is measurable. Therefore, the mapping $\omega\mapsto \int_{t_{0}}^{t}G(\omega, s, x_{n}(s,\omega))\,ds$ is $\mathcal{F}/\mathcal{B}(\mathbb{R}^{d})$-measurable for each $t\in[t_{0}, T]$, which implies that the function $x_{n+1}(t,\cdot)$ is $\mathcal{F}/\mathcal{B}(\mathbb{R}^{d})$-measurable for each $t\in[t_{0}, T]$.
By induction, the mapping $x_{n}(t,\cdot)$ is $\mathcal{F}/\mathcal{B}(\mathbb{R}^{d})$-measurable for all $t\in [t_{0}, T]$ and $n\in\mathbb{N}$.
Note that the above result also holds if the initial value is measurable, i.e., a random variable $x_{0}(\omega)$.