What's the explicit difference between difference equations and stochastic difference equations?


I learned difference equations from a mathematics text and am having some trouble with their application in statistics, specifically to stochastic time series.

My time series text reads: "The general solution to a difference equation is defined to be a particular solution plus all homogeneous solutions."

$\ y_{t}=Aa^{t}_{1}+\dfrac {a_{0}}{\left( 1-a_{1}\right) }+\sum ^{\infty }_{i=0}a^{i}_{1}\varepsilon _{t-i} $

I'm familiar with the first two terms of the equation. If "homogeneous" is taken to mean the same as the complementary solution, then my understanding is that there is a "complementary function" -- but that function never contained a random disturbance term. With the stochastic term added, I was hoping for a simple explanation that doesn't require the full derivation.

I'm also unsure what "all homogeneous solutions" means here.

Original Equation: $\ y_{t}=a_{0}+a_{1}y_{t-1}+\varepsilon _{t} $

$\ y_{t}=a_{0}\sum ^{t-1}_{i=0}a^{i}_{1}+a^{t}_{1}y_{0}+\sum ^{t-1}_{i=0}a^{i}_{1}\varepsilon _{t-i} $

$\ y_{t}=a_{0}\sum ^{t-1}_{i=0}a^{i}_{1}+a^{t}_{1}\left( a_{0}+a_{1}y_{-1}+\varepsilon _{0}\right) +\sum ^{t-1}_{i=0}a^{i}_{1}\varepsilon _{t-i} $

$\ y_{t}=a_{0}\sum ^{t+m}_{i=0}a^{i}_{1}+\sum ^{t+m}_{i=0}a^{i}_{1}\varepsilon _{t-i}+a^{t+m+1}_{1}y _{-m-1} $

Letting $m\rightarrow\infty$ with $\left| a_{1}\right| <1$, so that the initial-condition term vanishes:

$\ y_{t}=\dfrac {a_{0}}{1-a_{1}}+\sum ^{\infty }_{i=0}a^{i}_{1}\varepsilon _{t-i}$

Under the assumption of unknown initial conditions, the initial-condition term is instead replaced by the arbitrary homogeneous term $Aa_{1}^{t}$.
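As a quick numerical sanity check of the backward iteration above, the sketch below iterates the recursion $y_t = a_0 + a_1 y_{t-1} + \varepsilon_t$ forward from a known $y_0$ and compares the result with the closed-form expression $a_0\sum_{i=0}^{t-1}a_1^i + a_1^t y_0 + \sum_{i=0}^{t-1}a_1^i\varepsilon_{t-i}$. The parameter values ($a_0=1$, $a_1=0.6$, $y_0=2$) are arbitrary illustrative choices, not from the original post.

```python
import random

# Illustrative (assumed) parameter values; |a1| < 1 for stability.
a0, a1 = 1.0, 0.6
T = 50
random.seed(0)
eps = [random.gauss(0, 1) for _ in range(T + 1)]  # eps[t] = epsilon_t

# Iterate the recursion y_t = a0 + a1*y_{t-1} + eps_t from a known y_0.
y = [2.0]  # y_0
for t in range(1, T + 1):
    y.append(a0 + a1 * y[-1] + eps[t])

# Closed-form solution after t iterations:
# y_t = a0 * sum_{i=0}^{t-1} a1^i  +  a1^t * y_0  +  sum_{i=0}^{t-1} a1^i * eps_{t-i}
t = T
closed = (a0 * sum(a1**i for i in range(t))
          + a1**t * y[0]
          + sum(a1**i * eps[t - i] for i in range(t)))
print(abs(y[t] - closed))  # agrees up to floating-point rounding
```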

BEST ANSWER

It is the usual argument for any linear equation: if $y_t$ and $z_t$ are two different solutions of the inhomogeneous equation, then subtracting the recursions gives $$ (y_t-z_t)=a_1(y_{t-1}-z_{t-1}), $$ which means that the difference of two inhomogeneous solutions is a homogeneous solution. Conversely, the sum of an inhomogeneous solution and a homogeneous solution is again an inhomogeneous solution.

If $L$ is a linear operator, then $L(y)=b$ and $L(z)=b$ imply $L(y-z)=0$, and if $L(v)=0$ then $L(y+v)=L(y)+L(v)=b+0=b$.

In your case, with a stochastic perturbation, you have to pay special attention to convergence if you go the way I did in the comment. However, you can start from the desired result and define $e_t=\sum_{j=0}^\infty a_1^j ε_{t−j}$, which can be shown to converge almost surely (for $|a_1|<1$), so that $e_t=a_1e_{t-1}+ε_t$. Taking the difference of the two recursions, $$ (y_t-e_t)=a_0+a_1(y_{t-1}-e_{t-1}), $$ you get a classical recursion equation $x_t=a_0+a_1x_{t-1}$ without any stochastic terms. As this has the known solution $x_t=Ca_1^t+\frac{a_0}{1-a_1}$, one can reconstruct the solution from it.
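The identity $e_t=a_1e_{t-1}+ε_t$ can be illustrated with a truncated version of the infinite sum: cutting $e_t=\sum_{j=0}^\infty a_1^j ε_{t-j}$ at $J$ terms leaves an error of order $|a_1|^J$, which mirrors why the full series converges for $|a_1|<1$. The values $a_1=0.6$ and $J=200$ are illustrative assumptions.

```python
import random

# Truncate e_t = sum_{j=0}^inf a1^j * eps_{t-j} at J terms and check
# e_t ≈ a1*e_{t-1} + eps_t; the discrepancy is a single a1^J term,
# which is negligible for |a1| < 1 and large J.
a1, J = 0.6, 200  # assumed illustrative values
random.seed(2)
# eps[k] stands for eps_{t-J+k}; the last entry eps[J] is eps_t.
eps = [random.gauss(0, 1) for _ in range(J + 1)]

e_t  = sum(a1**j * eps[J - j]     for j in range(J))  # e_t, truncated
e_t1 = sum(a1**j * eps[J - 1 - j] for j in range(J))  # e_{t-1}, truncated
print(abs(e_t - (a1 * e_t1 + eps[J])))  # tiny: exactly |a1^J * eps[0]|
```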