Dynamical systems, causality and derivatives order


Talking about the input/output representation of a dynamical system, the professor said that the equation(s) involved must satisfy this condition in order for the system to qualify as "causal":

the greatest derivative order of the input should not exceed the greatest derivative order of the output

He justified this by saying that if we dropped the condition, we could imagine a system described by

$$y(t) = \dot{u}(t)$$

which is not feasible in our physical world, where causes must precede effects. He said: "if we knew the derivative of the input, we'd know the future".

At first sight, I can't fully understand this assertion. If it is true, then why doesn't the very knowledge of $u(t)$ imply knowledge of the future? I think this misunderstanding is due to the fact that I have just started studying systems, so maybe I'm not looking at it from the right perspective.

However, there's yet another problem: the quoted condition does not say "derivatives of the input are not allowed". They may appear, provided their maximum order respects the bound. Why does this constraint imply causality?

Finally, I'd be very grateful if you could point out the main concepts that link systems, derivatives, and the future.

Thanks a lot


There are 4 answers below.

Answer 1:

Taking your professor's example:

$$y(t) = \dot{u}(t) \implies y(t) = \lim_{\delta \to 0} \frac{u(t+\delta)-u(t)}{\delta}$$

If we assume we can't know $u(t+\delta)$ for $\delta > 0$ (can't know the future), then this limit cannot be calculated in an applied setting.
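The asymmetry between the two difference quotients can be sketched in code (my own illustration; the helper names `forward_diff` and `backward_diff` are made up, not from the answer):

```python
import math

def forward_diff(u, t, delta=1e-6):
    # Needs u(t + delta): a sample of the input *after* time t.
    return (u(t + delta) - u(t)) / delta

def backward_diff(u, t, delta=1e-6):
    # Needs only u(t) and u(t - delta): samples up to the present.
    return (u(t) - u(t - delta)) / delta

t = 1.0
print(forward_diff(math.sin, t))   # ≈ cos(1) ≈ 0.5403, but peeks at a future sample
print(backward_diff(math.sin, t))  # ≈ cos(1) as well, using only past samples
```

When $u$ is differentiable, both quotients converge to the same value, which is exactly the point the later answers make; the two versions differ only in which samples they read.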

Answer 2:

Consider the discrete-time samples of the input, $$u_n = u(t_0 + n\Delta t),$$ and the forward-difference approximation of its derivative, $$\dot u_n \approx \frac{u_{n+1} - u_n}{\Delta t},$$ where $n$ is the current time step and $\Delta t$ is the (finite) step size. Clearly, computing this involves knowing the input at step $n+1$, which lies in the future. At present, we only know $u_k$ for $k$ from $0$ to $n$.

Edit:

To answer your 2nd question: consider any law of physics. I don't remember who first proved this (it may have been Landau or Noether), but the statement is that all laws of physics must fix the same derivative order. In the case of our universe, it appears to be the 2nd order. A short search turns up modern treatments, e.g.

http://www.academia.edu/2995322/Why_Physics_Uses_Second_Derivatives

Consider an apple falling on Newton's head. In the first case, we know the force of gravity, which fixes the 2nd time derivative and thus gives us a 2nd-order ODE. We solve that ODE to find the apple's velocity and position as functions of time for some initial conditions. Given the above discrete representation of derivatives, we can model the fall as a step-wise process: the laws of physics always tell us the acceleration at step $n$, and that in turn tells us the velocity and position at step $n+1$, given the velocity and position at step $n$.

Now, in the 2nd case, let's say the laws of physics tell us what the velocity should be. We formulate a 1st-order ODE, solve it, and find the position as a function of time. Again we can formulate the step-by-step process for velocity and position. However, now somebody comes along who knows all the velocities at steps $0$ to $n$ and wants to know the current acceleration at step $n$. We will fail to answer that question, as the acceleration at step $n$ depends on the future velocity at step $n+1$, which we do not know.
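The two cases can be sketched numerically (my own illustration, with a hypothetical $g = 9.81\ \mathrm{m/s^2}$ and step size; the forward-Euler scheme is an assumption, not part of the answer):

```python
g, dt = 9.81, 0.01   # assumed gravity and step size

# Case 1: physics fixes the 2nd derivative (a = -g). Knowing the state
# at step n is enough to advance to step n+1 -- a causal recipe.
x, v = 10.0, 0.0          # initial position and velocity
for n in range(100):
    a = -g                # law of physics: acceleration at step n
    v = v + a * dt        # velocity at step n+1
    x = x + v * dt        # position at step n+1

# Case 2: suppose instead we are handed velocities v[0..n] and asked for
# the acceleration *now*. The forward difference needs v[n+1]:
vs = [-g * (k * dt) for k in range(101)]   # velocities up to step n = 100
n = 100
# a_now = (vs[n + 1] - vs[n]) / dt   # IndexError: vs[n+1] is in the future
a_past = (vs[n] - vs[n - 1]) / dt    # only a *past* acceleration is available
print(x, a_past)
```

Stepping forward from known state (case 1) never asks for future data; recovering a higher derivative from lower-derivative samples (case 2) does.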

Hope this helps :)

Answer 3:

I don't agree with this, at least not with the specific example given. If $u$ is differentiable at $t$, then $$\lim_{\delta \to 0} \frac{u(t+\delta)-u(t)}{\delta} = \lim_{\delta \to 0^-} \frac{u(t+\delta)-u(t)}{\delta},$$ i.e. we can approach the limit from the left and hence only use values of $u$ up to time $t$.

My definition of causality would be: $u_1(t) = u_2(t)$ for $t < t_0$ $\Rightarrow$ $y_1(t) = y_2(t)$ for $t < t_0$. For linear systems this is equivalent to: $u(t) = 0$ for $t < t_0$ $\Rightarrow$ $y(t) = 0$ for $t < t_0$. Clearly the system $y(t) = \dot{u}(t)$ has this property.
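This definition can be checked numerically for a backward-difference approximation of the differentiator (a sketch of my own; the signals `u1`, `u2` and the switch time `t0` are arbitrary choices):

```python
import math

dt = 1e-3
t0 = 1.0
ts = [k * dt for k in range(2000)]   # time grid; t0 sits at index 1000

def u1(t):
    return math.sin(t)

def u2(t):
    # Agrees with u1 before t0, deviates afterwards.
    return math.sin(t) if t < t0 else math.sin(t) + (t - t0) ** 2

def y(u, k):
    # Backward-difference "differentiator": reads only samples up to step k.
    return (u(ts[k]) - u(ts[k - 1])) / dt

# Inputs agree for t < t0, so the outputs agree there too.
assert all(y(u1, k) == y(u2, k) for k in range(1, 1000))
# After t0 the inputs differ, and so may the outputs.
print(y(u1, 1500), y(u2, 1500))
```

Because the approximation only ever reads past samples, inputs identical before $t_0$ necessarily give identical outputs before $t_0$, which is exactly the causality condition above.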

I will add, however, that the system is unphysical for a different reason, namely that it is unstable (consider for example $u(t) = e^{-t^2}$).

Answer 4:

I've been asking myself the same question because the topic was introduced the same way in our lecture exercises. This is an old question, but I believe it's still relevant, and it also hasn't been closed.

I have seen a lot of explanations similar to those of Alyosha and user237392 in other forums.

However, this condition on the derivative order does not seem to appear in the context of causality in current literature anymore. In almost all books (11 out of 10) I've found on system theory / control theory, the condition

the greatest derivative order of the input should not exceed the greatest derivative order of the output

is not given in the context of causality at all, but as a condition for technical realizability. One reason given is that ideal differentiation would mean, for example, that the response to a step input is a delta impulse, which would mean destruction or saturation of the system. Also, the transfer function would be $G(s) = s$, so with $|G(j\omega)|$ being the amplification of a periodic input with frequency $\omega$, we would have

$|G(j\omega)| \rightarrow \infty$, for $\omega \rightarrow \infty$,

a property which no physical system has.
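The unbounded amplification is easy to illustrate (a small sketch of my own, evaluating $G(j\omega)$ with Python's complex type):

```python
# Gain of the pure differentiator G(s) = s at frequency w: |G(jw)| = w,
# so a bounded sinusoid of frequency w is amplified by a factor of w,
# without bound as w grows.
for w in [1.0, 10.0, 1000.0]:
    gain = abs(complex(0.0, w))   # |G(jw)| with s = jw
    print(w, gain)                # gain equals w exactly
```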

Personally, I agree with what @Étienne Bézout said, more explicitly:

Causality is defined as:
For inputs $u_1, u_2$ and system responses (outputs) $y_1, y_2$, the system is causal if

$u_1(t) = u_2(t)$ for $t \in [0, T]$ $\Rightarrow$ $y_1(t) = y_2(t)$ for $t \in [0,T]$

Now applying this to the Differentiator $ y(t) = \dot{u}(t)$:

If the derivative of a function exists, then the left and right limits coincide and the limit is unique. So you can calculate the derivative by approaching from the left, without requiring future values of the input.

From the uniqueness it follows that

$u_1(t) = u_2(t)$ for $t \in [0, T]$ $\Rightarrow$ $\dot{u}_1(t) = \dot{u}_2(t)$ for $t \in [0, T]$ $\Rightarrow$ $y_1(t) = y_2(t)$ for $t \in [0,T]$.

The last implication follows from the fact that $y(t) = \dot{u}(t)$ by definition of the Differentiator.

If I haven't made an error in this line of reasoning, then the Differentiator satisfies the definition of causality, even though the highest derivative order of the input is higher than that of the output.

Most of the answers that claim the opposite use the fact that the limits from both sides coincide to suggest that "you would be able to look into the future", but this is simply a consequence of the differentiability of the input, and it does not violate the definition of causality above in any way I'm aware of.