Do other people use Strang's convention for integrals?


Gilbert Strang's very vivid textbook Differential Equations and Linear Algebra uses a convention for integrals which seems unusual to me. It is very handy in this context, though, and I would like to know if other people use it. In effect he adopts the convention that an integral $\int_{a}^{b}f(s)ds$ only goes forwards from $a$ to $b$. That is:

$$\int_{a}^{b}f(s)ds = \begin{cases} 0, & \mbox{if } b\leq a \\ F(b)-F(a) & \mbox{when } a\leq b\mbox{ and } \frac{dF}{ds}=f(s). \end{cases}$$
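For concreteness, here is a minimal sketch of the convention in code. The name `strang_integral` and the midpoint-rule quadrature are my own illustrative choices, not anything from the book:

```python
import math

def strang_integral(f, a, b, n=100_000):
    """Integrate f from a to b under Strang's one-way convention:
    the result is 0 whenever b <= a (the integral never runs backwards)."""
    if b <= a:
        return 0.0
    h = (b - a) / n
    # Midpoint Riemann sum approximating F(b) - F(a).
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# Under the usual convention, reversing the limits flips the sign;
# under Strang's convention the reversed integral is simply 0.
print(strang_integral(math.exp, 0.0, 1.0))  # ≈ e - 1
print(strang_integral(math.exp, 1.0, 0.0))  # 0.0, not -(e - 1)
```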

You can see this in his particular solution of the first-order linear differential equation with the shifted Heaviside function $H(t-T)$ as source. That equation is $$y'-ay\ =\ H(t-T).$$ Here $t$ is the usual time variable, and $T$ is a constant time. He gives the particular solution with initial value $y(0)=0$ as $$y_p\ =\ e^{at}\int_{T}^{t}e^{-as}\,ds \ =\ \frac{1}{a}\left(e^{a(t-T)} - 1\right).$$ The second equality is right if $t>T$, but for $t<T$ we must use Strang's convention for integrals.
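As a sanity check, here is a small numerical sketch (assuming the closed form $y_p = \frac{1}{a}\left(e^{a(t-T)}-1\right)$ for $t>T$ and $y_p=0$ for $t\le T$, with illustrative values $a=0.5$ and $T=2$ of my own choosing) confirming that $y_p' - a\,y_p$ reproduces the Heaviside source:

```python
import math

def y_p(t, a=0.5, T=2.0):
    """Particular solution of y' - a*y = H(t - T) with y(0) = 0:
    zero before the source switches on, (e^{a(t-T)} - 1)/a afterwards."""
    return 0.0 if t <= T else (math.exp(a * (t - T)) - 1.0) / a

def residual(t, a=0.5, T=2.0, h=1e-6):
    """Central finite-difference check of y' - a*y at time t;
    it should equal H(t - T): 0 before T, 1 after T."""
    dy = (y_p(t + h, a, T) - y_p(t - h, a, T)) / (2 * h)
    return dy - a * y_p(t, a, T)

print(round(residual(1.0), 6))  # before T: source is 0
print(round(residual(3.0), 6))  # after T: source is 1
```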

Do others use this convention?

I can add: this book takes a very quick and practical approach much of the time. It assumes without comment that (nearly) every function is real analytic, and while it points out some places where a solution goes to infinity, it does not remark that this conflicts with assuming functions are limits of their Taylor series around arbitrary points. It gives no proofs of existence or uniqueness of solutions to ODEs, except by implicitly suggesting that these follow from integrating Taylor series term by term.

But it is an extremely vivid book, and I think it serves well for a first course on ODEs.

The convention suits Strang's purposes. You can see above that it lets him write solutions more compactly. He is never interested in moving the time variable backwards, and he usually cares more about the long-run behavior of solutions than about what they do before some given time $T$.

I suspect that, outside of Strang's textbook, it would be engineers or physicists rather than analysts who use this convention.

1 Answer


If I understand you correctly, this isn't just a convention; it's actually wrong. It should be true that $\int_b^a f(s) \, ds = - \int_a^b f(s) \, ds$, and this is in fact a theorem, for we have:

$$ 0 = \int_a^a f(s) \, ds = \int_a^b f(s) \, ds + \int_b^a f(s) \, ds $$

This uses just two basic properties of integration (additivity over adjacent intervals, and $\int_a^a f(s)\,ds = 0$), and it holds for any $a$ and $b$.

So something is very wrong here!

Now, you could define your integral so that this argument doesn't apply: if you somehow built into your definition that integrals only run forwards, the identity above would fail. But simply taking the usual definition in terms of Riemann sums, one can prove the additivity and linearity properties needed to make the argument work. This means you either have a bad convention on your hands, or there must be some alternative definition of the integral in use.
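To illustrate that last point: with the ordinary Riemann-sum definition, the signed step $h=(b-a)/n$ does all the work, and the reversal rule $\int_b^a = -\int_a^b$ falls out with no extra convention needed. A minimal sketch (function name and test integrand are illustrative choices of mine):

```python
def riemann(f, a, b, n=100_000):
    """Left Riemann sum with signed step h = (b - a)/n.
    Nothing here assumes a <= b: when b < a the step is negative,
    so the sign reversal emerges automatically from the definition."""
    h = (b - a) / n
    return sum(f(a + i * h) for i in range(n)) * h

forward = riemann(lambda s: s * s, 0.0, 1.0)   # ≈ 1/3
backward = riemann(lambda s: s * s, 1.0, 0.0)  # ≈ -1/3
print(forward, backward, forward + backward)   # sum ≈ 0, as additivity requires
```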