I know that this is more a discussion than a math question, but I'd like to know. Computer engineering student here. When studying, I like to go through all of the book's definitions and proofs, and then finish all the exercises. Just did my limits exam. I'd like to do the same thing for the epsilon-delta chapter of Stewart's book, but right now I'm in a crunch for time, and we're starting derivatives. Should I just skip it?
Is it crucial for an engineering student to "master" the epsilon-delta definitions of limits?
676 Views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail). There are 3 answers below.
Depends. I don't think my friends with PhDs in physics know how to do them. I never encountered them in my own physics classes, and I'd expect the math there to be more thorough than in engineering.
That said, there are situations where it could give you an advantage. The symmetric derivative, $\frac{f(x+h)-f(x-h)}{2h}$ as $h \to 0$, offers a better way to calculate a numerical derivative than the one-sided quotient in the formal definition; however, the symmetric derivative exists at points where the actual derivative does not, and it takes the $\epsilon$-$\delta$ definition to see the difference. There are probability distributions that make intuitive sense to add together, but an $\epsilon$-$\delta$ argument would make you doubt that intuition. Finite element methods don't always converge, or at least not rapidly, and $\epsilon$-$\delta$ estimates can tell you when that is the case.
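To make the symmetric-derivative point concrete, here is a minimal Python sketch (my own illustration; the helper names `forward_diff` and `symmetric_diff` are hypothetical, not from any library). It shows the symmetric quotient converging faster on a smooth function, yet also returning a value for $|x|$ at $0$, where no derivative exists:

```python
def forward_diff(f, x, h):
    """One-sided quotient straight from the limit definition; error is O(h)."""
    return (f(x + h) - f(x)) / h

def symmetric_diff(f, x, h):
    """Symmetric quotient; error is O(h**2) when f'(x) exists."""
    return (f(x + h) - f(x - h)) / (2 * h)

f = lambda x: x ** 3  # f'(1) = 3 exactly

for h in (1e-2, 1e-3, 1e-4):
    print(f"h={h:g}  forward error={abs(forward_diff(f, 1.0, h) - 3.0):.2e}"
          f"  symmetric error={abs(symmetric_diff(f, 1.0, h) - 3.0):.2e}")

# The catch: |x| has no derivative at 0, yet the symmetric quotient
# cheerfully returns 0 there. Existence is exactly what the
# epsilon-delta definition checks and this shortcut does not.
print(symmetric_diff(abs, 0.0, 1e-6))  # 0.0
```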
Understanding $\epsilon$-$\delta$ arguments can help you avoid mistakes like these, and it will give you an advantage over those who don't understand them.
+1 for doing all the exercises. That said, if this is not in the actual curriculum for your course, then the time-crunch argument should prevail: skip it for now, even though it's in the book.
In the longer run, you might appreciate its elegance and power. The underlying idea is to validate useful informal everyday limit vocabulary like "approaches" by a formal structure that codifies the idea of "infinitesimally close" using infinitely many increasingly restrictive inequalities.
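For concreteness, $\lim_{x\to a} f(x) = L$ unpacks to: for every $\epsilon > 0$ there is a $\delta > 0$ such that $0 < |x - a| < \delta$ implies $|f(x) - L| < \epsilon$. A quick worked example: to show $\lim_{x\to 2}(3x+1) = 7$, note that $|(3x+1) - 7| = 3|x-2|$, so for any given $\epsilon$ you can take $\delta = \epsilon/3$; one scheme of inequalities handles every tolerance at once.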
Come back to $\epsilon$-$\delta$ when you have time, perhaps when the semester ends. Then, when you see it in more advanced courses, it will be for the second time.
It helps.
An engineer never has to understand these concepts as well as a mathematician does, but they always help. I will answer this specifically for electrical and electronics engineering.
I am an electrical and electronics engineer, and I have been involved in teaching too. I can safely say these ideas come around quite a few times in the coming years: you will see them again in differential equations, Fourier transforms, feedback systems, and information theory, though that last one is usually a grad-level course.
What happens is that each time you give up on something and memorise it as it is, it comes back to haunt you in later years: either you exert extra effort to cover up the weakness, or you memorise yet more material as it is. After a certain point, especially if you do something heavily theoretical, you realise you cannot really learn anything anymore and you just memorise whatever is in front of you.
My suggestion: don't leave too many blank spots in your first year. There will be plenty of other things you'll be tempted to give up on learning later.