As a computer science major, should I take differential equations, linear algebra, or both?


I've taken three semesters of calculus (although the way my school does it, Calc III excludes Green's, Stokes', and the Divergence/Curl theorems and puts them in a class called Calc IV), and I've met my math requirements. However, I know many CS grads go on to take differential equations and/or linear algebra. Which of these is more useful for computing? And what precisely ARE the uses of these things in computing? If I never study Green's/Stokes' theorems, can I still do well in these courses?


Linear algebra will be immensely useful if you intend to do anything with computer graphics - matrices and vectors are the primary mathematical building blocks for graphics. Differential equations will be more useful if you're interested in modelling physical processes or populations. Personally, I'd consider linear algebra the more useful of the two for a CS major.

Green's, Stokes, etc. aren't particularly important for either, if I recall correctly.
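To make the graphics point concrete, here is a minimal sketch (using numpy; the specific rotation is just an illustrative choice) of how a standard graphics transform is nothing but a matrix-vector product:

```python
import numpy as np

# Rotate 2-D points 90 degrees counterclockwise: a basic graphics
# transform is just multiplication by a rotation matrix.
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

points = np.array([[1.0, 0.0],   # one point per row
                   [0.0, 2.0]])
rotated = points @ R.T           # apply the rotation to every point at once

print(np.round(rotated, 6))     # (1,0) -> (0,1), (0,2) -> (-2,0)
```

Real graphics pipelines use 4x4 matrices in homogeneous coordinates so that translations and perspective projection become matrix products too, but the principle is the same.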


Linear algebra is immensely useful in a wide variety of areas. I'll list a few:

Discrete Geometry: Polyhedral theory, in particular, is an application of linear algebra and a beautiful theoretical discipline in its own right. One application of polyhedral theory is optimization: if the constraints of a mathematical program are linear, they form a polytope. Linear programming (when all the variables are continuous) is solvable in polynomial time. Mixed integer-linear programming is NP-hard, but common approaches to solving MILPs efficiently rely on cutting-plane algorithms (more linear algebra).
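A tiny sketch of the polytope connection (numpy; the instance is made up): an optimum of a linear program is attained at a vertex of the feasible polytope, so for a toy 2-D problem we can simply enumerate the vertices as intersections of pairs of constraint lines.

```python
import numpy as np
from itertools import combinations

# Maximize c.x subject to A x <= b.  The feasible region is a polytope,
# and an LP optimum is attained at a vertex, so for this tiny instance
# we enumerate candidate vertices (intersections of constraint pairs)
# and keep the best feasible one.
c = np.array([3.0, 2.0])
A = np.array([[1.0, 1.0],    # x + y <= 4
              [1.0, 0.0],    # x     <= 3
              [-1.0, 0.0],   # x     >= 0
              [0.0, -1.0]])  # y     >= 0
b = np.array([4.0, 3.0, 0.0, 0.0])

best_val, best_x = -np.inf, None
for i, j in combinations(range(len(A)), 2):
    M = A[[i, j]]
    if abs(np.linalg.det(M)) < 1e-12:
        continue                      # parallel constraints: no vertex
    x = np.linalg.solve(M, b[[i, j]])
    if np.all(A @ x <= b + 1e-9):     # keep only feasible intersections
        val = c @ x
        if val > best_val:
            best_val, best_x = val, x

print(best_x, best_val)   # optimum at the vertex (3, 1)
```

Vertex enumeration is exponential in general; real solvers use the simplex method or interior-point methods, but both are built on exactly this polyhedral picture.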

Stochastic Processes: The applications of statistics and probability theory are as numerous as those of linear algebra, and nearly all of the sophisticated techniques require linear algebra. To mention a few: machine learning and data science, econometrics, mathematical finance, and random walks on graphs.
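For instance, a random walk on a graph is a Markov chain, and its long-run behaviour is an eigenvector computation. A minimal numpy sketch (the transition matrix here is made up):

```python
import numpy as np

# The stationary distribution pi of a Markov chain satisfies pi P = pi,
# i.e. pi is a left eigenvector of the transition matrix P for
# eigenvalue 1.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

eigvals, eigvecs = np.linalg.eig(P.T)      # left eigenvectors of P
k = np.argmin(np.abs(eigvals - 1.0))       # pick the eigenvalue-1 vector
pi = np.real(eigvecs[:, k])
pi /= pi.sum()                             # normalise to a distribution

print(pi)   # the chain spends 5/6 of its time in state 0
```

PageRank is essentially this computation on the web graph, done at scale with power iteration instead of a full eigendecomposition.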

Differential Equations: I believe this has already been mentioned.

Algebraic Combinatorics: In particular, linear algebra is useful when dealing with generating functions. Laszlo Babai, the famous complexity theorist, also has a manuscript called The Linear Algebra Method which you can order from the University of Chicago. His manuscript includes numerous applications to theoretical computer science and discrete geometry.
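One elementary point of contact between recurrences, generating functions, and linear algebra: a linear recurrence is multiplication by a companion matrix, so its n-th term can be read off a matrix power. A small sketch (numpy, Fibonacci as the standard example):

```python
import numpy as np

# The Fibonacci recurrence F(n) = F(n-1) + F(n-2) is multiplication by
# a 2x2 companion matrix M, so M ** n encodes F(n):
#   M ** n = [[F(n+1), F(n)], [F(n), F(n-1)]]
M = np.array([[1, 1],
              [1, 0]])

def fib(n):
    return np.linalg.matrix_power(M, n)[0, 1]

print([fib(n) for n in range(1, 10)])   # [1, 1, 2, 3, 5, 8, 13, 21, 34]
```

With fast exponentiation this gives F(n) in O(log n) matrix multiplications, and diagonalizing M recovers Binet's closed form - the same fact the generating-function approach encodes analytically.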

Automata Theory: There is a linear algebra-esque algorithm to convert a DFA to a regular expression, known as the Brzozowski Algebraic Method. This algorithm is closely related to the notion of a walk on a graph (exponentiating the adjacency matrix) and using matrices to solve recurrences with generating functions.
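The walk-counting fact mentioned above is one line of numpy: the number of walks of length k from vertex u to vertex v is the (u, v) entry of the k-th power of the adjacency matrix. A sketch on a toy graph:

```python
import numpy as np

# Adjacency matrix of a triangle on vertices 0, 1, 2.
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]])

# (A ** 3)[u, v] counts walks of length 3 from u to v.
walks3 = np.linalg.matrix_power(A, 3)
print(walks3[0, 0])   # closed walks of length 3 from vertex 0: 2
```

The trace of A cubed counts every triangle six times (three starting vertices, two directions), which is the standard spectral way to count triangles in a graph.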

Graph Theory: Spectral graph theory is a beautiful and active area of research. Vector spaces of graphs (cycle and cut spaces) are also quite useful. By the way, the max-flow min-cut theorem from graph theory follows from the fact that the max-flow and min-cut problems are dual in the linear programming sense.
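A first taste of spectral graph theory in a few lines of numpy (toy graph): the multiplicity of the eigenvalue 0 of the graph Laplacian L = D - A equals the number of connected components.

```python
import numpy as np

# Graph: two disjoint edges, 0-1 and 2-3, so two connected components.
A = np.array([[0, 1, 0, 0],
              [1, 0, 0, 0],
              [0, 0, 0, 1],
              [0, 0, 1, 0]])
L = np.diag(A.sum(axis=1)) - A           # Laplacian L = D - A

eigvals = np.linalg.eigvalsh(L)          # L is symmetric
components = int(np.sum(np.abs(eigvals) < 1e-9))
print(components)   # 2
```

The second-smallest Laplacian eigenvalue (the algebraic connectivity) measures how well connected the graph is, and its eigenvector underlies spectral partitioning and clustering.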

Algorithm Design: Are you familiar with Matroids? Various structures that satisfy a generalized notion of linear independence are susceptible to greedy algorithms. Spanning tree algorithms are one example of greedy-basis (Matroid) algorithms. Another (non-Matroid) example is an almost completely derandomized parallel bipartite perfect matching algorithm, a recent result published by one of my professors.
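The spanning tree example above can be sketched in a few lines (toy graph; the weights are made up). Kruskal's algorithm sorts the edges and greedily keeps each edge that preserves "independence" - i.e., creates no cycle - which is exactly the Matroid exchange property at work:

```python
def kruskal(n, edges):
    """edges: list of (weight, u, v); returns (MST weight, MST edges)."""
    parent = list(range(n))

    def find(x):                       # union-find with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    total, tree = 0, []
    for w, u, v in sorted(edges):      # greedy: cheapest edges first
        ru, rv = find(u), find(v)
        if ru != rv:                   # independent: adds no cycle
            parent[ru] = rv
            total += w
            tree.append((u, v))
    return total, tree

edges = [(1, 0, 1), (4, 0, 2), (3, 1, 2), (2, 2, 3), (5, 1, 3)]
print(kruskal(4, edges))   # MST of weight 6 with 3 edges
```

The same greedy template provably works for any weighted Matroid, which is why recognizing Matroid structure in a problem immediately hands you an efficient algorithm.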

As a computer scientist, you should very much care about linear algebra. In fact, I would encourage taking the proofs-based vector-spaces version: if you have to choose between a proofs-based course and a numerical one, take the proofs-based course. It will pay off more in the long run, and it is also very different from what you covered in Calc 3.