Basics of mathematics compared to higher levels of mathematics.


Can someone tell me: does one need to know the basics, from algebra to calculus, to understand the higher levels of mathematics, or do those rules not apply?

I'm almost done with the basics; afterwards I plan on getting into the higher levels of mathematics and then going to a college to test my way in.


There are 3 best solutions below


This basic knowledge is absolutely essential for understanding higher levels, just as a writer, say Philip Roth, needs the alphabet for his novels. It would be impossible to understand, for example, the (not really "high"-level) law of quadratic reciprocity $$\left(\frac pq\right)\left(\frac qp \right)=\left(-1\right)^{\dfrac{p-1}{2}\cdot\dfrac{q-1}{2}}$$

if one could not handle the powers of $-1$ or fractional exponents.
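To make the formula concrete, here is a small numerical check of quadratic reciprocity for a few pairs of odd primes. This is my own sketch, not part of the answer: the helper `legendre` computes the Legendre symbol via Euler's criterion, $\left(\frac ap\right) \equiv a^{(p-1)/2} \pmod p$.

```python
def legendre(a, p):
    """Legendre symbol (a/p) for an odd prime p, via Euler's criterion."""
    r = pow(a, (p - 1) // 2, p)  # a^((p-1)/2) mod p is 1, p-1, or 0
    return -1 if r == p - 1 else r

# Check (p/q)(q/p) = (-1)^(((p-1)/2)((q-1)/2)) for some small prime pairs.
for p, q in [(3, 5), (5, 7), (7, 11), (11, 13)]:
    lhs = legendre(p, q) * legendre(q, p)
    rhs = (-1) ** (((p - 1) // 2) * ((q - 1) // 2))
    assert lhs == rhs
print("quadratic reciprocity holds for the sampled prime pairs")
```

Even reading this check requires exactly the basics in question: modular exponentiation and powers of $-1$.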


No, you don't. In principle you could learn (up to research level) formal logic/set theory/model theory, perhaps even set-theoretic and algebraic topology, without ever knowing how to solve $x^2 - 2x + 1 = 0$. In other fields, like analysis and number theory, basic algebra is necessary.
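For the record, the equation mentioned above is exactly the kind of thing basic algebra handles. A quick sketch (my own, using the quadratic formula $x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}$) shows it has the single double root $x = 1$:

```python
import math

# Coefficients of x^2 - 2x + 1 = 0
a, b, c = 1, -2, 1
disc = b * b - 4 * a * c  # discriminant is 0, so both roots coincide
roots = {(-b + s * math.sqrt(disc)) / (2 * a) for s in (1, -1)}
print(roots)  # {1.0}
```

Equivalently, $x^2 - 2x + 1 = (x-1)^2$, which is the factoring view a set theorist could in principle live without.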


My guess (it's a guess, because it's not something you can prove rigorously, and honestly, I don't see how one would find a volunteer to test this empirically) is that although you could technically understand some part of higher mathematics without a thorough understanding of basic algebra, it would be quite a contortionist's act. It would be a little like walking without ever putting your left foot in front of the right. Sure, you could do it, but why?

And I suspect in many ways it's worse than that, depending on what we mean by "a thorough understanding of basic algebra." If you never figure out how to complete the square, then OK, maybe you just understand some problems and not others. But basic algebra also comprises a fundamental understanding of things like "multiplication commutes." Not understanding that, and not being able to comprehend why it might not always be true, and when, will make it pretty difficult to make any real headway into advanced mathematics, I would say.
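The "why it might not always be true" point is worth seeing concretely. A minimal illustration (my own example, not from the answer): multiplication of $2 \times 2$ matrices does not commute, and spotting this requires being fluent enough with ordinary multiplication to notice when its rules break.

```python
def matmul(A, B):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 1], [0, 1]]
B = [[1, 0], [1, 1]]

print(matmul(A, B))  # [[2, 1], [1, 1]]
print(matmul(B, A))  # [[1, 1], [1, 2]]  -- AB != BA
```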

This isn't like many other fields, where we often employ a lie-to-children to make things workable temporarily, to be replaced later by a better understanding that will often produce the same behavior but which is often fundamentally incompatible with the earlier lie-to-children. (Note that the lie-to-children is often used with adults; the phrase is a term of art and is not to be taken literally.) In such cases, one may well be able to ignore the earlier material, because it's mostly invalid. That isn't the case in mathematics, though, for the most part.