How far can you get in math by doing random problems and generalizing them in as many possible ways as you can imagine?


Of course, 1) the notion of 'far' would have to be defined, as would 2) what counts as a 'random problem'. And of course 3) different people can think of different generalizations and work through problems at different rates.

Let's say 1) an undergraduate degree in mathematics, just to have some milestone. 2) Take the standard undergraduate mathematics textbooks used in universities, pile all their problems into one big heap, and draw problems from it at random, one after another. If you come across a problem that is subsumed by a generalization of one you have already done, you skip it. 3) Let's say we're talking about an average mathematics undergraduate who is just beginning their studies.

Answer:

Not far at all. The undergraduate math you mention is the product of thousands upon thousands of mathematicians who, decade after decade, put in enormous amounts of study: thinking, re-thinking, picking out the best ideas, re-writing, and refining.

By skipping all that accumulated knowledge, even if you were the next Gauss (or Cauchy, or Euler, or take your pick), you would be starting decades, if not centuries, behind the current state of the art. At best you could aspire to re-discover a tiny part of what is already in some graduate textbook in your library, and you would not even be aware of most modern mathematics.