Among all areas of mathematics, linear algebra is incredibly well understood. I have heard it said that the only problems we can really solve in math are linear problems, and that much of the rest of mathematics involves reducing problems to linear algebra.
But what is so nice about the interchange of a "multiplication" operation and an "addition" operation? Why is this interchange desirable, and why, among the many possible properties one could specify, is linearity so important?
Specifically, I am looking for:
An idea of why the interchange that linearity allows is so powerful, whether by appeal to a categorical argument or some other account of why these particular rules matter
An idea of why linear problems, and linearization, show up so frequently
Linear problems are so useful because they describe small deviations, displacements, signals, etc. well, and because they admit unique solutions. For sufficiently small $x$, $f(x) = a_0 + a_1 x + a_2 x^2 + a_3 x^3 + \ldots$ is very well approximated by $a_0 + a_1 x$. Even the simplest nonlinear equation, $x^2 - 3 = 0$, has two real solutions, making analysis more difficult. A linear equation in one unknown has a single solution (if one exists).
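The quality of that first-order approximation is easy to check numerically. A minimal Python sketch (my choice of example: $e^x$, whose Taylor coefficients at $0$ are $a_0 = a_1 = 1$, so the linearization is $1 + x$ and the error is $\tfrac{x^2}{2} + \ldots$, i.e. $O(x^2)$):

```python
import math

def linearized_exp(x):
    # First-order Taylor approximation of exp at 0: a_0 + a_1*x with a_0 = a_1 = 1
    return 1.0 + x

# As x shrinks by a factor of 10, the error shrinks by roughly 100,
# confirming the O(x^2) behavior of the dropped higher-order terms.
for x in (0.1, 0.01, 0.001):
    error = abs(math.exp(x) - linearized_exp(x))
    print(f"x = {x}: |exp(x) - (1 + x)| = {error:.3e}")
```

The same pattern holds for any smooth $f$: halving the deviation quarters the error, which is why linearization dominates the analysis of small perturbations.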