I remember hearing my professor talk about how $e^x$ shows up in all our differential equations because it is the eigenvector for the derivative operator. Can someone explain and prove this to me?
I have taken Linear algebra and a course on ODEs and a little bit of PDEs.
EDIT: Specifically, I am wondering: you know how you take a matrix that represents a linear operator, subtract $\lambda$ off the diagonal, and then solve for the eigenvalues and eigenvectors? Is there a similar proof that results in $e^x$?
The problem with what you want is that when we use matrices, we usually work in a finite-dimensional vector space. Yet the natural interpretation of the vector $e^x$ (i.e., the function $x \mapsto e^x$) is as a member of some infinite-dimensional vector space (like, say, the vector space of all suitably convergent power series, or of differentiable functions, or something like that). In such vector spaces, matrix algebra becomes a rather unwieldy tool, because matrices themselves then become infinitely-sized objects, and we have to either
deal with convergence issues, or
restrict matrices to a finite number of non-zero entries in every column.
I'm therefore not going to give you a formal proof, but rather a very sketchy idea of how one could proceed, staying as close to finite-dimensional linear algebra as possible, to indeed show that $e^x$ is an eigenvector of $D = \frac{d}{dx}$.
We're going to work in a vector space $V$ of suitably convergent power series, but I'm going to mostly ignore convergence issues. We're going to treat $$ B = \left\{ b_k = \frac{x^k}{k!} \,:\, k \in \mathbb{N} \right\}, \quad\text{ with the understanding that $b_0 = 1$,} $$ as a basis of some sort, i.e. assume that we can represent each vector $v$ as $$ v = \sum_{k=0}^\infty c_kb_k = \sum_{k=0}^\infty c_k\frac{x^k}{k!} \text{.} $$ Note that $B$ isn't a basis in the usual vector-space sense, since we resort to infinite series here. It would be a basis in the Hilbert-space sense, if we cared to turn $V$ into a proper Hilbert space, which I won't do here. As I said, this is very sketchy.
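As a concrete sanity check (a small Python sketch using sympy, not part of the argument itself): in this basis, the coordinate $c_k$ of a function is just its $k$-th derivative at $0$, and for $e^x$ these are all $1$.

```python
import sympy as sp

x = sp.symbols('x')
f = sp.exp(x)

# In the basis b_k = x^k / k!, the coordinate c_k of a function f
# is its k-th derivative evaluated at 0. For e^x, every derivative
# is e^x again, so every coordinate is e^0 = 1.
coords = [sp.diff(f, x, k).subs(x, 0) for k in range(6)]
print(coords)  # [1, 1, 1, 1, 1, 1]
```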
We now observe how our differentiation operator $D$ behaves on the elements of that basis $B$. We obviously have $$ Db_k = \frac{d}{dx} \frac{x^k}{k!} = \frac{kx^{k-1}}{k!} = \frac{x^{k-1}}{(k-1)!} = b_{k-1} \text{ for $k \geq 1$, and } Db_0 = \frac{d}{dx}1 = 0 \text{.} $$ Thus, represented as an (infinitely large!) matrix, $D$ looks something like this $$ M_D = \begin{pmatrix} 0 & 1 & 0 & 0 & 0 & \ldots \\ 0 & 0 & 1 & 0 & 0 & \ldots \\ 0 & 0 & 0 & 1 & 0 &\ldots \\ 0 & 0 & 0 & 0 & \ddots&\ddots \\ \vdots&\vdots&\vdots&\vdots& \ddots&\ddots \end{pmatrix}\text{.} $$ At this point, we have to leave the path set out by finite-dimensional linear algebra, though, because trying to make sense of the determinant of such matrices gets us into trouble. For $M_D$, we might get away with saying $\det M_D = 0$ - after all, it's a triangular matrix with only zeros on the diagonal. But how would we interpret $$ \det(\lambda I - M_D) = \left|\begin{matrix} \lambda & -1 & 0 & 0 & 0 & \ldots \\ 0 & \lambda & -1 & 0 & 0 & \ldots \\ 0 & 0 & \lambda & -1 & 0 &\ldots \\ 0 & 0 & 0 & \lambda & \ddots&\ddots \\ \vdots&\vdots&\vdots&\vdots& \ddots&\ddots \end{matrix}\right|\text{?} $$ Using the rules of finite-dimensional linear algebra, we'd have to conclude that the result is $\lambda^\infty$, which makes no sense. So instead, we look for eigenvectors directly, i.e. some $v = (c_0,c_1,\ldots)$ for which $$ M_D v = \lambda v \text{.} $$ Looking at the matrix, we can easily see that this indeed holds for $v = (1,1,\ldots)$ and $\lambda = 1$, or in other words that $$ v = (1,1,\ldots) \text{ is an eigenvector for the eigenvalue } \lambda = 1 \text{.} $$ So which function does $v$ represent? Per the definition of our basis above, it's the function defined by the power series $$ \sum_{k=0}^\infty 1 \cdot b_k = \sum_{k=0}^\infty \frac{x^k}{k!} \text{,} $$ which of course is $e^x$.
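You can also see this shift-matrix picture numerically (a rough sketch only — truncating $M_D$ to a finite block is precisely the kind of convergence shortcut warned about above). With numpy, the truncated $M_D$ maps the all-ones coefficient vector almost to itself:

```python
import numpy as np

N = 8
# Truncation of M_D to an N x N block: ones on the superdiagonal,
# i.e. a left-shift on coefficient vectors (c_0, c_1, ..., c_{N-1}).
M_D = np.diag(np.ones(N - 1), k=1)

v = np.ones(N)  # coefficients of e^x in the basis b_k = x^k / k!

w = M_D @ v
print(w)  # [1. 1. 1. 1. 1. 1. 1. 0.]
# All but the last entry match v; the trailing 0 is a truncation
# artifact, since the infinite tail of the series was cut off.
```

Sending $N \to \infty$ makes the truncation artifact disappear, recovering $M_D v = v$ exactly.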