Euler's identity, of course, states that $e^{i \pi} = -1$, which follows from the fact that $e^{ix} = \cos(x) + i \sin(x)$. The trouble I'm having is that this second equation seems to be more of a definition than a result, at least from what I've read; it just happens to be convenient. The use of radians as the "pure-number" input to the trig functions raises a similar question of convenience -- would it be fundamentally wrong to define sine and cosine to behave exactly as they do now, except with period $1$ rather than $2 \pi$? In such a system, $e^{i \pi} = \cos(\pi) + i\sin(\pi) = \cos(\pi - 3) + i\sin(\pi - 3)$, or, converting back into our $2\pi$-period system to get a numerical value, $\cos(2\pi(\pi - 3)) + i\sin(2\pi(\pi - 3))$, which is approximately $0.630 + 0.777i$. (Hopefully I did that right.)
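As a sanity check on that arithmetic, here is a small Python sketch (my own illustration, not part of the question), assuming the period-1 functions are defined by $\cos_1(x) = \cos(2\pi x)$ and $\sin_1(x) = \sin(2\pi x)$:

```python
import math

# Hypothetical period-1 trig functions: one full cycle as x goes from 0 to 1.
def cos1(x: float) -> float:
    return math.cos(2 * math.pi * x)

def sin1(x: float) -> float:
    return math.sin(2 * math.pi * x)

# Blindly reusing Euler's formula with the period-1 functions,
# "e^{i*pi}" would become cos1(pi) + i*sin1(pi).
z = complex(cos1(math.pi), sin1(math.pi))
print(z)  # roughly 0.630 + 0.777i, nowhere near -1
```

The factor $2\pi$ appears because compressing the period from $2\pi$ to $1$ means stretching the argument by $2\pi$ when converting back.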
Since there are equally valid systems in which $e^{i \pi}$ gives an inelegant result, I'm asking whether the fact that $e^{i \pi} = -1$ really demonstrates some hidden connection between $e$, $\pi$, and the real and imaginary numbers, given that it rests largely on what seem to me to be arbitrary definitions of convenience rather than fundamental mathematical truths.
The real function $\exp: \mathbb R \to \mathbb R$ is the unique solution to the initial value problem $f'(x)=f(x)$, $f(0)=1$. The complex function $\exp: \mathbb C \to \mathbb C$ is defined in exactly the same way - as the unique complex-differentiable function satisfying the same initial value problem. That such a solution exists over $\mathbb C$ is not obvious at all; it has to be proven, and the easiest proof uses the Taylor series expansion of $\exp(z)$. In fact, this is the unique way of extending $\exp$ from $\mathbb R$ to $\mathbb C$ while keeping it differentiable (uniqueness follows from the identity theorem for analytic functions).
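To make the initial-value-problem characterization concrete, here is a quick numerical sketch (my own illustration, not part of the answer): integrating $f' = f$, $f(0) = 1$ with forward Euler steps recovers $e$ at $x = 1$, with no trigonometry or series in sight.

```python
import math

def euler_exp(x: float, steps: int = 1_000_000) -> float:
    """Approximate the solution of f'(x) = f(x), f(0) = 1 at x via forward Euler."""
    h = x / steps
    f = 1.0
    for _ in range(steps):
        f += h * f  # f(x+h) ~ f(x) + h*f'(x) = f(x) + h*f(x)
    return f

print(euler_exp(1.0))  # close to math.e = 2.71828...
```

With $n$ steps this computes $(1 + x/n)^n$, which is one of the classical definitions of $\exp(x)$.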
It just so happens that, when $\exp$ is defined this way, we get the identity $\exp(ix) = \cos(x)+i\sin(x)$, where $\cos$ and $\sin$ use radians. There is no choice here; this is simply what comes out of defining $\exp$ in the only way possible that is suited for analysis.
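This is easy to check numerically (my own sketch, standard library only): Python's `cmath.exp` is exactly this analytic extension, and `math.cos`/`math.sin` use radians, so the two sides agree to rounding error.

```python
import cmath
import math

# Compare exp(ix) against cos(x) + i*sin(x) at a few sample points (radians).
for x in (0.0, 1.0, math.pi / 3, math.pi):
    lhs = cmath.exp(1j * x)
    rhs = complex(math.cos(x), math.sin(x))
    print(x, abs(lhs - rhs))  # differences are at rounding level

print(cmath.exp(1j * math.pi))  # essentially -1, up to floating-point error
```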
To see this using Taylor series, you need to know the Taylor series for $\exp$, $\cos$, and $\sin$, where the latter two use radians. I'll take these as known:
$$\exp(z) = \sum_{k=0}^\infty \frac{z^k}{k!} = 1+z+\frac{z^2}{2!}+\frac{z^3}{3!}+\frac{z^4}{4!} + \cdots,$$
$$\sin(z) = \sum_{k=0}^\infty (-1)^k\frac{z^{2k+1}}{(2k+1)!} = z - \frac{z^3}{3!}+\frac{z^5}{5!}-\frac{z^7}{7!}+\cdots,$$
and
$$\cos(z) = \sum_{k=0}^\infty (-1)^k\frac{z^{2k}}{(2k)!} = 1-\frac{z^2}{2!}+\frac{z^4}{4!}-\frac{z^6}{6!}+\cdots.$$
The idea is simply to substitute $ix$ for $z$ in the series for $\exp(z)$. Here is what you get with the first few terms:
$$\exp(ix) = 1+(ix)+\frac{(ix)^2}{2!}+\frac{(ix)^3}{3!}+\frac{(ix)^4}{4!}+\frac{(ix)^5}{5!}+\cdots$$ $$=1+ix-\frac{x^2}{2!}-i\frac{x^3}{3!}+\frac{x^4}{4!}+i\frac{x^5}{5!}-\cdots$$ $$=\left(1-\frac{x^2}{2!}+\frac{x^4}{4!}-\cdots\right)+i\left(x-\frac{x^3}{3!}+\frac{x^5}{5!}-\cdots\right)$$
where I've collected real and imaginary parts in the last step. The observation is simply that the real part matches the Taylor series for $\cos(x)$ and the imaginary part matches the Taylor series for $\sin(x)$, which gives exactly $\exp(ix) = \cos(x) + i\sin(x)$.
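The series manipulation above can be verified numerically (a sketch of my own, not from the answer): summing the Taylor series of $\exp$ at $z = ix$ and splitting into real and imaginary parts reproduces $\cos(x)$ and $\sin(x)$.

```python
import math

def exp_partial(z: complex, n_terms: int = 30) -> complex:
    """Partial sum of the Taylor series sum_k z^k / k!."""
    total, term = 0j, 1 + 0j
    for k in range(n_terms):
        total += term
        term *= z / (k + 1)  # turn z^k/k! into z^(k+1)/(k+1)!
    return total

x = 2.0
s = exp_partial(1j * x)
# Real part collects the cos(x) series, imaginary part the sin(x) series.
print(s.real, math.cos(x))
print(s.imag, math.sin(x))
```

Thirty terms is plenty here: for $|z| = 2$ the dropped tail is smaller than double-precision rounding error.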