I keep seeing power series throughout mathematics, disguised in many different shapes, yet I can't put my finger on what is fundamentally being expressed.
Some examples:
Arabic numerals are just a condensed notation for a finite power series: $$ 123 = 3\cdot 10^0 + 2\cdot 10^1 + 1\cdot 10^2 $$
Binary and hex just replace the 10 with 2 or 16 respectively and draw the coefficients from a different digit set.
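To make the analogy concrete, here is a minimal sketch (the function name `from_digits` is my own, just for illustration) that reads a numeral in any base as the evaluation of a finite power series:

```python
def from_digits(digits, base):
    """Evaluate the finite power series sum(d_i * base**i),
    with digits given least-significant first."""
    return sum(d * base**i for i, d in enumerate(digits))

# 123 in base 10: coefficients (3, 2, 1) for 10^0, 10^1, 10^2
print(from_digits([3, 2, 1], 10))    # 123
# Binary 1011 = 1 + 2 + 8
print(from_digits([1, 1, 0, 1], 2))  # 11
# Hex 7B = 11 + 7*16
print(from_digits([11, 7], 16))      # 123
```

Changing the base simply means evaluating the same kind of series at a different point.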
Then there are the Geometric Series, Taylor Series, Fourier Series, Laplace Transform, Z-transform, Moment Generating Functions, Probability Generating Functions, Formal Power Series, and Generating Functions.
Some of these are synonymous to varying degrees, some are special cases of one another. But I can't quite bring order into this whole mess.
Edit: Let me elaborate on this question a little:
In the case of the Fourier Series, an analogy that is often made is a projection onto an orthogonal basis in linear algebra. We create a new world whose atoms are the powers of a complex exponential and reconstruct our old object by combining these new atoms into the same shape as before. It's a little like swapping out the periodic table, or (a little more realistically) re-encoding a picture in a different file format.
The purpose of this change of reference frame is to hopefully expose patterns that weren't previously visible. An analogy that Douglas Hofstadter makes in Gödel, Escher, Bach is looking at a vineyard from different angles. What seems like chaos from one angle can be highly ordered when viewed from another.
Viewed in this light, the question that haunts me about power series is simply: what is it about successively rising powers of $x$ that makes them so natural as a basis for these new spaces? In linear algebra it strikes me as completely obvious why the Euclidean basis vectors are so often chosen, but powers of $x$ offer no such intuition to me at all.
As your question lacks an actual question, it's kinda hard to answer. But I'll give it a shot.
In my view, power series sit somewhere between Algebra and Analysis and are often a tool to switch from one point of view to the other. As an example, take generating functions.
Let $\mathcal A$ be a set with a size function $|\cdot|$ such that, for every $n \in \mathbb N$, the number of elements of size $n$ is finite. Formally, you simply define the power series $$ A(z) = \sum_{n \ge 0} A_n z^n $$ where $A_n = | \{ a \in \mathcal A \mid |a|= n\}|$ counts the elements of $\mathcal A$ of size $n$. This (formal) power series is known as the generating function of the set $\mathcal A$. Up to this point, the definition is purely algebraic: we don't need to consider "analytic" questions like the radius of convergence of the power series. And applying algebraic operations like $+$ or $\cdot$ to generating functions $A(z)$ and $B(z)$ corresponds directly to set-theoretic constructions on the underlying sets, in this case the disjoint union for the former and the Cartesian product for the latter.
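The product rule is worth seeing in coefficients: multiplying two power series convolves their coefficient lists, and that convolution is exactly the count of ordered pairs. A small sketch (names `gf_product`, `die` are mine, chosen for the classic two-dice example):

```python
def gf_product(a, b):
    """Cauchy product of two generating functions given as
    coefficient lists: c_n = sum_k a_k * b_{n-k}."""
    c = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            c[i + j] += ai * bj
    return c

# One die: one object of each size 1..6, i.e. z + z^2 + ... + z^6
die = [0, 1, 1, 1, 1, 1, 1]
two_dice = gf_product(die, die)

# Coefficient of z^7 counts the pairs (i, j) with i + j = 7
print(two_dice[7])     # 6
print(sum(two_dice))   # 36, all ordered pairs
```

The Cartesian product of the two dice-sets is counted, size by size, just by squaring the generating function.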
Now while you can define the generating function of a combinatorial class in a purely algebraic manner, generating functions really become interesting when we also consider them as analytic objects, in particular as functions $\mathbb C \to \mathbb C$. Here, it turns out that knowing the radius of convergence and the type of singularity on the circle of convergence actually tells us the asymptotic growth order of $A_n$!
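The standard illustration (not part of the answer above, but a textbook case) is the Fibonacci numbers: their generating function $F(z) = z/(1 - z - z^2)$ has its dominant singularity at $1/\varphi$, where $\varphi = (1+\sqrt 5)/2$, so the coefficients grow like $\varphi^n$, with constant $1/\sqrt 5$. A quick numerical check:

```python
from math import sqrt

# F(z) = z / (1 - z - z^2) has radius of convergence 1/phi,
# so singularity analysis predicts F_n ~ phi^n / sqrt(5).
phi = (1 + sqrt(5)) / 2

fib = [0, 1]
for n in range(2, 30):
    fib.append(fib[-1] + fib[-2])

for n in (10, 20):
    print(n, fib[n], round(phi**n / sqrt(5), 2))
# 10  55     55.0
# 20  6765   6765.0
```

The location of the singularity gives the exponential growth rate; its type (here a simple pole) gives the subexponential factor, which here is just a constant.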
So while this example might seem a bit limited to generatingfunctionology, translating a problem into a power series opens it up to the methods of complex analysis. And those methods are many.