While studying transforms, I stumbled across a neat way to express sequences, and I have a couple of questions about its validity and how it connects to other topics. Suppose we are working with finitely supported real-valued sequences indexed by the integers, for example $$ a : \mathbb{Z} \to \mathbb{R}, \quad k \mapsto a(k) =: a_k. $$ Let $\delta$ denote the identity with respect to convolution, i.e. $$ \delta_k = \begin{cases} 1, \quad k = 0 \newline 0, \quad k \neq 0. \end{cases} $$ Lastly, define the shift operator $T : \mathbb{R}^{\mathbb{Z}} \to \mathbb{R}^{\mathbb{Z}}$ by $(Ta)_k = a_{k+1}$. The shift is clearly linear and invertible, and composition of operators is associative, so powers $T^k$ make sense for any $k \in \mathbb{Z}$.

Now write \begin{align} a_k &= \left( a \ast \delta \right) _k \newline &= \sum_{k' \in \mathbb{Z}} a_{k'} \delta_{k-k'} \newline &= \sum_{k' \in \mathbb{Z}} a_{k'} \left(T^{-k'} \delta \right)_{k} \newline &= \left( \sum_{k' \in \mathbb{Z}} a_{k'} T^{-k'} \delta \right)_k, \end{align} where the third line uses $\left(T^{-k'} \delta\right)_k = \delta_{k-k'}$. By linearity, we then have $$ a = \left( \sum_{k \in \mathbb{Z}} a_k T^{-k} \right) \delta. $$

Isn't the part in brackets essentially just the z-transform of $a$, evaluated at the shift operator $T$ in place of $z$? I thought it was curious how it just kind of fell out. If we restrict to one-sided sequences and write the sum in powers of the inverse shift $T^{-1}$ instead, that's basically the ordinary generating function of $a$, right? Also, written this way, results like the convolution theorem become almost immediate, taking more or less one step to prove. Is what I've done here valid, and if not, how can I make it rigorous? Lastly, I also wonder how this connects to continuous transforms like the bilateral Laplace and Mellin transforms.
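For what it's worth, here is a small numerical sanity check of the identity (and of the one-step convolution theorem) that I ran, representing finitely supported sequences as dicts mapping index to value. All names here (`shift`, `conv`, `apply_op`) are just my own illustrative choices:

```python
def shift(a, n):
    """(T^n a)_k = a_{k+n}: a value at index m moves to index m - n."""
    return {k - n: v for k, v in a.items()}

def conv(a, b):
    """(a * b)_k = sum_{k'} a_{k'} b_{k - k'} for finitely supported a, b."""
    out = {}
    for i, x in a.items():
        for j, y in b.items():
            out[i + j] = out.get(i + j, 0) + x * y
    return {k: v for k, v in out.items() if v != 0}

def apply_op(a, b):
    """Apply the operator sum_k a_k T^{-k} to the sequence b."""
    out = {}
    for k, ak in a.items():
        for j, v in shift(b, -k).items():  # shift(b, -k) is T^{-k} b
            out[j] = out.get(j, 0) + ak * v
    return {k: v for k, v in out.items() if v != 0}

delta = {0: 1}
a = {-1: 2, 0: 3, 2: -1}
b = {0: 1, 1: 4}

# The identity: the operator built from a, applied to delta, recovers a.
assert apply_op(a, delta) == a
# Convolution theorem in one step: that same operator applied to b gives a * b.
assert apply_op(a, b) == conv(a, b)
```

Both assertions pass, which at least supports the formal manipulation on finitely supported sequences.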
Thanks!