In computer programming it's common to define "functions" (well, not really functions) whose state persists between calls. For example, in Python I might define temporal-difference and running-sum operations:
import numpy as np

class TemporalDifference:
    def __init__(self, initial=0):
        self.last_x = initial

    def __call__(self, x):
        delta = x - self.last_x
        self.last_x = x
        return delta

class RunningSum:
    def __init__(self, initial=0):
        self.sum = initial

    def __call__(self, x):
        self.sum = self.sum + x
        return self.sum

# Which can be used like:
td = TemporalDifference()  # a stateful function
dx = [td(xt) for xt in np.sin(np.linspace(0, 10, 100))]
Is there a common convention for expressing this kind of thing mathematically? I was thinking something like
$$ \Delta(x;\, x_{\text{last}}) := x - x_{\text{last}}, \quad x \rightarrow x_{\text{last}} \\ \Sigma(x;\, s) := s + x, \quad s + x \rightarrow s $$
I'd like to use this notation to conveniently express identities like:
$$ (\Sigma \circ \Delta)(x_t) = x_t \quad \forall t $$
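As a sanity check, the identity does hold numerically with the classes above (repeated here so the snippet runs standalone), provided both states start at 0:

```python
import numpy as np

class TemporalDifference:
    def __init__(self, initial=0):
        self.last_x = initial

    def __call__(self, x):
        delta = x - self.last_x
        self.last_x = x
        return delta

class RunningSum:
    def __init__(self, initial=0):
        self.sum = initial

    def __call__(self, x):
        self.sum = self.sum + x
        return self.sum

td, rs = TemporalDifference(), RunningSum()
xs = np.sin(np.linspace(0, 10, 100))
ys = [rs(td(x)) for x in xs]  # (Sigma ∘ Delta) applied sample by sample

# The sum telescopes: sum of (x_i - x_{i-1}) from 0 to t is x_t - x_{-1} = x_t
assert np.allclose(ys, xs)
```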
Yes. These are common filters that you'll encounter if you study signal processing: TemporalDifference is an FIR (finite impulse response) filter and RunningSum is an IIR (infinite impulse response) filter.
TemporalDifference -> y[n] = x[n] - x[n - 1]
RunningSum -> y[n] = x[n] + y[n - 1]
They are both discrete-time systems, so brackets are used instead of parentheses. I'd suggest studying the basics of signal processing to learn the standard ways of characterizing systems (causal, stable, time-invariant, memoryless, etc.) rather than relying on terms like stateful or stateless.
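For what it's worth, both difference equations can be run directly with scipy.signal.lfilter (assuming SciPy is available), which takes the feedforward coefficients b and feedback coefficients a of y[n] = b[0]x[n] + b[1]x[n-1] - a[1]y[n-1]; a minimal sketch:

```python
import numpy as np
from scipy.signal import lfilter

x = np.sin(np.linspace(0, 10, 100))

# FIR difference: y[n] = x[n] - x[n-1]   ->  b = [1, -1], a = [1]
dx = lfilter([1, -1], [1], x)

# IIR running sum: y[n] = x[n] + y[n-1]  ->  b = [1], a = [1, -1]
sx = lfilter([1], [1, -1], dx)

# With zero initial conditions, the running sum undoes the difference
assert np.allclose(sx, x)
```

This also makes the composition identity concrete: the two transfer functions, (1 - z^{-1}) and 1/(1 - z^{-1}), cancel.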