Sigma summation notation formalization


When we have a basic sum of the form $ \sum_{n=1}^k f(n) $, we can give a recursive definition $ \sum_{n=1}^{k} f(n) = \sum_{n=1}^{k-1} f(n) + f(k) $ with base case $ \sum_{n=1}^{0} f(n) = 0 $, and by the Recursion Theorem this is well-defined. My question is: how do we give "formalizations" (if need be, say in ZFC, though the exact formal system is likely unimportant) to more exotic types of sums and inductive structures that appear in mathematics?
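The recursive definition above can be sketched directly as code; here is a minimal Python version (the name `sigma` is mine):

```python
def sigma(f, k):
    """Recursive finite sum: sigma(f, k) = f(1) + f(2) + ... + f(k),
    with base case sigma(f, 0) = 0, mirroring the recursive definition
    justified by the Recursion Theorem."""
    if k == 0:
        return 0
    return sigma(f, k - 1) + f(k)

# Example: sum of the first 10 squares
print(sigma(lambda n: n * n, 10))  # 385
```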

Some examples:

$$ \sum_{\substack{j+k=5 \\ j,\,k \in \mathbb{N}}} (j+k) $$

$$ \sum_{x \in S} f(x) $$ where $S$ is a finite set and addition of two terms is defined, but $S$ does not necessarily come with an order

$$ \prod_{p \text{ prime}}\frac{1}{p} $$

$$ \sum_{z \in \mathbb{Z}}\frac{1}{z^3} $$ Also, since order matters for conditionally convergent series, do we need to fix a specific order when we sum infinitely many terms? Is the standard ordering usually implicitly assumed? For finite sums the order doesn't matter (is there a simple proof?), so is it from the definition of finiteness that we are allowed to choose a specific bijection with $[1..n]$ to sum the terms?
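The order-sensitivity of conditionally convergent series can be observed numerically. A rough sketch (the cutoffs are arbitrary): the alternating harmonic series converges to $\ln 2$, but a Riemann-style rearrangement taking two positive terms per negative term converges to $\frac{3}{2}\ln 2$ instead.

```python
import math

# Alternating harmonic series 1 - 1/2 + 1/3 - ... converges (conditionally) to ln 2.
standard = sum((-1) ** (n + 1) / n for n in range(1, 200001))

# Rearrangement: two positive (odd-denominator) terms for every negative
# (even-denominator) term; same multiset of terms, different limit.
rearranged, pos, neg = 0.0, 1, 2
for _ in range(100000):
    rearranged += 1 / pos + 1 / (pos + 2) - 1 / neg
    pos += 4
    neg += 2

print(standard)    # ≈ 0.6931  (ln 2)
print(rearranged)  # ≈ 1.0397  ((3/2) ln 2)
```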

On BEST ANSWER

For all the finitary examples, we can consider the sum of a finite multiset. Why a multiset? One way to think about what's going on is to think of $\sum_{x\in M}f(x)$ where $M$ is a multiset as being a term $f(x_1)+f(x_2)+\cdots+f(x_n)$ as is usually suggested. For this to be well-defined, we need $+$ to be associative (so we don't need to worry about how to place the parentheses), commutative (so we don't need to specify an order), and have a unit (so we know what to do when $M$ is empty).
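A small Python illustration of why commutativity is what makes the order irrelevant (the examples here are mine):

```python
from functools import reduce

# With an associative, commutative op (+ on integers) and its unit 0,
# folding the same multiset of terms in any order gives the same result:
nums = [3, 1, 4]
assert reduce(lambda a, b: a + b, nums, 0) == \
       reduce(lambda a, b: a + b, list(reversed(nums)), 0)

# String concatenation is associative with unit "" but NOT commutative,
# so the "sum" of the same multiset depends on the order chosen:
words = ["a", "b", "c"]
print(reduce(lambda a, b: a + b, words, ""))                  # "abc"
print(reduce(lambda a, b: a + b, list(reversed(words)), ""))  # "cba"
```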

If we consider all terms over a set of "variables" in a language with one formal binary operator, $+$, and one constant symbol $0$ satisfying $0+x=x=x+0$, $(x+y)+z=x+(y+z)$, and $x+y=y+x$, then we get the free commutative monoid on that set, which corresponds to the finite multisets built on that set. (If we additionally required the operator to be idempotent, i.e. $x+x=x$, then the terms would correspond to finite sets; but addition on real numbers, say, is obviously not idempotent.)
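Concretely, finite multisets under multiset union model this free commutative monoid; Python's `collections.Counter` (used here as a stand-in for multisets) makes the monoid laws checkable:

```python
from collections import Counter

# Finite multisets under multiset union: union (Counter's +) is associative
# and commutative, and the empty multiset is the unit.
a, b, c = Counter("xy"), Counter("yz"), Counter("x")

assert (a + b) + c == a + (b + c)   # associativity
assert a + b == b + a               # commutativity
assert a + Counter() == a           # unit

# Idempotence fails, which is why terms correspond to multisets, not sets:
assert a + a != a
print(a + b)  # the multiset {x, y, y, z}
```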

"Summation" is then the unique commutative monoid homomorphism from the free commutative monoid (i.e. from multisets) on some set into any other commutative monoid given a function from that set into the latter commutative monoid that coincides on "variables". In symbols, let $F(X)$ be the free commutative monoid generated from the set $X$. Given a function, $f:X\to C$ where $C$ is (the carrier set of) a commutative monoid, we get a (commutative) monoid homomorphism $\sum_f:F(X)\to C$ such that $\sum_f \{\!\!\{x\}\!\!\} = f(x)$ for $x \in X$. Here, I'm using $\{\!\!\{x\}\!\!\}$ to represent the singleton multiset. The restriction to finite multisets (though $X$ is not required to be finite) is because we consider algebraic terms to be inductively defined which forces any term to be finitely deep, and since each operator has a finite arity this means there can be at most finitely many "variable" occurrences in a term. To be clear, whether the "summation" is actually a sum depends on the target commutative monoid, i.e. we are "summing" with respect to that monoid's operation. If it is the commutative monoid of reals equipped with multiplication, the "summation" would be $\prod$.

For summing over an infinite set, on the other hand, I strongly recommend thinking of it as just a completely different operator. You could view it as an infinitary algebraic operation, but it is probably better to view it as a function defined on functions. And usually not defined on arbitrary functions, but on "convergent" ones. For example, if we wanted to talk about a summation like $\sum_{n\in\mathbb N}f(n)$ where $f$ is $\mathbb R$-valued say, we'd be talking about an operator $\sum:\mathbb R^\mathbb N\to\mathbb R$, but as $\sum(n\mapsto 1)$ would not produce a real number, we usually mean something like $\sum:\ell^1\to\mathbb R$ where $\ell^1$ is a sequence space. Defining such an operation typically involves analytic concerns. Note, as functions on functions the "order" does matter. If $f(0)=g(1)$ and $f(1)=g(0)$ and otherwise $f=g$, then $f$ and $g$ are different functions and need not be mapped to the same number by summation. (In practice, the summation operation is insensitive to minor changes like this, but the point is that it is not a priori clear that you can "reorder" any terms and it is not the case that you can "reorder" infinitely many of them, i.e. that $f\circ\pi=g$ for some bijection $\pi:\mathbb N\cong\mathbb N$.)
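As a very rough sketch of "summation as a partial operator on functions", one can approximate such an operator numerically; the cutoff heuristic below is mine and is only reasonable for nicely behaved absolutely convergent series, not a real membership test for $\ell^1$:

```python
def ell1_sum(f, tol=1e-10, max_terms=10**7):
    """Heuristic 'summation' operator on functions f : N -> R:
    accumulate f(0) + f(1) + ... until a term drops below tol.
    Only a sketch for well-behaved absolutely convergent series."""
    total, n = 0.0, 0
    while n < max_terms:
        term = f(n)
        total += term
        if abs(term) < tol:
            break
        n += 1
    return total

# sum_{n>=1} 1/n^2 = pi^2/6 ≈ 1.6449...
print(ell1_sum(lambda n: 1 / (n + 1) ** 2))
```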

I'll briefly mention formal series. When we talk about a formal power series, say, such as $\sum_{n=0}^\infty a_n x^n$, we're "really" just talking about the sequence $\{a_n\}_{n\in\mathbb N}$ itself. We can define the operations on formal power series directly on the sequences. There are no questions of convergence. That would only come up if we wanted to "evaluate" these formal series.
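Defining the operations directly on coefficient sequences looks like this in Python (representing a series by its list of coefficients; function names are mine):

```python
def fps_add(a, b):
    """Add two formal power series given as coefficient lists:
    the n-th coefficient of the sum is a_n + b_n."""
    n = max(len(a), len(b))
    return [(a[i] if i < len(a) else 0) + (b[i] if i < len(b) else 0)
            for i in range(n)]

def fps_mul(a, b):
    """Cauchy product: the coefficient of x^n is sum_{i+j=n} a_i * b_j.
    No convergence question arises; this is pure sequence manipulation."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

# (1 + x)^2 = 1 + 2x + x^2
print(fps_mul([1, 1], [1, 1]))  # [1, 2, 1]
```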