This one weird thing that bugs me about summation and the like


Most of us know $$\sum_{n=a}^b c_n=c_a+c_{a+1}+\dots+c_{b-1}+c_b$$ Some of us know $$\prod_{n=a}^b c_n=c_a \cdot c_{a+1}\cdots c_{b-1} \cdot c_{b}$$ A few of us know $$\underset{j=a}{\overset{b}{\LARGE\mathrm K}}\frac{a_j}{b_j}=\cfrac{a_a}{b_a+\cfrac{a_{a+1}}{b_{a+1}+\cfrac{a_{a+2}}{b_{a+2}+\ddots}}}$$ A select few are familiar with $$\underset{n=a}{\overset{b}{\LARGE\mathrm E}} \ c_n={c_a}^{{c_{a+1}}^{{.}^{{.}^{c_b}}}}$$ All of the above are related by difference equations of the form $$f(n+1)=g(f(n))$$ In fact, I'd go so far as to say that all of the above and more can be linked with the following notation (iterated composition): $$\underset{n=1}{\overset{k}{\LARGE\mathrm F}} \ f(n,x)=f(1,f(2,f(3,\dots f(k,x)\dots)))$$ (It should be noted that, in addition to the items above, Newton's method, the Mandelbrot iteration, and the logistic map all fall into this category, so this notation has incredible generality.)
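To make the iterated-composition notation concrete, here is a minimal Python sketch (the helper name `iterate_compose` and the sample terms are my own choices, not standard): it evaluates $f(1,f(2,\dots f(k,x)\dots))$ from the inside out, and recovers summation, products, and power towers as special cases of the same fold.

```python
def iterate_compose(f, k, x):
    """Right-to-left iterated composition: f(1, f(2, ... f(k, x) ...))."""
    result = x
    for n in range(k, 0, -1):  # apply the innermost term (n = k) first
        result = f(n, result)
    return result

c = [1, 2, 3, 4]  # hypothetical terms c_1..c_4

# Summation: f(n, acc) = c_n + acc, seeded with 0
total = iterate_compose(lambda n, acc: c[n - 1] + acc, 4, 0)   # -> 10

# Product: f(n, acc) = c_n * acc, seeded with 1
prod = iterate_compose(lambda n, acc: c[n - 1] * acc, 4, 1)    # -> 24

# Power tower (the E notation): f(n, acc) = c_n ** acc, seeded with 1
t = [2, 2, 2]
tower = iterate_compose(lambda n, acc: t[n - 1] ** acc, 3, 1)  # -> 2^(2^2) = 16
```

A continued fraction fits the same mold with `f(n, acc) = a[n] / (b[n] + acc)` seeded with 0, which is exactly the $f(n+1)=g(f(n))$ structure mentioned above.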

Questions: Can you derive Taylor-like methods that allow many functions to be expressed in terms of these other, lesser-known forms of recursion? Does this play a role in why summations are more famous than the other forms?

Is convergence an issue? Most people know that once you move into difference equations you start having to deal with problems like chaos. Which forms of recursion preserve convergence? Is this a factor in why we don't use anything besides summation?

Is it a historical matter? Does the reason summation is more famous have to do with integration? Does this suggest there is an ALTERNATIVE way to integrate? For instance, could an infinite product conceivably integrate a function to the same expression as a standard integral? In other words, can you develop an analogous integral from the above formulations (excluding the product integral, as it isn't equivalent)? Considering that multiplication can yield areas of squares and volumes of cubes, it wouldn't be too much of a stretch, right? I'm extremely interested in that possibility.
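On the "integrate with a product" idea, one classical connection is worth noting: the Riemann-style product $\prod_i \bigl(1+f(x_i)\,\Delta x\bigr)$ converges to $e^{\int_a^b f\,dx}$ as $\Delta x\to 0$ (the Volterra product integral), so an ordinary integral can be recovered as the logarithm of an infinite product. A rough numerical sketch (the function and step count here are arbitrary illustrations):

```python
import math

def integral_via_product(f, a, b, n=100000):
    """Approximate the ordinary integral of f on [a, b] as the log of
    the Riemann-style product prod(1 + f(x_i) * dx), which converges
    to exp(integral of f) as dx -> 0 (Volterra product integral)."""
    dx = (b - a) / n
    log_prod = 0.0
    for i in range(n):
        x = a + (i + 0.5) * dx            # midpoint sample
        log_prod += math.log1p(f(x) * dx)  # log of the running product
    return log_prod

approx = integral_via_product(lambda x: x * x, 0.0, 1.0)  # true value is 1/3
```

The error vanishes like $\Delta x$, since $\log(1+f\,\Delta x)=f\,\Delta x+O(\Delta x^2)$ term by term.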

Preferences: I'd like references and examples. I'm not really interested in "this isn't possible" rhetoric, but I will accept proofs that disprove the existence of the above-mentioned items.


There are 3 answers below

BEST ANSWER

Maybe https://en.wikipedia.org/wiki/Iterated_function#Some_formulas_for_fractional_iteration and https://en.wikipedia.org/wiki/Ramanujan_summation can shed some light, but I think it's easy to find oneself lost in higher levels of abstraction without good references or specific examples.

ANSWER

These are all just variations on looping, which appears in protean guises in programming.

ANSWER

The abstraction you hint at is a standard example in Scheme or LISP programming. This is from page 64 of Abelson and Sussman's classic Structure and Interpretation of Computer Programs (http://web.mit.edu/alexmv/6.037/sicp.pdf):

Exercise 1.32.

Show that sum and product (exercise 1.31) are both special cases of a still more general notion called accumulate that combines a collection of terms, using some general accumulation function:

(accumulate combiner null-value term a next b)

Accumulate takes as arguments the same term and range specifications as sum and product, together with a combiner procedure (of two arguments) that specifies how the current term is to be combined with the accumulation of the preceding terms and a null-value that specifies what base value to use when the terms run out. Write accumulate and show how sum and product can both be defined as simple calls to accumulate.

This construction doesn't address any of your questions about convergence or integration. It can be further generalized to deal with potentially infinite streams of values to be combined, should you wish to pursue that.
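For reference, here is one minimal Python rendering of that Scheme signature (a sketch of the exercise, not SICP's own solution; `next` is renamed `nxt` to avoid shadowing Python's builtin):

```python
def accumulate(combiner, null_value, term, a, nxt, b):
    """Combine term(a), term(nxt(a)), ... up to b with combiner,
    starting from null_value -- SICP exercise 1.32's signature."""
    result = null_value
    while a <= b:
        result = combiner(result, term(a))
        a = nxt(a)
    return result

def sum_(term, a, nxt, b):
    # Summation is accumulate with + and identity 0
    return accumulate(lambda x, y: x + y, 0, term, a, nxt, b)

def product(term, a, nxt, b):
    # Product is accumulate with * and identity 1
    return accumulate(lambda x, y: x * y, 1, term, a, nxt, b)

inc = lambda n: n + 1
# sum_(lambda n: n, 1, inc, 5) -> 15
# product(lambda n: n, 1, inc, 5) -> 120
```

The `combiner`/`null_value` pair is exactly what distinguishes $\sum$ from $\prod$ in the question's notation.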