Warning! This question has little rigour and is entirely hand-wavy, blue-sky speculative thinking, so I apologise in advance.
I was thinking about the gamma function and how it interpolates the discrete factorials $(n-1)!$ over $\mathbb{R}^+$. Going from $n!$ to continuous $\Gamma(x+1)$ isn't like, e.g., going from $\sum_{\mathbb{N}}f(k)$ to $\int f(x)\,\mathrm{d}x$, where things change when we switch sum signs to integral signs. And it's not like we calculate $\Gamma(x+1)$ by taking the product of all real values from $1$ to $x$ - that would obviously blow up to $\infty$, since we would be multiplying uncountably many factors greater than $1$. I am familiar with the usual definition of the gamma function as an improper integral, but it still got me thinking...
Take the familiar Riemann sum formulation of integration. Intuitively we can think of it as a sort of 'continuous' or 'fluid' sum rather than the usual discrete sum. Each Riemann sum has a growing number of summands $f(x^*)$, each weighted by $\Delta x$ so that the limit stands a chance of converging. You take the limit as $\Delta x$ goes to $0$ and the number of intervals goes to $\infty$: $$\int{f(x)\,\mathrm{d}x}=\lim\limits_{\Delta x \to 0}\sum f(x^*)\Delta x$$
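As a quick numerical sketch of this limiting process (my own illustration, in Python rather than Mathematica; the function and interval are arbitrary choices):

```python
# Left-endpoint Riemann sum for f on [a, b] with n subintervals.
def riemann_sum(f, a, b, n):
    dx = (b - a) / n
    return sum(f(a + k * dx) * dx for k in range(n))

# f(x) = x on [0, 2]; the exact integral is 2.
print(riemann_sum(lambda x: x, 0.0, 2.0, 1_000_000))  # close to 2.0
```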
Can you extend the idea of Riemann sums to 'Riemann products'?
We are taking a product of infinitely many values, so for this to have any chance of converging the terms would have to be infinitesimally close to $1$. So instead of summing terms $f(x^*)\Delta x$ like the usual Riemann sum, we take a product of terms $f(x^*)^{\Delta x}$: $$\bigotimes{f(x)^{\mathrm{d}x}}=\lim\limits_{\Delta x \to 0}\prod f(x^*)^{\Delta x}$$
Does it exist? When will it converge? When will it blow up? Can you abuse the properties of logarithms $\log(ab) = \log(a) + \log(b)$ in some way to turn it back into a regular integral? $$\bigotimes{f(x)^{\mathrm{d}x}}=\lim\limits_{\Delta x \to 0} \exp\left( \sum \Delta x\log(f(x^*))\right)$$
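The logarithm identity suggests the discrete product should converge to $\exp\left(\int \log f(x)\,\mathrm{d}x\right)$ whenever that integral exists. A small sketch (my own check; I use $[1,2]$ to keep left endpoints away from the singularity of $\log$ at $0$):

```python
import math

# Discrete 'Riemann product' of f(x)^dx over [a, b] with left endpoints,
# computed via logs for numerical stability: prod f^dx = exp(dx * sum(log f)).
def riemann_product(f, a, b, n):
    dx = (b - a) / n
    log_sum = sum(math.log(f(a + k * dx)) for k in range(n))
    return math.exp(dx * log_sum)

# f(x) = x on [1, 2]: exp(int_1^2 ln x dx) = exp(2 ln 2 - 1) = 4/e ~ 1.4715
print(riemann_product(lambda x: x, 1.0, 2.0, 100_000))
print(4 / math.e)
```

As $n$ grows the two printed values agree to more and more digits, which is at least numerical evidence that the limit exists and equals $e^{\int\log f}$.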
If you take, for example, the identity function $f(x)=x$ in Mathematica:
Limit[Product[Identity[x - k d]^d, {k, 0, x/d}], d -> 0, Assumptions -> {x > 0}]
... after a lot of computation it finally returns this weird result:
E^-x x^x
But a naive Monte-Carlo check appears to give quite different values:
$$\bigotimes_{0}^{2}{x^{\mathrm{d}x}} \stackrel{?}{\approx} e^{\mathbb{E}\left[\log(X)\right]},\qquad X\thicksim U(0,2)$$
Exp[Mean[Log[RandomReal[{0, 2}, 1000000]]]]  (* 0.734764 *)
E^-2 2^2.  (* 0.541341 *)
The discrepancy turns out to be a missing interval-length factor: a uniform sample on $[0,2]$ estimates the mean $\frac{1}{2}\int_0^2\log x\,\mathrm{d}x$ rather than the integral itself, so the exponent needs a factor of $b-a=2$. With Exp[2 Mean[Log[RandomReal[{0, 2}, 1000000]]]] the estimate comes out near $0.54$, agreeing with Mathematica's $x^x e^{-x}$ at $x=2$, i.e. $e^{-2}2^2 \approx 0.541$.
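A Monte-Carlo sketch in Python (my own check; seed and sample size are arbitrary) that includes the interval-length factor $b-a=2$ - needed because a uniform sample estimates the *mean* of $\log x$, not its integral - reproduces the Mathematica answer:

```python
import math
import random

random.seed(0)

# Monte-Carlo estimate of the 'Riemann product' of x over [0, 2]:
# exp(int_0^2 ln x dx) = exp((b - a) * E[ln X]) for X ~ U(0, 2).
a, b, n = 0.0, 2.0, 1_000_000
# (The singularity of ln at 0 is integrable; uniform draws hit 0 with
# probability essentially zero, so the sample mean is well behaved.)
mean_log = sum(math.log(random.uniform(a, b)) for _ in range(n)) / n
estimate = math.exp((b - a) * mean_log)

print(estimate)          # close to 0.54
print(4 * math.exp(-2))  # exact value 4/e^2 = 0.541341...
```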