This is just idle curiosity. Consider the function $(\lambda, n) \mapsto e^{-\lambda} \frac{\lambda^n}{n!}$, where $\lambda \in \mathbb{R}_{\ge 0}$ is a nonnegative real parameter and $n \in \mathbb{Z}_{\ge 0}$ is a nonnegative integer parameter. This function has the very funny property of being a "doubly stochastic matrix" in the following sense: we have both
$$\int_0^{\infty} e^{-\lambda} \frac{\lambda^n}{n!} \, d \lambda = 1$$
(the integrand being, for fixed $n$, the probability density function of a sum of $n + 1$ independent $\text{Exp}(1)$ exponential random variables, i.e. of an Erlang random variable $\text{Erlang}(n+1, 1)$) and
$$\sum_{n \ge 0} e^{-\lambda} \frac{\lambda^n}{n!} = 1$$
(the summand being, for fixed $\lambda$, the probability mass function of a Poisson random variable $\text{Pois}(\lambda)$).
Question: What significance, if any, does this observation have?
What this means concretely is that $e^{-\lambda} \frac{\lambda^n}{n!}$ can be used as a "kernel" that converts between probability distributions on $\mathbb{Z}_{\ge 0}$ and probability distributions on $\mathbb{R}_{\ge 0}$, in either direction. The two descriptions above also have the funny implication that the function is approximately Gaussian in both variables: for large $n$ it is approximately Gaussian as a function of $\lambda$ (apply the central limit theorem to the sum of $n+1$ independent exponentials), and for large $\lambda$ it is approximately Gaussian as a function of $n$ (apply the central limit theorem to $\text{Pois}(\lambda)$, viewed as a sum of many independent Poisson random variables).
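Both normalizations, and the Gaussian approximation for large $\lambda$, are easy to check numerically. A quick sanity check (a sketch in Python; the kernel is evaluated in log space via `math.lgamma` so that large arguments don't overflow, and the row integral uses a crude trapezoidal rule):

```python
import math

def kernel(lam: float, n: int) -> float:
    """e^{-lam} * lam^n / n!, computed in log space to avoid overflow."""
    if lam == 0.0:
        return 1.0 if n == 0 else 0.0
    return math.exp(-lam + n * math.log(lam) - math.lgamma(n + 1))

# Column sum: for fixed lam, summing over n gives 1 (Poisson pmf).
lam = 3.7
col = sum(kernel(lam, n) for n in range(200))
print(col)

# Row integral: for fixed n, integrating over lam gives 1 (Erlang(n+1, 1) density).
# Trapezoidal rule on [0, 60]; the tail beyond 60 is negligible for small n.
n, h = 5, 0.001
pts = [kernel(i * h, n) for i in range(60001)]
row = h * (sum(pts) - 0.5 * (pts[0] + pts[-1]))
print(row)

# Gaussian approximation for large lam: Pois(lam) pmf vs. the N(lam, lam) density.
lam, m = 400.0, 410
gauss = math.exp(-((m - lam) ** 2) / (2 * lam)) / math.sqrt(2 * math.pi * lam)
print(kernel(lam, m), gauss)
```

The particular values $\lambda = 3.7$, $n = 5$, and the truncation points are arbitrary choices for illustration.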
I'm not certain whether this is the kind of answer you're looking for, but here are two ways to see this:
Consider a point process on the positive real line. Define the "matrix" $$A_{t, n} = \mathbb{P}(n\text{ points in }[0, t])$$ where $t \in \mathbb{R}_{\geq 0}$ and $n \in \mathbb{Z}_{\geq 0}$.
Trivially, we have $$ \sum_{n = 0}^{\infty} A_{t, n} = 1.$$ We also have (the interchange of expectation and integral being justified by Tonelli's theorem, since the integrand is nonnegative): $$ \int_0^{\infty} A_{t, n} \, dt = \int_0^{\infty} \mathbb{E} [[n\text{ points in }[0, t] ]] \, dt = \mathbb{E} \int_0^{\infty} [[n\text{ points in }[0, t] ]] \, dt = \mathbb{E}[\text{length of interval between point }n\text{ and point }n+1] $$
Thus, if we take a point process with all interarrival times having mean $1$, $A_{t, n}$ will be doubly stochastic. Specializing to the Poisson point process with rate $1$ gives the desired example.
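This argument is easy to verify by Monte Carlo. A sketch (rate-1 Poisson process, so interarrival times are $\text{Exp}(1)$; the amount of time during which exactly $n$ points lie in $[0, t]$ is the gap between the $n$-th and $(n+1)$-st arrival):

```python
import random

random.seed(42)

def time_with_exactly_n_points(n: int) -> float:
    """Length of {t : exactly n points of a rate-1 Poisson process lie in [0, t]},
    i.e. the gap between the n-th and (n+1)-st arrival times."""
    gap = 0.0
    for _ in range(n + 1):
        gap = random.expovariate(1.0)  # keep only the final interarrival gap
    return gap

n, trials = 3, 100_000
mean_gap = sum(time_with_exactly_n_points(n) for _ in range(trials)) / trials
print(mean_gap)  # should be close to 1, the mean interarrival time
```

Here $n = 3$ is an arbitrary choice; the estimate should be near $1$ for any $n$.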
Another way to derive the result is to note the identities: $$\sum_{k=0}^n \binom{n}{k} p^k (1-p)^{n-k} = 1$$ $$ \int_0^1 \binom{n}{k} p^k (1-p)^{n-k} \, dp = \frac{1}{n+1}$$ Substituting $p = \frac{t}{n}$ and taking $n \to \infty$ recovers the desired identities: the summand tends to $e^{-t} \frac{t^k}{k!}$, while in the second identity the substitution (with $dp = \frac{dt}{n}$) gives $\int_0^n \binom{n}{k} \left(\tfrac{t}{n}\right)^k \left(1-\tfrac{t}{n}\right)^{n-k} dt = \frac{n}{n+1} \to 1$.
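Both the binomial-to-Poisson limit and the $\frac{1}{n+1}$ integral identity can be checked numerically. A sketch (the values $t = 2.5$, $k = 4$, and the small test case $n = 10$, $k = 3$ are arbitrary choices):

```python
import math

def binom_pmf(n: int, k: int, p: float) -> float:
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

# Binomial(n, t/n) pmf at k tends to the Poisson(t) pmf at k as n grows.
t, k = 2.5, 4
poisson = math.exp(-t) * t ** k / math.factorial(k)
approx = binom_pmf(10_000, k, t / 10_000)
print(poisson, approx)

# Integral identity: int_0^1 binom_pmf(n, k, p) dp = 1/(n+1), via trapezoids.
n2, k2, h = 10, 3, 0.0005
pts = [binom_pmf(n2, k2, i * h) for i in range(2001)]
integral = h * (sum(pts) - 0.5 * (pts[0] + pts[-1]))
print(integral, 1 / (n2 + 1))
```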
These have a similar interpretation to the above. Let $x_1, x_2, \ldots, x_n$ be independently drawn from the uniform distribution on $[0, 1]$. Then, $$B_{p, k} := \mathbb{P}(k\text{ points in }[0, p]) = \binom{n}{k} p^k (1-p)^{n-k}$$

Again, the fact that row sums are all $1$ is trivial, and the other identity is equivalent to: $$\mathbb{E}[x_{(k+1)} - x_{(k)}] = \frac{1}{n+1}$$ for all $k$, where $x_{(k)}$ are the order statistics of $x$, and we define $x_{(0)} := 0$ and $x_{(n+1)} := 1$.

This can be derived using a well-known symmetry argument: $\mathbb{E}[x_{(k+1)} - x_{(k)}]$ is the probability that another, independent, uniform random variate $x^{*}$ lies between $x_{(k)}$ and $x_{(k+1)}$. This is the probability that $x^{*}$ is the $(k+1)^{\text{st}}$ order statistic of the combined collection of $n+1$ random variables $x^{*}, x_1, x_2, \ldots, x_n$. These are iid, and thus exchangeable, and so this probability is $\frac{1}{n+1}$, as desired.
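The spacing identity is also easy to confirm by simulation. A sketch ($n = 9$ and $k = 4$ are arbitrary choices; with $n = 9$ the expected spacing is $\frac{1}{n+1} = 0.1$):

```python
import random

random.seed(7)
n, k = 9, 4
trials = 200_000
acc = 0.0
for _ in range(trials):
    xs = sorted(random.random() for _ in range(n))
    order = [0.0] + xs + [1.0]  # conventions x_(0) = 0 and x_(n+1) = 1
    acc += order[k + 1] - order[k]
print(acc / trials)  # expected spacing is 1/(n+1) = 0.1
```

By exchangeability the same estimate should come out for every $k$ from $0$ to $n$.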