Question
Fix two integers $M>N>0$. Suppose $a_i=e^{-N}\cdot\frac{N^i}{i!}, i\ge 0$. What is the value of the following limit: $$\lim_{k\rightarrow\infty}\textstyle\sum\{a_{i_1}a_{i_2}\ldots a_{i_k}\mid i_1+i_2+\cdots+i_k\leq kM,i_2+\cdots+i_k\leq (k-1)M,\ldots,i_k\leq M\}.$$
Background
I'm studying a queueing system with Poisson arrivals of rate $\lambda$ and constant service times $T$ (i.e., service rate $1/T$), where each service can handle at most $M$ customers at once. Let $N=\lambda T$, and let $p_i$ be the stationary probability that the queue size is $i$ immediately after a service; then the above limit evaluates $p_0$.
Since the state of the queue after each service can be regarded as a Markov chain, we can write an equation for the generating function $g(x)=\sum_{i=0}^\infty p_ix^i$: $$g(x)=S^M[g(x)e^{N(x-1)}].$$ Here $S$ is the shift operator $$S(r_0+r_1x+r_2x^2+r_3x^3+\ldots)=r_0+r_1+r_2x+r_3x^2+\ldots,$$ applied $M$ times because up to $M$ customers are served at once, and I found the equation really hard to solve.
Also notice two almost trivial bounds: $$p_0>\lim_{k\rightarrow\infty}{\textstyle\sum}\{a_{i_1}a_{i_2}\ldots a_{i_k}\mid i_1\leq M,i_2\leq M,\ldots,i_k\leq M\}=\lim_{k\rightarrow\infty}c^k=0,$$ where $c=\sum_{i=0}^M a_i<1$, and (by the central limit theorem) $$p_0<\lim_{k\rightarrow\infty}{\textstyle\sum}\{a_{i_1}a_{i_2}\ldots a_{i_k}\mid i_1+i_2+\cdots+i_k\leq kM\}=\lim_{k\rightarrow\infty}\mathbb{P}[\mathcal{N}(0,1)\leq {\textstyle\frac{M-N}{\sqrt{N}}\sqrt{k}}]=1.$$
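For concreteness, the constant $c$ in the first bound is just the Poisson probability $\mathbb P(\mathrm{Poi}(N)\le M)$; a tiny check with the illustrative values $M=3$, $N=2$:

```python
from math import exp, factorial

# Illustrative values with N < M.
M, N = 3, 2.0
a = [exp(-N) * N**i / factorial(i) for i in range(M + 1)]
c = sum(a)   # c = P(Poi(N) <= M) < 1, so c**k -> 0 as k -> infinity
print(c, c**100)
```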
Yet the shape constrained by $i_1+i_2+\cdots+i_k\leq kM,i_2+\cdots+i_k\leq (k-1)M,\ldots,i_k\leq M$ is so asymmetric that I don't know how to deal with it.
By symmetry (reverse the order of the indices), your expression is equal to $$ \sum_{A_k} a_{i_1} \dots a_{i_k}, $$ where $A_k = \{(i_1,\dots,i_k): i_1\le M, i_1+i_2\le 2M,\dots,i_1+\dots + i_k\le kM\}$. This is nothing else but the probability that $S_1\le 0,S_2\le 0,\dots,S_k\le 0$, where $S_j = Y_1+\dots + Y_j$, $Y_j = X_j - M$, and the $X_j$ are iid $\mathrm{Poi}(N)$. So the probability you are interested in is $$p_\infty:=\lim_{n\to\infty} \mathbb P(S^*_n= 0) = \mathbb P(S^*_\infty= 0),$$ where $S_n^* = \max_{0\le k\le n} S_k$ (with $S_0=0$). Thus your problem is related to determining the distribution of the maximum of a random walk, which is rather classical.
We can use the Spitzer identity: for any $\lambda>0$, $|s|<1$, $$ \sum_{n=0}^\infty s^n\mathbb{E}e^{-\lambda S^*_n} = \exp\left\{\sum_{n=1}^\infty \frac{s^n}{n}\mathbb E e^{-\lambda S_n^+}\right\}, $$ where $S_n^+ = \max\{S_n,0\}$. Letting $\lambda\to\infty$ yields $$ \sum_{n=0}^\infty s^n \mathbb{P} (S_n^* = 0) = \exp\left\{\sum_{n=1}^\infty \frac{s^n}{n} \mathbb P(S_n\le 0)\right\}. $$ Since $\mathbb{P} (S_n^* = 0)\downarrow p_\infty$ as $n\to\infty$, a standard Abelian argument gives $$ \sum_{n=0}^\infty s^n \mathbb{P} (S_n^* = 0)\sim \frac{p_\infty}{1-s},\ s\to 1-. $$ On the other hand, $$ \exp\left\{\sum_{n=1}^\infty \frac{s^n}{n} \mathbb P(S_n\le 0)\right\} = \exp\left\{\sum_{n=1}^\infty \frac{s^n}{n} - \sum_{n=1}^\infty \frac{s^n}{n}\mathbb P(S_n> 0)\right\} \\= \exp\left\{-\log(1-s) - \sum_{n=1}^\infty \frac{s^n}{n}\mathbb P(S_n> 0)\right\} = \frac{1}{1-s}\exp\left\{ - \sum_{n=1}^\infty \frac{s^n}{n}\mathbb P(S_n> 0)\right\}\\\sim \frac{1}{1-s}\exp\left\{ - \sum_{n=1}^\infty \frac{1}{n}\mathbb P(S_n> 0)\right\},\quad s\to 1-, $$ whence $$ p_\infty = \exp\left\{ - \sum_{n=1}^\infty \frac{1}{n}\mathbb P(S_n> 0)\right\} = \exp\left\{ - \sum_{n=1}^\infty \frac{1}{n}\mathbb P(\mathrm{Poi}(Nn) > Mn)\right\}.\tag{1} $$ I believe you will be able to compute the last expression. (By the way, note that when $N\ge M$, the series in the exponent diverges, leading to $p_\infty = 0$, as expected.)
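Formula $(1)$ is easy to evaluate numerically: for $N<M$ the terms of the series decay geometrically, so a modest truncation suffices. A sketch with the illustrative values $M=3$, $N=2$ (the truncation at $300$ terms is an assumption that keeps $e^{-Nn}$ within floating-point range):

```python
from math import exp

# Illustrative values with N < M.
M, N = 3, 2.0

def poisson_tail(k, mu):
    """P(Poi(mu) > k), by direct summation of the pmf."""
    term = exp(-mu)
    cdf = term
    for i in range(1, k + 1):
        term *= mu / i
        cdf += term
    return 1.0 - cdf

# Truncate the series in (1); the terms decay geometrically for N < M.
total = sum(poisson_tail(M * n, N * n) / n for n in range(1, 301))
p_inf = exp(-total)
print("p_inf ≈", p_inf)
```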
As @Did suggested, it is possible to derive from $(1)$ a good lower bound for $p_\infty$ using the large deviation approach.
To this end, apply the exponential Markov (Chernoff) inequality $$ \mathbb{P}(\mathrm{Poi}(Nn) > Mn) < \frac{\mathbb{E} e^{\lambda\mathrm{Poi}(Nn)}}{e^{\lambda Mn}} = e^{Nn (e^\lambda -1) - \lambda Mn} $$ with the optimal choice $\lambda = \log \frac{M}{N}$, which gives $\mathbb{P}(\mathrm{Poi}(Nn) > Mn) < q^n$, where $q = e^{M-N}(N/M)^M<1$. Plugging this into $(1)$ and using $\sum_{n=1}^\infty \frac{q^n}{n} = -\log(1-q)$ leads to $$ p_\infty > e^{\log(1-q)} = 1- e^{M-N}\left(\frac{N}{M}\right)^M. $$
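A small check of the algebra (illustrative values again): at $\lambda=\log(M/N)$ the exponent $N(e^\lambda-1)-\lambda M$ indeed simplifies to $M-N-M\log(M/N)$, and the resulting lower bound lies strictly between $0$ and $1$:

```python
from math import exp, log

# Illustrative values with N < M.
M, N = 3, 2.0

lam = log(M / N)                          # optimal Chernoff parameter
q = exp(N * (exp(lam) - 1) - lam * M)     # per-term bound: P(...) < q**n
assert abs(q - exp(M - N) * (N / M) ** M) < 1e-12  # the simplified form

# Summing q**n / n = -log(1 - q) inside the exponent of (1):
lower_bound = 1.0 - q
print("p_inf >", lower_bound)
```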