Let $N$ be a positive integer and let $N=a_n\cdot a_{n-1}\cdots a_0$ be a decomposition of $N$ into $n+1$ arbitrary positive integers.
Is there a rule for getting the biggest possible number when we take the factorial of each element of the decomposition and multiply them together?
$$a_n!\cdot a_{n-1}!\cdots a_0!$$
By a rule I mean: can we find the best number of elements for a decomposition, such as $n=2$ or $n=4$? And how should those elements be chosen? Should they be distributed more equally?
So for example take $N=100$ and compare a decomposition into three factors with one into two factors. Let's choose $$2\cdot 5\cdot 10=100=10\cdot 10.$$
Then we clearly have $$2!\cdot 5!\cdot 10!=870912000<13168189440000=10!\cdot 10!.$$
So the decomposition $10\cdot 10$ gives a bigger result. But is it the biggest?
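The two values above are easy to verify numerically; a quick sketch in Python (using `math.prod`, available from Python 3.8):

```python
from math import factorial, prod

# Product of factorials for each decomposition of N = 100
three_factors = prod(factorial(a) for a in (2, 5, 10))  # 2! * 5! * 10!
two_factors = prod(factorial(a) for a in (10, 10))      # 10! * 10!

print(three_factors)  # 870912000
print(two_factors)    # 13168189440000
```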
EDIT: So the answer is to use a single factor. What about if we have to choose $n\geq 2$, so that the decomposition has at least two elements?
For any pair of positive integers $a$ and $b$ we have $$a!\,b!=\prod_{i=1}^{a}i\prod_{j=1}^{b}j<\prod_{i=1}^{a}i\prod_{j=1}^{b}(a+j)=(a+b)!,$$ so in particular if $a>1$ and $b>1$ (so that $a+b\leq ab$, and hence $(a+b)!\leq(ab)!$) it follows that $a!\,b!<(ab)!$. Of course, if either $a=1$ or $b=1$ then we have $a!\,b!=(ab)!$. So merging two non-trivial factors into a single factor strictly increases the product of factorials, which means the maximum is always attained with the smallest number of factors.
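The key inequality $a!\,b!<(ab)!$ for non-trivial factors, and the equality when a factor is $1$, can be spot-checked numerically (a small sketch over $2\leq a,b\leq 8$):

```python
from math import factorial

# Check a! * b! < (ab)! for small non-trivial a, b ...
for a in range(2, 9):
    for b in range(2, 9):
        assert factorial(a) * factorial(b) < factorial(a * b)

# ... and equality when one of the factors is 1.
assert factorial(1) * factorial(7) == factorial(1 * 7)
```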
As for the edited question: if multiple factors are required, then you'll need to allow trivial factors; otherwise the question makes no sense for primes when $n\geq2$. The argument above shows that the maximum is then attained by the trivial decomposition $N=N\times1\times1\times\ldots\times1$.
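To illustrate the conclusion, one can enumerate all factorizations of a small $N$ into factors $\geq 2$ and confirm that none of them beats $N!$ itself, which is what the padded decomposition $N\times1\times\ldots\times1$ yields (a brute-force sketch for the hypothetical example $N=60$):

```python
from math import factorial, prod

def factorizations(n, min_factor=2):
    """Yield every multiset of factors >= min_factor whose product is n."""
    if n == 1:
        yield []
        return
    d = min_factor
    while d * d <= n:
        if n % d == 0:
            for rest in factorizations(n // d, d):
                yield [d] + rest
        d += 1
    if n >= min_factor:
        yield [n]

N = 60
best = max(factorizations(N), key=lambda fs: prod(factorial(f) for f in fs))
# The winner is the one-element "decomposition" [N], giving N! --
# exactly the value of the trivial decomposition N x 1 x ... x 1.
print(best)  # [60]
```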