What is the best known algorithm for dividing a string of arbitrary integer length into equal pieces of integer length greater than 1?
In other words, if the integer $N$ is expressed in unary as $N$ 1's, is there an optimal algorithm for partitioning these $N$ 1's into equal pieces, all with the same number of 1's?
This is integer factorization of $N$.
If you allow quantum computing, then it's Shor's algorithm, which runs in time polynomial in $\log N$.
For classical computers, the best known algorithm is the general number field sieve.
Notice, however, that this "best" is only with respect to asymptotic complexity, and algorithms like the number field sieve have big $\cal O$-constants. This means that these algorithms win for large $N$, but for small numbers or numbers of special form other algorithms may outperform them.
Other algorithms are the elliptic curve method, Pollard's $\varrho$ method, the continued fraction method, etc., see for example this list.
...as you are talking about a "string" and "computation": if it's about a real string in programming (a sequence of characters held in some memory), then the string is likely to be short, i.e. $N \ll 2^{64} \approx 1.8\cdot10^{19}$.
After you have removed small factors from $N$ (by trial division or by computing $\gcd(N,\,2\cdot3\cdot5\cdot7\cdot11\cdot13\cdot17\cdots)$), use Pollard's $\varrho$ method, which is easy enough to implement from scratch. All you need is modular arithmetic over the integers.
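A minimal sketch of that approach in Python (the function names `is_prime`, `pollard_rho`, and `factor` are my own choices, not from any particular library): strip small primes by trial division, then split the remaining cofactor with Pollard's $\varrho$, using a Miller–Rabin test to decide when a piece is prime.

```python
import math
import random

def is_prime(n):
    """Miller-Rabin primality test; the witness set below is
    deterministic for n < 3.3 * 10^24."""
    if n < 2:
        return False
    small = (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37)
    for p in small:
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for a in small:
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = x * x % n
            if x == n - 1:
                break
        else:
            return False
    return True

def pollard_rho(n):
    """Find a nontrivial factor of an odd composite n using
    Pollard's rho with Floyd cycle detection, f(x) = x^2 + c mod n."""
    while True:
        c = random.randrange(1, n)
        x = y = random.randrange(2, n)
        d = 1
        while d == 1:
            x = (x * x + c) % n        # tortoise: one step
            y = (y * y + c) % n        # hare: two steps
            y = (y * y + c) % n
            d = math.gcd(abs(x - y), n)
        if d != n:                     # d == n: unlucky cycle, retry with a new c
            return d

def factor(n):
    """Full factorization: trial division by small primes, then rho."""
    factors = []
    for p in (2, 3, 5, 7, 11, 13, 17):
        while n % p == 0:
            factors.append(p)
            n //= p
    stack = [n] if n > 1 else []
    while stack:
        m = stack.pop()
        if is_prime(m):
            factors.append(m)
        else:
            d = pollard_rho(m)         # split m and keep working on both halves
            stack.append(d)
            stack.append(m // d)
    return sorted(factors)
```

For the original problem, any factor $d$ returned by `factor(N)` with $1 < d < N$ gives a valid partition into $N/d$ pieces of $d$ 1's each; if `factor(N)` returns just `[N]`, then $N$ is prime and no such partition exists.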
Being pedantic, forming one pile of size $N$ solves the problem.