I have $C$ processors and $C$ items that have to be run on them. I can either run each item on a separate processor, for a running time of $\sum_{i=1}^{C} c_i$ (where $c_i$ is the running time of item $i$), or divide the $C$ items into $k$ subsets of sizes $s_i$, $i = 1,\dots,k$, with $\sum_{i=1}^{k} s_i = C$, and run each subset on $s_i$ processors, with a running time associated with that subset. Once a subset is picked, the order is determined, so each subset is associated with exactly one running-time cost.
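For tiny instances the problem can be sanity-checked by brute force over all set partitions. Here is a minimal sketch, where `subset_cost` is a hypothetical stand-in for the given running time of a subset:

```python
def partitions(items):
    """Yield every set partition of `items` as a list of blocks."""
    if not items:
        yield []
        return
    first, rest = items[0], items[1:]
    for part in partitions(rest):
        # Put `first` into each existing block in turn...
        for i in range(len(part)):
            yield part[:i] + [[first] + part[i]] + part[i + 1:]
        # ...or into a new block of its own.
        yield part + [[first]]

def min_cost(items, subset_cost):
    """Brute-force minimum total cost over all partitions.

    `subset_cost` maps a block of items to its running time
    (an assumed oracle; only feasible for very small C).
    """
    return min(sum(subset_cost(block) for block in part)
               for part in partitions(items))
```

For example, `min_cost(list(range(5)), subset_cost)` examines all $B_5 = 52$ partitions, which is exactly what makes the problem hard to scale.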
I see in this problem a set-partition problem with an objective function, or perhaps a special case of the knapsack problem, but I can't figure out whether this problem is NP-hard and, if it is, which approximation algorithms I might use. Any help would be very welcome.
Since there are $2^C$ subsets, your input size is $2^C$. Meanwhile your search space is the set of partitions into subsets, whose size is roughly ${(0.792\,C / \ln C)}^C$ (see e.g. http://en.wikipedia.org/wiki/Bell_numbers#Growth_rate). In terms of the growth rate $Y$ of the search space versus the input size $X$, this gives the relationship $Y = X^{O(\log C)}$. Problems with this kind of relationship between input size and search-space size are often believed to be NP-hard when no polynomial-time solution can be found, but they are notoriously hard to actually prove NP-hard, because for known NP-hard problems the search space is typically purely exponential (or worse) in the input size, rather than having a doubly-logarithmic exponent. (Note that $\log_2 C = \log_2 \log_2 2^C$, where $2^C$ is the input size.)
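The gap between the two growth rates is easy to see numerically. A minimal sketch computing the number of partitions exactly (Bell numbers, via the Bell triangle rather than the asymptotic bound):

```python
def bell(n):
    """Return the Bell number B(n), the count of set partitions
    of an n-element set, computed with the Bell triangle."""
    row = [1]
    for _ in range(n):
        nxt = [row[-1]]          # each row starts with the previous row's last entry
        for v in row:
            nxt.append(nxt[-1] + v)
        row = nxt
    return row[0]                # B(n) is the first entry of row n
```

Already at $C = 10$ the search space $B_{10} = 115975$ dwarfs nothing yet, but the ratio $B_C / 2^C$ grows without bound: $2^{10} = 1024$, and the gap widens super-polynomially from there.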