Certainly, straightforward substitution into $$ e_k(x_1,...,x_n)=\sum_{\substack{s\,\subseteq\,\{1,...,n\}\\\text{$s$ has $k$ elements}}}\prod_{i\in s}x_i $$ is a very inefficient way to compute values of symmetric functions at concrete numbers $x_i$: the number of summands, $\binom nk$, grows very rapidly.
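To make the inefficiency concrete, here is a minimal sketch of the naive evaluation, summing a product over every $k$-element subset (the function name `e_naive` is mine, not from any reference):

```python
from itertools import combinations
from math import prod

def e_naive(k, xs):
    """e_k(xs) by brute force: one product per k-subset, C(n, k) terms in all."""
    return sum(prod(s) for s in combinations(xs, k))

# e_2(1, 2, 3) = 1*2 + 1*3 + 2*3 = 11
print(e_naive(2, [1, 2, 3]))
```

Already for $n=40$, $k=20$ this sums $\binom{40}{20}\approx 1.4\cdot 10^{11}$ products.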
Much more efficient is to first compute the power sums $$ p_j(x_1,...,x_n)=\sum_{i=1}^nx_i^j $$ and then use the formula $$ e_k(x_1,...,x_n) = (-1)^k \sum_{\substack{m_1 + 2m_2 + \cdots\, =\, k \\ m_1, m_2,\,\cdots\, \geqslant\, 0}} \frac{(-p_1)^{m_1}}{1^{m_1}m_1!}\frac{(-p_2)^{m_2}}{2^{m_2}m_2!}\cdots $$ Still, there are as many summands as there are partitions of $k$.
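A sketch of this partition formula (helper names `partitions` and `e_from_power_sums` are my own), iterating over partitions of $k$ and reading off the multiplicities $m_j$:

```python
from math import factorial, prod
from collections import Counter

def partitions(n, max_part=None):
    """Yield partitions of n as non-increasing lists of parts."""
    if max_part is None:
        max_part = n
    if n == 0:
        yield []
        return
    for part in range(min(n, max_part), 0, -1):
        for rest in partitions(n - part, part):
            yield [part] + rest

def e_from_power_sums(k, xs):
    """e_k via the formula above; one term per partition of k."""
    p = [None] + [sum(x**j for x in xs) for j in range(1, k + 1)]  # p[j] = p_j
    total = 0.0
    for lam in partitions(k):
        m = Counter(lam)  # m[j] = multiplicity m_j of the part j
        total += prod((-p[j])**mj / (j**mj * factorial(mj)) for j, mj in m.items())
    return (-1)**k * total

# e_2(1, 2, 3): p_1 = 6, p_2 = 14, and 36/2 - 14/2 = 11
print(e_from_power_sums(2, [1, 2, 3]))
```

For $k=2$ there are two partitions, $2 = 2 = 1+1$, giving the two terms $-p_2/2$ and $p_1^2/2$.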
Probably even more efficient are Newton's recurrence formulas $$ e_k(x_1,...,x_n)=\frac1k(p_1e_{k-1}-p_2e_{k-2}+\cdots\pm p_{k-1}e_1\mp p_k), $$ but since these require computing all the previous $e$'s, I am no longer sure whether this is more efficient than the previous approach.
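Newton's recurrence can be sketched as follows (the function name `e_newton` is mine); note that computing "all the previous $e$'s" costs only $O(k^2)$ arithmetic operations once the $p_j$ are known:

```python
def e_newton(k, xs):
    """e_0, ..., e_k via Newton's identities: i*e_i = sum_{j=1}^{i} (-1)^(j-1) p_j e_{i-j}."""
    p = [None] + [sum(x**j for x in xs) for j in range(1, k + 1)]  # p[j] = p_j
    e = [1.0] + [0.0] * k  # e[0] = e_0 = 1
    for i in range(1, k + 1):
        e[i] = sum((-1)**(j - 1) * p[j] * e[i - j] for j in range(1, i + 1)) / i
    return e

# e_1(1,2,3) = 6, e_2(1,2,3) = (p_1 e_1 - p_2)/2 = (36 - 14)/2 = 11
print(e_newton(2, [1, 2, 3]))
```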
Is it? And are there any even more efficient ways?
There are two efficient algorithms for computing elementary symmetric functions (ESFs): the sum algorithm and the difference algorithm. For a review of both, see these two papers:
Liou, Michelle (1994). "More on the computation of higher-order derivatives of the elementary symmetric functions in the Rasch model." *Applied Psychological Measurement* 18(1), 53–62.

Baker, Frank B. and Harwell, Michael R. (1996). "Computing elementary symmetric functions and their derivatives: A didactic." *Applied Psychological Measurement* 20(2), 169–192.
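If I understand these references correctly, the sum algorithm is essentially the dynamic-programming recursion $e_k(x_1,...,x_m)=e_k(x_1,...,x_{m-1})+x_m\,e_{k-1}(x_1,...,x_{m-1})$, i.e. multiplying out $\prod_i(1+x_i t)$ one factor at a time. A sketch (the name `esf_all` is mine), costing $O(n^2)$ operations for all of $e_0,\dots,e_n$:

```python
def esf_all(xs):
    """All e_0, ..., e_n by folding in one variable at a time.

    Invariant: after processing x_1..x_m, e[k] = e_k(x_1, ..., x_m).
    """
    e = [1]  # e_0 = 1 for the empty variable set
    for x in xs:
        e.append(0)  # room for one more e_k
        for k in range(len(e) - 1, 0, -1):  # descend so e[k-1] is still the old value
            e[k] += x * e[k - 1]
    return e

# For (1, 2, 3): [e_0, e_1, e_2, e_3] = [1, 6, 11, 6]
print(esf_all([1, 2, 3]))
```

A pleasant side effect: with integer inputs this stays in exact integer arithmetic, unlike the power-sum routes above.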