Here are some examples of problems that I want to be able to solve symbolically:
Let $R = \sqrt{x_k x_k}$. Calculate $\displaystyle{\partial R \over \partial x_i}$ and $\displaystyle{\partial^2 R \over \partial x_i \partial x_j}$
Given that a tensor $\mathbf{R}$ has components $R_{ij} = \cos(\theta) \delta_{ij} + n_i n_j(1 - \cos (\theta)) - \sin(\theta) \varepsilon_{ijk} n_k$, where $n_k$ are the components of a unit vector, calculate $R_{ik} R'_{jk}$
Let a tensor A be given by: $$ \mathbf{A} = \alpha(\mathbf{I} - \mathbf{e_1} \otimes \mathbf{e_2}) + \beta (\mathbf{e_1} \otimes \mathbf{e_2} + \mathbf{e_2} \otimes \mathbf{e_1})$$ Calculate the eigenvalues and derive the associated normalized eigenvectors.
In other words, I want a package with symbolic manipulation that can work with these sorts of problems. The emphasis is on symbolic - I am dealing with theoretical problems, not ones where I have numerical values to plug in. I need the inputs and the outputs to be symbolic, both in the form of the examples I have given. These sorts of theoretical problems are also not given to me in Voigt notation, which I want to avoid using if possible. If I am biased toward any particular software, it would be Mathematica; however, any software package that meets my needs here would be highly appreciated. Half a year ago I attempted to find a package that did this, and ended up wasting four hours because the package could only spit out numerical answers.
There are several software packages available; xAct and Cadabra come to mind. However, the examples you provide are easily handled using the built-in Mathematica commands TensorProduct and TensorContract.
I will illustrate using your first example. Let $x_k=(x_1,x_2)$ be a coordinate vector. We can start by making a new tensor $T_{jk}=x_jx_k$ using TensorProduct:
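For concreteness I will work in two dimensions; `x1` and `x2` are symbolic, and the variable names are my own:

```mathematica
x = {x1, x2};            (* symbolic coordinate vector *)
T = TensorProduct[x, x]  (* rank-2 tensor with T[[j, k]] = x_j x_k *)
```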
Next we contract the two indices using TensorContract:
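Along these lines (with `x = {x1, x2}` the symbolic coordinate vector):

```mathematica
xx = TensorContract[TensorProduct[x, x], {{1, 2}}]  (* x_k x_k = x1^2 + x2^2 *)
```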
Finally we take the square root and inspect the result:
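Roughly (with `xx` denoting the contracted product $x_k x_k$ from the previous step):

```mathematica
R = Sqrt[xx]  (* R = Sqrt[x1^2 + x2^2] *)
```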
$R=\sqrt{x_1^2+x_2^2}$
Now if we want to calculate the gradient, we just apply the D operator to $R$, $T_i=\frac{\partial R}{\partial x_i}$:
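With `x = {x1, x2}` the symbolic coordinate vector, this is one line:

```mathematica
grad = D[R, {x}]  (* {x1, x2}/Sqrt[x1^2 + x2^2] *)
```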
Or even twice for the Hessian $T_{ij}=\frac{\partial^2 R}{\partial x_i\partial x_j}$:
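Again with `x = {x1, x2}`:

```mathematica
hess = Simplify[D[R, {x, 2}]]  (* 2x2 matrix of second derivatives *)
```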
$$\left( \begin{array}{cc} \frac{x_2^2}{\left(x_1^2+x_2^2\right){}^{3/2}} & -\frac{x_1 x_2}{\left(x_1^2+x_2^2\right){}^{3/2}} \\ -\frac{x_1 x_2}{\left(x_1^2+x_2^2\right){}^{3/2}} & \frac{x_1^2}{\left(x_1^2+x_2^2\right){}^{3/2}} \\ \end{array} \right)$$ Finally, just as a neat trick, we contract $T_{ij}$ (which is the same thing as the matrix trace in this case):
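With `hess` holding the Hessian computed above:

```mathematica
TensorContract[hess, {{1, 2}}] // Simplify  (* trace of the Hessian *)
```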
$$T_{kk}=\frac{1}{\sqrt{x_1^2+x_2^2}}$$ Your other examples are also straightforward; use the built-in command LeviCivitaTensor for the Levi-Civita "tensor".
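For instance, the rotation tensor from your second example could be assembled as follows (a sketch only; `n1`, `n2`, `n3` and `θ` are symbolic names of my choosing, and I have not imposed $n_k n_k = 1$):

```mathematica
n = {n1, n2, n3};
eps = LeviCivitaTensor[3];
rot = Cos[θ] IdentityMatrix[3] + (1 - Cos[θ]) TensorProduct[n, n] -
      Sin[θ] TensorContract[TensorProduct[eps, n], {{3, 4}}]  (* ε_ijk n_k *)
```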
Full Mathematica code for this illustration:
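Putting the steps together (variable names are my own, and this is a sketch rather than a polished package):

```mathematica
x = {x1, x2};                      (* symbolic coordinate vector *)
T = TensorProduct[x, x];           (* T[[j, k]] = x_j x_k *)
xx = TensorContract[T, {{1, 2}}];  (* x_k x_k = x1^2 + x2^2 *)
R = Sqrt[xx];                      (* R = Sqrt[x1^2 + x2^2] *)
grad = D[R, {x}];                  (* gradient, T_i *)
hess = Simplify[D[R, {x, 2}]];     (* Hessian, T_ij *)
Simplify[TensorContract[hess, {{1, 2}}]]
(* 1/Sqrt[x1^2 + x2^2] *)
```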