Given is a random sample of size n from a uniform distribution with parameters $-\theta$ and $\theta$, $\theta>0$. I'm asked to find a constant $c$ such that $c(X_{n:n}-X_{1:n})$ is an unbiased estimator of $\theta$. Thus I need to show that $E[c(X_{n:n}-X_{1:n})]=cE[(X_{n:n}-X_{1:n})]=\theta$. How do I find the expectation of $X_{n:n}-X_{1:n}$?
I've tried finding it by first calculating $X_{n:n}$ and $X_{1:n}$ separately. I found $X_{n:n} = (\frac{x+\theta}{2\theta})^n$ and $X_{1:n} = 1 - [1 - \frac{x+\theta}{2\theta}]^n$. But I don't know how to calculate the mean of the difference.
You have found the cumulative distribution functions of the two random variables. (It is not correct to say that the random variables are "equal" to their cdf.)
The maximum is easier to handle than the minimum. Its density function (differentiate the cdf) is $\frac{n}{2\theta}\left(\frac{x+\theta}{2\theta}\right)^{n-1}$ for $-\theta < x < \theta$. Multiply by $x$ and integrate from $-\theta$ to $\theta$; that gives the mean of the maximum.
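If you want to check the integral, here is a small SymPy sketch for a concrete sample size (I picked $n=5$ just for illustration); the integral works out to $\theta\frac{n-1}{n+1}$ in general:

```python
import sympy as sp

x, theta = sp.symbols('x theta', positive=True)
n = 5  # concrete sample size, chosen only for this check

# density of the maximum: n/(2*theta) * ((x+theta)/(2*theta))**(n-1)
pdf_max = n / (2 * theta) * ((x + theta) / (2 * theta))**(n - 1)

# E[X_{n:n}] = integral of x * pdf from -theta to theta
E_max = sp.integrate(x * pdf_max, (x, -theta, theta))
print(sp.simplify(E_max))  # equals theta*(n-1)/(n+1), i.e. 2*theta/3 for n = 5
```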
The procedure for the minimum is similar, but the direct calculation is more unpleasant. So I would exploit the symmetry: since the uniform distribution on $(-\theta,\theta)$ is symmetric about $0$, the sample $(-X_1,\dots,-X_n)$ has the same distribution as $(X_1,\dots,X_n)$, so $X_{1:n}$ has the same distribution as $-X_{n:n}$, and the mean of the minimum is the negative of the mean of the maximum. Thus the expectation of the difference is twice the expectation of the maximum. So we did not actually need to find the cdf of the minimum, though doing so is a good exercise.
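Since the integral above gives $E[X_{n:n}]=\theta\frac{n-1}{n+1}$, the symmetry argument yields $E[X_{n:n}-X_{1:n}]=2\theta\frac{n-1}{n+1}$, so the candidate constant is $c=\frac{n+1}{2(n-1)}$. A quick Monte Carlo sketch (with $\theta=3$ and $n=5$ chosen arbitrarily) to sanity-check unbiasedness:

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 3.0, 5, 200_000  # arbitrary values for this check
c = (n + 1) / (2 * (n - 1))       # candidate constant

# reps independent samples of size n from Uniform(-theta, theta)
samples = rng.uniform(-theta, theta, size=(reps, n))
estimates = c * (samples.max(axis=1) - samples.min(axis=1))

print(estimates.mean())  # should be close to theta = 3.0
```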