Let $X_1,...,X_n$ be an i.i.d. sample from the uniform distribution on ($-\theta$, $\theta$).
(a) Find a method of moments estimator of $\theta$.
Integrating to find the second moment gives $\mu_2=E(X^2)=\mu^2+\sigma^2=\frac{\theta^2}{3}$, which implies that $\theta=\sqrt{3\sigma^2}$.
(b) What is the approximate variance of your estimator?
For this question, I have no idea. But if it were an MLE, we could use the asymptotic variance, right?
(c) Denote your estimator by $\hat{\theta}$. Find $E(\hat{\theta})$.
(d) Does $\hat{\theta}$ converge to $\theta$ in probability as $n$ goes to infinity?
First correct your method of moments estimator: what you wrote is wrong.
a) You have simply equated some parameters without any justification. What you should actually be doing is equating the population moment to the sample moment: $E(X^2)=\dfrac{\sum_{i=1}^nX_i^2}{n}$. So $\dfrac{\theta^2}{3}=\dfrac{\sum_{i=1}^nX_i^2}{n}$, and hence $\hat{\theta}=\sqrt{\dfrac{3\sum_{i=1}^nX_i^2}{n}}$.
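As a quick sanity check of this estimator, here is a minimal simulation sketch; the true value $\theta=2$ and the sample size are arbitrary choices for the experiment:

```python
import numpy as np

rng = np.random.default_rng(0)

theta = 2.0   # true parameter (arbitrary choice for this check)
n = 100_000   # large sample, so the estimate should land close to theta

# Draw an i.i.d. sample from Uniform(-theta, theta)
x = rng.uniform(-theta, theta, size=n)

# Method of moments estimator: theta_hat = sqrt(3 * mean(X^2))
theta_hat = np.sqrt(3 * np.mean(x**2))
print(theta_hat)  # close to 2.0 for a sample this large
```

With $n$ this large the estimate should agree with $\theta$ to a couple of decimal places.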
b) Yes, you are right that you have to find the asymptotic variance. Note that $\dfrac{\sum_{i=1}^nX_i^2}{n}$ is a mean of i.i.d. r.v.'s with finite moments. Let $Y_n=\dfrac{\sum_{i=1}^nX_i^2}{n}$; you can compute $E(X_i^2)$ and $Var(X_i^2)$ directly. By the CLT, $\dfrac{\sqrt{n}\,(Y_n-E(X_i^2))}{\sqrt{Var(X_i^2)}}\xrightarrow{d} N(0,1)$.
Now the function $g(x)=\sqrt{3x}$ is differentiable, so $\sqrt{n}\,\bigl(g(Y_n)-g(E(X_i^2))\bigr)\xrightarrow{d} N\bigl(0,\,(g'(E(X_i^2)))^2\,Var(X_i^2)\bigr)$. Use this to find your asymptotic variance (look up the Delta Method).
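For concreteness, the ingredients of the Delta Method can be worked out explicitly here; this is a sketch, so check the algebra yourself:

```latex
E(X_i^2)=\int_{-\theta}^{\theta}\frac{x^2}{2\theta}\,dx=\frac{\theta^2}{3},
\qquad
E(X_i^4)=\int_{-\theta}^{\theta}\frac{x^4}{2\theta}\,dx=\frac{\theta^4}{5},
\qquad
Var(X_i^2)=\frac{\theta^4}{5}-\frac{\theta^4}{9}=\frac{4\theta^4}{45}.
```

Since $g'(x)=\dfrac{\sqrt{3}}{2\sqrt{x}}$, we get $g'\!\left(\frac{\theta^2}{3}\right)=\dfrac{3}{2\theta}$, and so the limiting variance is $\left(\dfrac{3}{2\theta}\right)^2\cdot\dfrac{4\theta^4}{45}=\dfrac{\theta^2}{5}$, i.e. $Var(\hat{\theta})\approx\dfrac{\theta^2}{5n}$.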
c) Find the distribution of $X_i^2$, then of $\sum_{i=1}^nX_i^2$, and then of $\sqrt{\sum_{i=1}^nX_i^2}$. That is all you need to find $E(\hat{\theta})$.
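If you want a numerical feel before doing the exact computation, here is a Monte Carlo sketch (the values $\theta=2$, $n=20$, and the replication count are arbitrary choices). It suggests $E(\hat{\theta})$ sits slightly below $\theta$, which is consistent with Jensen's inequality: $\sqrt{\cdot}$ is concave, so $E\sqrt{3Y_n}<\sqrt{3E(Y_n)}=\theta$.

```python
import numpy as np

rng = np.random.default_rng(1)

theta, n, reps = 2.0, 20, 200_000   # arbitrary settings for the experiment

# reps independent samples of size n from Uniform(-theta, theta)
x = rng.uniform(-theta, theta, size=(reps, n))

# Method of moments estimate for each replication
theta_hat = np.sqrt(3 * np.mean(x**2, axis=1))

# Average over replications approximates E(theta_hat)
mean_theta_hat = theta_hat.mean()
print(mean_theta_hat)  # slightly below theta, as Jensen's inequality predicts
```

The downward bias shrinks as $n$ grows, which is worth keeping in mind when you then attack part (d).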
d) All the machinery is provided. You should be able to find this out. If you cannot, please comment, stating where you are getting stuck.