Let X be a continuous random variable with the density function:
$f_X(x) = \begin{cases} x+1, & \text{if}\ -1\leq x \leq 0 \\ -x +1, & \text{if}\ 0 \leq x < 1\\ 0, & \text{otherwise} \end{cases} $
The expected value is $0$, since $E[X] = \int_{-1}^{0}x(x+1)\,dx + \int_{0}^{1}x(-x+1)\,dx = 0$.
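As a quick sanity check of this integral (a pure-Python sketch using a midpoint rule; the helper names are mine):

```python
def f(x):
    """The triangular density from the question."""
    if -1 <= x <= 0:
        return x + 1
    if 0 <= x < 1:
        return -x + 1
    return 0.0

def midpoint(g, a, b, n=100_000):
    """Midpoint-rule approximation of the integral of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

total_mass = midpoint(f, -1, 1)                  # should be close to 1
mean = midpoint(lambda x: x * f(x), -1, 1)       # should be close to 0
print(total_mass, mean)
```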
My question is: When calculating the variance using: $Var(X) = E[X^2] - (E[X])^2$
Do I simply repeat the computation above, but with this density:
$f_X(x^2) = \begin{cases} x^2+1, & \text{if}\ -1\leq x \leq 0 \\ -x^2 +1, & \text{if}\ 0 \leq x < 1\\ 0, & \text{otherwise} \end{cases} $
Also, is this the general approach?
No.
If the distribution of $X$ is known and has PDF $f_X$, then for a suitable function $g:\mathbb R\to\mathbb R$ we have:$$\mathbb E\,g(X)=\int g(x)f_X(x)\,dx\tag1$$
This equality is known as the "law of the unconscious statistician" (LOTUS) and also as the "transfer theorem" (see Florian's comment).
You can apply it here to find $\mathbb EX^2$, where the function $g$ is prescribed by $x\mapsto x^2$ and $f_X$ is the density described in your question.
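Concretely, applying $(1)$ with $g(x)=x^2$ and the density from the question:

$$\mathbb EX^2=\int_{-1}^{0}x^2(x+1)\,dx+\int_{0}^{1}x^2(-x+1)\,dx=\frac1{12}+\frac1{12}=\frac16,$$

so that $Var(X)=\mathbb EX^2-(\mathbb EX)^2=\frac16-0=\frac16$. Note that the density itself is never altered; only the integrand picks up the factor $x^2$.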
For more info about $(1)$ you can take a look at (the answer on) this question.
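If you want to verify $(1)$ numerically for this example, here is a small pure-Python sketch (stdlib only; the midpoint-rule helper is my own, not a library function):

```python
def f(x):
    """The triangular density from the question."""
    if -1 <= x <= 0:
        return x + 1
    if 0 <= x < 1:
        return -x + 1
    return 0.0

def midpoint(g, a, b, n=200_000):
    """Midpoint-rule approximation of the integral of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

e_x = midpoint(lambda x: x * f(x), -1, 1)        # E[X], via (1) with g(x) = x
e_x2 = midpoint(lambda x: x * x * f(x), -1, 1)   # E[X^2], via (1) with g(x) = x^2
var = e_x2 - e_x ** 2
print(var)  # close to 1/6 ≈ 0.1667
```

The key point the code illustrates: the same density `f` is used for both moments; only the function multiplying it changes.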