Let $A$ and $B$ be independent events. Show that $I_A+I_B$ and $|I_A-I_B|$ are uncorrelated. Are they independent?
I haven't quite figured out how to work with indicators, and no matter how much I read about them in the book I still can't seem to grasp it. What does this actually mean, and what am I supposed to do? I know that I somehow have to show that $\rho(I_A+I_B,\,|I_A-I_B|)=0$, but I have no idea how to compute the covariance, mean, and variance involved.
You are trying to prove something that is false.
Let $\Omega$ be your probability space. An event $A$ is a subset of $\Omega$ that belongs to its $\sigma$-algebra. (Don't worry about what that last bit means if you don't know what a $\sigma$-algebra is.) $I_A$ is just a function on $\Omega$ with $I_A(\omega)=1$ for $\omega\in A$ and $I_A(\omega)=0$ otherwise. Thus $E(I_A+I_B)=P(A)+P(B)$ and $E\big((I_A+I_B)^2\big)=P(A)+P(B)+2P(A~AND~B)$. Since $|I_A-I_B|$ takes only the values $0$ and $1$, $E(|I_A-I_B|^2)=E(|I_A-I_B|)=P(A~XOR~B)$, and also $E\big((I_A+I_B)|I_A-I_B|\big)=E(|I_A-I_B|)=P(A~XOR~B)$, since $I_A+I_B=1$ whenever $|I_A-I_B|\ne 0$.
Thus $\operatorname{Cov}(I_A+I_B,|I_A-I_B|)=E\big((I_A+I_B)|I_A-I_B|\big)-E(I_A+I_B)E(|I_A-I_B|)=P(A~XOR~B)-(P(A)+P(B))P(A~XOR~B)=P(A~XOR~B)\big(1-P(A)-P(B)\big)$.
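If it helps to see numbers, here is a quick Monte Carlo check (my own sketch, with illustrative probabilities $p=0.3$, $q=0.4$ that are not from the problem) confirming that independence alone does not make this covariance vanish:

```python
# Monte Carlo check of Cov(I_A + I_B, |I_A - I_B|) = P(A XOR B)(1 - P(A) - P(B))
# for independent A, B. The probabilities p, q below are illustrative choices.
import random

random.seed(0)
p, q = 0.3, 0.4          # hypothetical P(A), P(B); note p + q != 1
n = 200_000

ex = ey = exy = 0.0
for _ in range(n):
    ia = 1 if random.random() < p else 0   # I_A
    ib = 1 if random.random() < q else 0   # I_B, drawn independently of I_A
    x, y = ia + ib, abs(ia - ib)
    ex += x
    ey += y
    exy += x * y

cov = exy / n - (ex / n) * (ey / n)
analytic = (p * (1 - q) + q * (1 - p)) * (1 - p - q)   # P(A XOR B)(1 - p - q)
print(cov, analytic)   # both near 0.138: nonzero despite independence
```

So with $p=0.3$ and $q=0.4$ the covariance is about $0.46\cdot 0.3=0.138$, not $0$, even though $A$ and $B$ are independent.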
Thus, for $I_A+I_B$ and $|I_A-I_B|$ to be uncorrelated, you do not need $A$ and $B$ independent. Rather, you need $P(A)+P(B)=1$ or $P(A~XOR~B)=0$. (The latter means $A=B$, up to a set of measure zero, a qualifier I will omit from now on.) However, even in these cases they will not in general be independent, since, as we have already seen, $I_A+I_B=1$ whenever $|I_A-I_B|\ne 0$: the event $|I_A-I_B|=1$ forces $I_A+I_B=1$. Thus, for independence, you need $I_A+I_B$ or $|I_A-I_B|$ to be constant. This happens only when $A=B$ or $A=B^c$, but not in the other cases with $P(A)+P(B)=1$.
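To make the "uncorrelated but not independent" boundary case concrete, here is an exact computation (again my own sketch) for independent $A$, $B$ with $P(A)=P(B)=1/2$, so that $P(A)+P(B)=1$:

```python
# Exact joint distribution of (X, Y) = (I_A + I_B, |I_A - I_B|) for independent
# A, B with P(A) = P(B) = 1/2, the boundary case P(A) + P(B) = 1.
from itertools import product

p = q = 0.5
pxy, px, py = {}, {}, {}           # joint and marginal distributions
ex = ey = exy = 0.0
for ia, ib in product((0, 1), repeat=2):
    pr = (p if ia else 1 - p) * (q if ib else 1 - q)  # P(I_A=ia, I_B=ib)
    x, y = ia + ib, abs(ia - ib)
    pxy[(x, y)] = pxy.get((x, y), 0.0) + pr
    px[x] = px.get(x, 0.0) + pr
    py[y] = py.get(y, 0.0) + pr
    ex, ey, exy = ex + pr * x, ey + pr * y, exy + pr * x * y

cov = exy - ex * ey
print(cov)                                  # 0.0: uncorrelated
print(pxy.get((2, 1), 0.0), px[2] * py[1])  # 0.0 vs 0.125: not independent
```

Knowing $Y=1$ forces $X=1$ (the pair $(X,Y)=(2,1)$ has probability $0$ while the product of its marginals is $1/8$), so $X$ and $Y$ are dependent even though their covariance is exactly zero.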
Note that the only cases in which $A$ and $B$ are independent and the functions in the problem are uncorrelated are when $A$ and $B$ are independent with $P(A)+P(B)=1$ (probably the case the question-writer had in mind, forgetting to check whether the idea worked in other cases... oops) and the degenerate cases $P(A)=P(B)=0$ or $P(A)=P(B)=1$.