Apparently $E[X] = E[E[X\mid Y]]$ but I don't understand what this really means. I looked at https://en.wikipedia.org/wiki/Law_of_total_expectation but need another explanation.
Isn't this the same as $E[X] = E[X\mid Y]$? Why the extra $E[ \cdot ]$? And ultimately why does the $Y$ not seem to even matter if we say that $X$ depends on $Y$?
$\newcommand{\E}{\operatorname{E}}$The random variable $X$ has a conditional probability distribution given the event $Y=y$, for each value $y$ that the random variable $Y$ can take. Hence it also has a conditional expected value $\E(X\mid Y=y)$. This conditional expected value of course depends on $y$; thus we can write $\E(X\mid Y=y) = g(y)$.
Then $g(Y)$ is a random variable, and we denote it $\E(X\mid Y)$.
As a concrete example, suppose five red marbles and three green marbles are in an urn, and you draw two of them without replacement. Let $Y$ be the number of red marbles obtained on the first draw (either $0$ or $1$) and $X$ the number on the second. Then $$ Y = \begin{cases} 0 & \text{with probability } \dfrac 3 8, \\[6pt] 1 & \text{with probability } \dfrac 5 8. \end{cases} $$
\begin{align} \E(X\mid Y=0) & = \frac 5 7, \\[10pt] \E(X\mid Y=1) & = \frac 4 7. \end{align} Therefore $$ \E(X\mid Y) = \begin{cases} \dfrac 5 7 & \text{with probability } \dfrac 3 8, \\[6pt] \dfrac 4 7 & \text{with probability } \dfrac 5 8. \end{cases} $$ That's what $\E(X\mid Y)$ means. And with that probability distribution of the random variable $\E(X\mid Y)$, you can find $\E(\E(X\mid Y))$.
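Carrying out that last computation shows the law of total expectation in action: $$ \E(\E(X\mid Y)) = \frac 3 8 \cdot \frac 5 7 + \frac 5 8 \cdot \frac 4 7 = \frac{15}{56} + \frac{20}{56} = \frac{35}{56} = \frac 5 8. $$ This agrees with $\E(X)$ computed directly: by symmetry, each of the eight marbles is equally likely to be the one drawn second, and five of them are red, so $\E(X) = \frac 5 8$. That is exactly the claim $\E(X) = \E(\E(X\mid Y))$: the conditional expectation $\E(X\mid Y)$ does depend on $Y$, but averaging it over the distribution of $Y$ recovers the unconditional expectation.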