Show an example of random variables $X$ and $Y$ such that $X$ and $Y$ are not independent but still $$\textbf{E}(X\mid Y) = \textbf{E}X$$
I tried the simplest discrete distributions: I looked for $p_1, p_2, p_3, p_4$ such that $$\textbf{P}(X = 0, Y = 0) = p_1,\quad \textbf{P}(X = 0, Y = 1) = p_2\\ \textbf{P}(X = 1, Y = 0) = p_3, \quad\textbf{P}(X = 1, Y = 1) = p_4$$ and $\textbf{E}(X\mid Y) = \textbf{E}X = p_3 + p_4$, but every choice of $p_1,\dots,p_4$ forced the independence of $X$ and $Y$.
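(For the record, with two $\{0,1\}$-valued variables this failure is unavoidable; a short computation in the notation above shows why. The conditional means are
$$\textbf{E}(X\mid Y=0)=\frac{p_3}{p_1+p_3},\qquad \textbf{E}(X\mid Y=1)=\frac{p_4}{p_2+p_4},$$
so requiring both to equal $\textbf{E}X = p_3+p_4$ says exactly that $\textbf{P}(X=1\mid Y=y)=\textbf{P}(X=1)$ for $y\in\{0,1\}$, i.e. $\textbf{P}(X=1,Y=y)=\textbf{P}(X=1)\,\textbf{P}(Y=y)$. Taking complements gives the same identity for $X=0$, and since each variable takes only two values, this is full independence. So a counterexample needs $X$ to take more than two values.)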
Simplest example: let $Y$ be Bernoulli, taking the values $0$ and $1$ each with probability $\frac{1}{2}$. If $Y = 0$, let $X$ be Bernoulli$\left(\frac{1}{2}\right)$ as well; if $Y=1$, let $X$ be uniform on $(0,1)$.
Clearly $X$ is not independent of $Y$: its conditional distribution is discrete when $Y=0$ and continuous when $Y=1$. Yet $$ E(X\mid Y=0) = E(X\mid Y=1) = \frac{1}{2}, $$ so $E(X\mid Y) = \frac{1}{2} = EX$.
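If you want to see this numerically, here is a quick Monte Carlo sketch of the example above (names like `cond_mean` are my own; only the two conditional distributions come from the answer):

```python
import random

random.seed(0)
n = 200_000
xs, ys = [], []
for _ in range(n):
    y = random.randint(0, 1)                 # Y ~ Bernoulli(1/2)
    if y == 0:
        x = float(random.randint(0, 1))      # X | Y=0 ~ Bernoulli(1/2)
    else:
        x = random.random()                  # X | Y=1 ~ Uniform(0,1)
    xs.append(x)
    ys.append(y)

def cond_mean(target):
    """Sample mean of X over the draws with Y == target."""
    vals = [x for x, y in zip(xs, ys) if y == target]
    return sum(vals) / len(vals)

m0, m1 = cond_mean(0), cond_mean(1)
print(m0, m1)  # both conditional means are close to 1/2

# Dependence is visible in the conditional distributions themselves:
# given Y=0, X only ever takes the values 0 and 1.
only_binary = all(x in (0.0, 1.0) for x, y in zip(xs, ys) if y == 0)
print(only_binary)
```

Both printed conditional means hover near $\frac{1}{2}$, while the `only_binary` check confirms the conditional laws differ, i.e. the variables are dependent.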