Find discrete random variables $Y,X$ such that $$E(X\vert Y)=E(X)\quad \text{and}\quad E(Y\vert X)\neq E(Y)$$
I'm pretty sure I can do this with indicator functions. So my attempt:
$X\sim Ber(1/4)\quad$ and $\quad Y=1_{X=1}$
$E(X|Y)=E(X)=1/4$
$$E(Y)=P(X=1)\neq E(Y|X=k)=\frac{E(1_{X=k}1_{X=1})}{P(X=k)}$$
Here I'm stuck. How do I compute $E(Y\vert X)$? What confuses me is that $1_{X=k}1_{X=1}=1_{\{X=k\}\cap\{X=1\}}=1_{k=1}$. So what is $P(k=1)$?
My second question concerns independence. I often struggle to see whether two random variables are independent or not.
For instance, suppose $Z=X+Y$ where $X,Y$ are independent. It's clear that $Z$ is not independent of $X$. But is $X$ independent of $Z$? If I write $X=Z-Y$, then it looks like $X$ depends on $Z$.
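To make the independence question concrete, here is a sketch that enumerates a toy case I made up (two independent fair coin flips) and checks whether $X$ and $Z=X+Y$ are independent:

```python
from itertools import product

# Toy example (my own choice): X, Y independent fair coin flips, Z = X + Y.
# Enumerate the joint distribution: each (x, y) pair has probability 1/4.
outcomes = [(x, y, 0.25) for x, y in product([0, 1], repeat=2)]

def p(event):
    """Total probability of the outcomes satisfying `event`."""
    return sum(w for x, y, w in outcomes if event(x, y))

p_x1 = p(lambda x, y: x == 1)                    # P(X = 1) = 1/2
p_z2 = p(lambda x, y: x + y == 2)                # P(Z = 2) = 1/4
p_both = p(lambda x, y: x == 1 and x + y == 2)   # P(X = 1, Z = 2) = 1/4

print(p_both, p_x1 * p_z2)  # 0.25 vs 0.125, so X and Z are dependent
```

Note that independence is symmetric: $X$ is independent of $Z$ iff $Z$ is independent of $X$. Here $P(X=1, Z=2)=\frac14 \ne \frac18 = P(X=1)P(Z=2)$, so they are dependent either way around.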
First, I'm going to explain conditional expectation using the random variables you've given.
So, $E[Y|X]$ is a random variable, but $E[Y|X=x_i]$ is a number, which I guess you already know is defined as $$E[Y|X=x_i] := \frac{E[Y1_{X=x_i}]}{P(X=x_i)}$$
In your case we have
$$E[Y|X] = E[Y|X=1]1_{X=1} + E[Y|X=0]1_{X=0}$$
while
$$E[Y] = P(X=1) = \frac14$$
If $X=1$, then
$$E[Y|X] = E[Y|X=1]1_{X=1} = \frac{E[Y\,1_{X=1}]}{P(X=1)}1_{X=1}$$
Since $Y = 1_{X=1}$, we have $Y\,1_{X=1} = 1_{X=1}$, so
$$E[Y|X] = \frac{E[1_{X=1}]}{1/4}(1) = \frac{1/4}{1/4} = 1 \ne E[Y]$$
If $X=0$, then
$$E[Y|X] = E[Y|X=0]1_{X=0} = \frac{E[Y\,1_{X=0}]}{P(X=0)}1_{X=0}$$
Since $Y\,1_{X=0} = 1_{X=1}1_{X=0} = 0$, we get
$$E[Y|X] = \frac{0}{3/4}(1) = 0 \ne E[Y]$$
Thus, we have $$E[Y] \ne E[Y|X]$$
In fact, we have $$E[Y|X] = Y$$
In advanced probability theory, this is because $Y$ is $\sigma(X)$-measurable.
In elementary probability theory (read: intuitively), this is because if we know $X$, then we know $Y$ so there is no 'expectation'.
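To double-check, here is a short Python sketch (the naming is my own) that computes $E[Y \mid X=k]$ exactly from the definition $E[Y|X=k] = E[Y 1_{X=k}]/P(X=k)$ on the two-point sample space:

```python
# Exact computation of E[Y | X = k] for X ~ Ber(1/4), Y = 1_{X=1}.
dist = {1: 0.25, 0: 0.75}          # P(X = k)
Y = lambda x: 1 if x == 1 else 0   # Y = 1_{X=1}

def cond_exp_Y_given(k):
    """E[Y | X = k] = E[Y 1_{X=k}] / P(X = k)."""
    num = sum(Y(x) * (1 if x == k else 0) * p for x, p in dist.items())
    return num / dist[k]

E_Y = sum(Y(x) * p for x, p in dist.items())      # E[Y] = 1/4
print(cond_exp_Y_given(1), cond_exp_Y_given(0))   # 1.0 0.0, i.e. E[Y|X] = Y
```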
As for $E[X|Y]$, we have
$$E[X|Y] = E[X|Y=1]1_{Y=1} + E[X|Y=0]1_{Y=0}$$
If $Y=1$, then
$$E[X|Y] = E[X|Y=1]1_{Y=1}$$
$$= \frac{E[X 1_{Y=1}]}{P(Y=1)}1_{Y=1}$$
$$= \frac{E[X 1_{X=1}]}{P(X=1)}(1) \quad (\text{since } 1_{Y=1} = 1_{X=1})$$
$$= \frac{E[1_{X=1}]}{1/4}(1) \quad (\text{since } X\,1_{X=1} = 1_{X=1})$$
$$= \frac{1/4}{1/4} = 1$$
If $Y=0$, then
$$E[X|Y] = E[X|Y=0]1_{Y=0}$$
$$= \frac{E[X 1_{Y=0}]}{P(Y=0)}1_{Y=0}$$
$$= \frac{E[X 1_{X=0}]}{3/4}(1) \quad (\text{since } 1_{Y=0} = 1_{X=0})$$
$$= \frac{E[0 \cdot 1_{X=0}]}{3/4}(1) \quad (\text{since } X = 0 \text{ on } \{X=0\})$$
$$= 0$$
Thus $E[X|Y]$ is not constant; in fact, $E[X|Y] = Y$.
I am not sure how to explain this intuitively.
Therefore, we do not have $E[X] = E[X|Y]$
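The same kind of exact check works for $E[X \mid Y]$ (again a sketch, with my own naming):

```python
# Exact computation of E[X | Y = y] for X ~ Ber(1/4), Y = 1_{X=1}.
dist = {1: 0.25, 0: 0.75}            # P(X = k); note Y(x) = 1_{x=1} = x here
Y = lambda x: 1 if x == 1 else 0

def cond_exp_X_given_Y(y):
    """E[X | Y = y] = E[X 1_{Y=y}] / P(Y = y)."""
    num = sum(x * (1 if Y(x) == y else 0) * p for x, p in dist.items())
    den = sum(p for x, p in dist.items() if Y(x) == y)
    return num / den

# Both values differ from E[X] = 1/4, so E[X|Y] != E[X].
print(cond_exp_X_given_Y(1), cond_exp_X_given_Y(0))  # 1.0 0.0
```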
Actually, the random variables $X$ and $Y$ are equal, if I'm not mistaken. At the very least they are almost surely equal, meaning $P(X = Y) = 1$.
Let me be a little more precise without using (too much?) advanced probability.
Suppose we have a (discrete) probability space $(\Omega, \mathbb P)$.
Let $X \sim \text{Ber}(\frac14)$. One such $X$ is $$X=1_A, \ \text{where } A \subseteq \Omega,\ P(A) = \frac14$$
Let $Y=1_{X=1}$.
I am claiming that if we collect all the sample points $\omega \in \Omega$ with $X(\omega) = Y(\omega)$ into the event $\{X = Y\} \subseteq \Omega$, then $\{X = Y\}$ is the entire sample space $\Omega$ itself, if I'm not mistaken. At the very least, $P(X = Y) = 1$.
Pf:
Consider any sample point $\omega$ in the sample space $\Omega$.
Case 1: $\omega \in A$
Then $X(\omega) = 1_A(\omega) = 1$
In other words,
$$\omega \in \{X = 1\}$$
$$\to Y(\omega) = 1_{X=1}(\omega) = 1$$
Thus $X=Y (=1)$ if $\omega \in A$.
Case 2: $\omega \notin A$
Then $X(\omega) = 1_A(\omega) = 0$
In other words,
$$\omega \notin \{X = 1\}$$
$$\to \omega \in \{X \ne 1\}$$
$$\to \omega \in \{X = 0\} \tag{1}$$
$$\to Y(\omega) = 1_{X=1}(\omega) = 0$$
Thus $X=Y (=0)$ if $\omega \notin A$.
QED...?
The possible flaw in the above reasoning is that an alternative definition of $X$ may make $(1)$ false. Consider a different $X$ s.t.
$$X=1_A + 10 \times 1_B$$
where $P(A)=\frac14$ and $P(B)=0$. Then $X$ technically could be $10$ for some $\omega$'s in $\Omega$, but there are too few of those sample points: $P(X=10)=P(\{X(\omega) = 10\})=P(\{\omega \mid X(\omega) = 10\})=0$.
I think all this depends on how we define our sample space. If we toss a 6-sided die, is a roll of 7 going to be part of the sample space $\Omega$ as follows:
$$\{\text{roll of 1}, ..., \text{roll of 6}, \text{roll of 7}\}$$
with $P(\text{roll of 7}) = 0$? Or do we say that
$$\Omega = \{\text{roll of 1}, ..., \text{roll of 6}\}$$
?
Here, $\text{roll of 7} \notin \Omega$, so the event $\{\text{roll of 7}\}$ is the empty event $\emptyset$, and hence $P(\text{roll of 7}) = 0$.
FYI, it can actually be shown that
$$E[X|Y] = Y \ \text{and} \ E[Y|X] = X \to P(X=Y) = 1$$
Proving this in general, without assuming $E[X^2], E[Y^2] < \infty$, is unbelievably complicated.
If we do assume square-integrability, we can show $E[(X-Y)^2] = 0$ (assuming we know $E[E[X|Y]]=E[X]$, which in turn assumes we know what '$E[E[X|Y]]$' means), which gives $X-Y=0 \ \text{a.s.}$
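In the square-integrable case, the computation can be sketched as follows (using the tower property, which I take as known here). From $E[X|Y]=Y$ and $E[Y|X]=X$,
$$E[XY] = E\bigl[E[XY \mid Y]\bigr] = E\bigl[Y\,E[X \mid Y]\bigr] = E[Y^2]$$
$$E[XY] = E\bigl[E[XY \mid X]\bigr] = E\bigl[X\,E[Y \mid X]\bigr] = E[X^2]$$
so
$$E[(X-Y)^2] = E[X^2] - 2E[XY] + E[Y^2] = E[XY] - 2E[XY] + E[XY] = 0$$
and hence $X=Y$ a.s.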
In your case, we can show that $E[X|Y] = Y$. We have so far only that $E[Y|X] = Y$. I'm not sure how we can show $E[Y|X] = X$ without using $X=Y$.
For completeness, I'm going to give an answer based on another answer.
Consider a discrete probability space $(\Omega, \mathbb P)$, and let $A_1, A_2, A_3$ be pairwise disjoint events ($A_i \subseteq \Omega$) s.t. $P(A_i) = \frac13$, so that they partition $\Omega$ (up to a null set).
Define $X=1_{A_1}-1_{A_2}+0 \times 1_{A_3}$ and $Y=1_{X=1} + 1_{X=-1}+0 \times 1_{X=0}$
Obviously, $E[X]=\frac13-\frac13=0$ and $E[Y]=P(X=1)+P(X=-1)=\frac23$.
It can be shown that
$$E[X|Y] = 0$$
Thus
$$E[X|Y] = E[X]$$
This means that knowledge of $Y$ doesn't change what we expect $X$ will be, on average:
If $Y=0$, then definitely $X=0$. So obviously on average, $X=0$. Precisely: $P(X=0 \mid Y=0)=1 \to E[X|Y=0] = 0$.
If $Y=1$, then $X$ is either $1$ or $-1$ but with equal probability so on average, $X=0$. Precisely: $P(X=1|Y=1) = P(X=-1|Y=1) \to E[X|Y=1] = 0$
Also,
$$E[Y|X] = 1_{X=1} + 1_{X=-1}$$
This means
$E[Y|X]$ is not constant and hence $\ne E[Y]$. Specifically, $E[Y|X]$ varies with $X$, meaning knowledge of $X$ changes what we expect $Y$ will be, on average:
$$E[Y|X] = Y$$
As in the earlier example, and as can be seen from the formula for $E[Y|X]$ above, if we know $X$, then we know $Y$, so there is no 'expectation'.
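For completeness, the same exact-computation sketch (my own naming) for this three-point example:

```python
# Three-point example: X in {1, -1, 0} each with probability 1/3,
# Y = 1_{X=1} + 1_{X=-1}.  Check E[X|Y] = 0 = E[X] while E[Y|X] varies.
dist = {1: 1/3, -1: 1/3, 0: 1/3}     # P(X = k)
Y = lambda x: 1 if x != 0 else 0     # Y = 1_{X=1} + 1_{X=-1}

def cond_exp(f, given, value):
    """E[f(X) | given(X) = value] by direct enumeration."""
    num = sum(f(x) * p for x, p in dist.items() if given(x) == value)
    den = sum(p for x, p in dist.items() if given(x) == value)
    return num / den

E_X = sum(x * p for x, p in dist.items())                        # 0.0
print(cond_exp(lambda x: x, Y, 1), cond_exp(lambda x: x, Y, 0))  # 0.0 0.0
print([cond_exp(Y, lambda x: x, k) for k in (1, -1, 0)])         # [1.0, 1.0, 0.0]
```

The first print confirms $E[X|Y]=0=E[X]$ for both values of $Y$; the second confirms $E[Y|X=k]$ takes the values $1,1,0$, i.e. $E[Y|X]=Y$, which is not constant and so differs from $E[Y]=\frac23$.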