How does one make sense of $E[X \mid Y]$ where $X$ and $Y$ are independent random variables?
First, consider a particular value $y$ that $Y$ takes:
$$ \begin{align} E[X \mid Y = y] &= \sum_x x P(X = x \mid Y = y) \\ &= \sum_x x f_{X\mid Y}(x, y) \\ &= \sum_x x \frac{P(X = x, Y = y)}{P(Y = y)} \\ &= \sum_x x P(X = x) && \text{(by independence)} \\ &= E[X] \end{align} $$
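The computation above can also be checked numerically. A minimal sketch, where the choice of distributions (a fair die for $X$, a fair coin for $Y$) is purely illustrative:

```python
import random
from statistics import mean

random.seed(0)
n = 200_000

# Independent draws: X is a fair die, Y is a fair coin (illustrative choices)
pairs = [(random.randint(1, 6), random.randint(0, 1)) for _ in range(n)]

e_x = mean(x for x, _ in pairs)                 # unconditional E[X]
cond = {y: mean(x for x, v in pairs if v == y)  # empirical E[X | Y = y]
        for y in (0, 1)}

print(e_x, cond)
```

Both conditional means should come out close to the unconditional mean $E[X] = 3.5$, illustrating that conditioning on an independent $Y$ changes nothing.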
How do we go about proving that $E[X \mid Y] = E[X]$, i.e., that this holds for any value $y$? It doesn't really make sense to write $P(Y)$. If we proceed with a proof similar to the one above,
$$ \begin{align} E[X | Y] &= \sum_x x P(X = x | Y) \\ &= \sum_x x f_{X|Y}(x) \\ &= \sum_x x \frac{P(X = x, Y)}{P(Y)} \\ \end{align} $$
I am stuck at this point. Thanks.
$\mathbb E[X\mid Y]$ is a random variable that is measurable with respect to the $\sigma$-algebra generated by the random variable $Y$.
That means that a Borel-measurable function $f:\mathbb R\to\mathbb R$ exists with $\mathbb E[X\mid Y]=f(Y)$.
So what you actually found in the case of independent $X$ and $Y$ is: $$f(y)=\mathbb E[X\mid Y=y]=\mathbb EX$$i.e. the function $f$ is constant and takes the value $\mathbb EX$ for every $y$.
If $Z$ is a degenerate random variable with $Z(\omega)=c$ for every $\omega\in\Omega$, then this can be expressed as: $$Z=c$$Applying this here we get:$$\mathbb E[X\mid Y]=\mathbb EX$$where the LHS is a random variable and the RHS is an element of $\mathbb R$.
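For completeness, here is a sketch of how this can be verified directly against the defining property of conditional expectation: a $\sigma(Y)$-measurable random variable $Z$ equals $\mathbb E[X\mid Y]$ iff $\mathbb E[X\,\mathbf 1_A]=\mathbb E[Z\,\mathbf 1_A]$ for every $A\in\sigma(Y)$. Taking the constant candidate $Z=\mathbb EX$ and $A=\{Y\in B\}$ for a Borel set $B$:

$$ \begin{align} \mathbb E\left[X\,\mathbf 1_{\{Y\in B\}}\right] &= \mathbb EX \cdot \mathbb E\left[\mathbf 1_{\{Y\in B\}}\right] \\ &= \mathbb E\left[\mathbb EX\cdot \mathbf 1_{\{Y\in B\}}\right], \end{align} $$

where the first equality uses that $X$ and $\mathbf 1_{\{Y\in B\}}$ are independent (the indicator is a function of $Y$). Since this holds for every such $A$, the constant $\mathbb EX$ is (a version of) $\mathbb E[X\mid Y]$.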