I am wondering whether $E[X\vert Y,Z]$ is a function of $Y$ and $Z$ only. (That is, given that we don't know anything about $X$ besides that it is a random variable.)
I am thinking yes, and my reasoning is that if $X$ is some function (a random variable), then it is a function of $Y,Z$ and possibly other stuff. Once I condition on $Y,Z$ and take the expectation, though, anything that is not $Y,Z$ (the "possibly other stuff") does not matter anymore, because taking the expected value over it produces a constant.
That is, we integrate over all possible values of whatever is not $Y,Z$, and therefore all terms besides $Y,Z$ go away when we take $E[X\vert Y,Z]$.
My motivation for the question is that if, for example, $X = 2Y + Z + W^2$ where $W$ is also a random variable, then $$E[X\vert Y,Z] = E[2Y+Z+W^2\vert Y,Z] = 2Y + Z + E[W^2\vert Y,Z],$$ and I think $E[W^2\vert Y,Z]$ is just a constant after we compute the expectation (a sum or an integral).
What makes me second-guess that the answer is "yes" is that I am unsure whether the bounds of the integral $$E[W^2\vert Y,Z]=\int_a^b w^2 f_{W\vert Y,Z}(w)\, dw$$ might depend on some variable besides the values of $Y$ and $Z$.
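As a quick numerical sanity check (a sketch with invented toy distributions, using only the standard library), one can estimate $E[W^2\vert Y,Z]$ by grouping Monte Carlo samples on the pair $(Y,Z)$ and averaging within each cell:

```python
import random
from collections import defaultdict

random.seed(0)

# Invented toy setup, purely for illustration: Y and Z are fair coin
# flips, and W = Y + Gaussian noise, so W is NOT independent of Y.
samples = defaultdict(list)
for _ in range(200_000):
    y = random.choice([0, 1])
    z = random.choice([0, 1])
    w = y + random.gauss(0, 1)          # W depends on Y
    samples[(y, z)].append(w ** 2)

# Empirical E[W^2 | Y=y, Z=z]: one number per (y, z) cell, i.e. a
# deterministic function of (y, z) and of nothing else.
cond_mean = {yz: sum(ws) / len(ws) for yz, ws in samples.items()}
for yz in sorted(cond_mean):
    print(yz, round(cond_mean[yz], 2))
```

In this toy setup the cells with $y=0$ cluster near $1$ and those with $y=1$ near $2$: the quantity $E[W^2\vert Y,Z]$ need not be a single constant, but it is still a deterministic function of the pair $(y,z)$ alone.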
Thanks.
The answer is yes, of course. Recall the definition of conditional expectation: given an integrable random variable $X$ on $(\Omega,\mathcal F,\mathbb P)$ and a sub-$\sigma$-algebra $\mathcal A\subseteq\mathcal F$, the conditional expectation $\mathbb E[X\mid\mathcal A]$ is the (a.s. unique) $\mathcal A$-measurable random variable satisfying $$\mathbb E\big[\mathbb E[X\mid\mathcal A]\,\mathbf 1_A\big]=\mathbb E[X\,\mathbf 1_A]\quad\text{for every }A\in\mathcal A.$$
Existence and uniqueness in the $\mathcal L^2$ case are a consequence of the existence of orthogonal projections, while the general case is obtained by approximation.
The other crucial part needed to show what you are saying is the Doob–Dynkin lemma: if $S$ and $T$ are random variables and $T$ is $\sigma(S)$-measurable, then there exists a measurable function $f$ such that $T=f\circ S$.
To prove your claim for one random variable, just observe that if you set $\mathcal A=\sigma (Y)$, then $\mathbb E[X\mid \mathcal A]$ is $\sigma(Y)$-measurable by definition. As a consequence of the theorem there exists a function $f$ such that $$\mathbb E[X\mid \mathcal A]=\mathbb E[X\mid Y]=f \circ Y,$$ so the conditional expectation depends solely on $Y$. For your setting, take $\mathcal A=\sigma(Y,Z)$ instead: applying the theorem to the pair $(Y,Z)$ yields a measurable $g$ with $\mathbb E[X\mid Y,Z]=g(Y,Z)$, so $\mathbb E[X\mid Y,Z]$ is indeed a function of $Y$ and $Z$ only (in general a genuine function of them, not a constant).
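To make the factorization concrete, here is a minimal exact computation (a sketch; the finite joint distribution below is invented for illustration) for the question's example $X = 2Y + Z + W^2$, showing that $\mathbb E[X\mid Y,Z]$ is literally a table indexed by $(y,z)$, i.e. a function $f(Y,Z)$:

```python
from itertools import product
from fractions import Fraction

# A tiny finite model (invented for illustration): Y, Z, W each take
# values 0 or 1; Z is independent, while W is correlated with Y through
# P(W = y | Y = y) = 3/4.
prob = {}
for y, z, w in product([0, 1], repeat=3):
    p = Fraction(1, 2) * Fraction(1, 2) * (Fraction(3, 4) if w == y else Fraction(1, 4))
    prob[(y, z, w)] = p

def X(y, z, w):
    # X = 2Y + Z + W^2, as in the question.
    return 2 * y + z + w ** 2

# E[X | Y = y, Z = z]: exact weighted average over the remaining
# variable w -- everything that is not (Y, Z) is integrated out.
f = {}
for y, z in product([0, 1], repeat=2):
    num = sum(prob[(y, z, w)] * X(y, z, w) for w in [0, 1])
    den = sum(prob[(y, z, w)] for w in [0, 1])
    f[(y, z)] = num / den

# The conditional expectation is the random variable omega -> f(Y, Z):
# it is completely determined by the pair (y, z), exactly as the
# factorization E[X | Y, Z] = f(Y, Z) predicts.
print(f)
```

Note that in this toy model the $E[W^2\mid Y,Z]$ part of each entry is $1/4$ when $y=0$ and $3/4$ when $y=1$: a function of $(y,z)$ rather than a constant, which is exactly what the measurability argument guarantees.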