- How does the 'integrability' of a random variable affect 'conditioning' on a random variable, or vice versa?
- When do we treat one random variable as a constant, and when as a random variable, in the expectation of a product of random variables?
- Is there any example where the law of total expectation is not applicable?
I was looking at this and this page and was trying to figure out the answers to my questions.
Here are the formulas of conditional expectation: $\mathbb{E}[\mathbb{E}[X\mid Y]]=\mathbb{E}[X]$ and $\mathbb{E}[\mathbb{E}[Y\mid X]]=\mathbb{E}[Y]$, where we condition on the random variables $Y$ and $X$ respectively, and 'conditioning' means that our beliefs depend on the available information. Now here is how I interpret my answer to the first question:
We know
$$\mathbb{E}[Y]=\begin{cases} \sum_i y_i\,\mathbb{P}(Y=y_i) & \text{for the discrete case} \\ \int y\, f_Y(y)\,dy & \text{for the continuous case} \end{cases}$$
Similarly,
$$\mathbb{E}[Y\mid X=x]=\begin{cases} \sum_i y_i\, p_{Y\mid X}(y_i\mid x) & \text{for the discrete case} \\ \int y\, f_{Y\mid X}(y\mid x)\,dy & \text{for the continuous case} \end{cases}$$
Integrable random variables are those r.v.s with finite expectation, $\mathbb{E}|Y|< \infty$. Hence $\mathbb{E}[Y\mid X]$ is finite, and if a specific value of $X$ is given, the conditional expectation is a deterministic number; otherwise, when the conditioning on $X$ is left unspecified, the conditional expectation is itself a random quantity. In the discrete case, the conditioning event must have non-zero probability for the conditional expectation to be defined.
To answer the next question, I will look at these examples. If I set $Z=\mathbb{E}[Y\mid X]$, where $Z$ is a random variable, then
$$\mathbb{E}[X\cdot\mathbb{E}(Y\mid X)]=\begin{cases} \mathbb{E}[XZ] & \text{if } Z=\mathbb{E}[Y\mid X]\\ \mathbb{E}[X]\cdot \mathbb{E}[Y] & \text{if } Z=\mathbb{E}[Y\mid X]=c \text{ for a constant } c \end{cases} \quad (\star)$$
I think my first question is connected here, because the algebra in $(\star)$ depends on how fully the condition on the target event is specified.
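Whatever the degree of specification, the identity $\mathbb{E}[X\cdot\mathbb{E}(Y\mid X)]=\mathbb{E}[XY]$ always holds (this is the "pulling out known factors" property combined with the tower rule). A quick exact check with a small joint pmf of my own choosing (the numbers are illustrative, not from the question):

```python
from fractions import Fraction as F

# Hypothetical joint pmf p(x, y) on {0,1} x {1,2} (illustration only)
pmf = {(0, 1): F(1, 4), (0, 2): F(1, 4), (1, 1): F(1, 8), (1, 2): F(3, 8)}

def marginal_x(x):
    return sum(p for (xi, _), p in pmf.items() if xi == x)

def cond_exp_y_given(x):
    # E[Y | X = x] = sum_y y * p(x, y) / P(X = x)
    return sum(y * p for (xi, y), p in pmf.items() if xi == x) / marginal_x(x)

# Left side: E[X * E[Y|X]], treating E[Y|X] as the random variable g(X)
lhs = sum(x * cond_exp_y_given(x) * marginal_x(x) for x in {0, 1})
# Right side: E[XY], computed directly from the joint pmf
rhs = sum(x * y * p for (x, y), p in pmf.items())
print(lhs, rhs)  # → 7/8 7/8
```

Here $\mathbb{E}[Y\mid X]$ is not constant ($3/2$ at $x=0$, $7/4$ at $x=1$), yet the identity still holds; the factorization $\mathbb{E}[X]\cdot\mathbb{E}[Y]$ is the special case where it is constant.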
For the third question: if I take $X$ to denote the probability of observing a single point of a continuous distribution, will the law of total expectation still hold?
- I am not satisfied with my interpretations, so would you kindly explain the facts that lead to the answers to my questions, with complete interpretations? Also, kindly correct me wherever I am wrong.
Any help or explanation is valuable and highly appreciated.
This is a quick summary of my comments: Let $X$ and $Y$ be random variables. Then $E[Y|X=x]$ is a number (or you can think of it as a function of $x$) whereas $E[Y|X]$ is a random variable (it is a function of $X$, hence, a random variable). You can define $E[Y|X]$ this way:
Define $g(x) = E[Y|X=x]$ for each value $x$ that the random variable $X$ can take.
Define $E[Y|X]$ as the random variable $g(X)$.
The law of total expectation, often called the law of iterated expectations, is $E[Y]=E[E[Y|X]]$ and can equally be written:
Continuous case: $E[Y] = \int_{-\infty}^{\infty} E[Y|X=x]f_X(x)dx$.
Discrete case: $E[Y] = \sum_{x \in A} E[Y|X=x]P[X=x]$ (where $A$ is the set of values that $X$ can take).
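The discrete formula can be verified exactly. A minimal sketch in Python, using a hypothetical joint pmf of my own choosing (exact rationals via `fractions` to avoid rounding):

```python
from fractions import Fraction as F

# Hypothetical joint pmf of (X, Y) on {0,1} x {0,1} -- illustration only
pmf = {(0, 0): F(1, 6), (0, 1): F(1, 3), (1, 0): F(1, 4), (1, 1): F(1, 4)}

def p_x(x):
    # Marginal P[X = x]
    return sum(p for (xi, _), p in pmf.items() if xi == x)

def e_y_given(x):
    # E[Y | X = x] from the joint pmf
    return sum(y * p for (xi, y), p in pmf.items() if xi == x) / p_x(x)

# Direct expectation vs. the law of total expectation
direct = sum(y * p for (_, y), p in pmf.items())
iterated = sum(e_y_given(x) * p_x(x) for x in {0, 1})
print(direct, iterated)  # → 7/12 7/12
```

The two computations agree, as the law guarantees whenever $E[|Y|]<\infty$.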
The "integrability" condition $E[|Y|]<\infty$ is just a detail that ensures the expectation of $Y$ is well defined and finite.
Example:
Let $Y=X+U$ where $X, U$ are independent, $X \sim Bernoulli(1/2)$, and $U\sim Uniform[0,1]$. Then
$E[Y|X=0] = 0.5$.
$E[Y|X=1] = 1.5$.
$g(x) = E[Y|X=x] = x + 0.5$ for $x \in \{0,1\}$.
$g(X) = E[Y|X] = X + 0.5$. [this is a random variable]
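A quick Monte Carlo check of this example (the simulation is a sketch I am adding, following the construction above):

```python
import random

random.seed(0)
n = 200_000

# Simulate X ~ Bernoulli(1/2), U ~ Uniform[0,1], Y = X + U
samples = []
for _ in range(n):
    x = random.randint(0, 1)
    samples.append((x, x + random.random()))

def cond_mean(x0):
    # Sample analogue of E[Y | X = x0]: average Y over draws with X = x0
    ys = [y for x, y in samples if x == x0]
    return sum(ys) / len(ys)

mean_y = sum(y for _, y in samples) / n
print(cond_mean(0))  # ~ 0.5
print(cond_mean(1))  # ~ 1.5
print(mean_y)        # ~ 1.0, matching E[E[Y|X]] = E[X + 0.5] = 1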