I have a question about the problem mentioned above. The problem statement says:

$X$ has distribution $U(0,1)$ and $Y$ has distribution $U(0,X)$. Find $E(Y)$ and $Var(Y)$.
My attempt: I treat $E[Y|X]=X$ and $Var[Y|X]=X$ as random variables, so that
$E[Y]=E[E[Y|X]]=E[X]=\frac{1-0}{2}= \frac{1}{2}$
$Var(Y)=E[Var[Y|X]]+Var[E[Y|X]]= E[X]+Var[X]= \frac{1}{2} +\frac{1}{12} = \frac{7}{12}$
What do you think? If my attempt is wrong, how would you fix it?
Since you are already familiar with the law of total expectation and the law of total variance, the only flaw in your reasoning is the incorrect evaluation of the conditional expectation and variance of $Y \mid X$.
To understand how these are wrong, let's phrase the problem another way. Suppose instead $$Y \sim \operatorname{Uniform}(0,b)$$ where $b > 0$ is some fixed constant. How would you calculate $\operatorname{E}[Y]$ and $\operatorname{Var}[Y]$? Well, this is just the special case of a general continuous uniform random variable on the interval $(a,b)$, where $a = 0$. It is a straightforward exercise to show $$\operatorname{E}[Y] = \frac{0 + b}{2} = \frac{b}{2}, \\ \operatorname{Var}[Y] = \frac{(b - 0)^2}{12} = \frac{b^2}{12}.$$
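As a quick sanity check (not part of the original argument), a short Monte Carlo simulation for a fixed $b$ should recover both moments; the function name, sample size, and choice $b = 3$ below are arbitrary illustrative assumptions:

```python
import random

def uniform_moments(b, n=200_000, seed=42):
    """Estimate the mean and variance of Uniform(0, b) by simulation."""
    rng = random.Random(seed)
    ys = [rng.uniform(0, b) for _ in range(n)]
    mean = sum(ys) / n
    # population-style sample variance around the sample mean
    var = sum((y - mean) ** 2 for y in ys) / n
    return mean, var

# For b = 3 the theory predicts E[Y] = b/2 = 1.5 and Var[Y] = b^2/12 = 0.75.
mean, var = uniform_moments(b=3.0)
```

With $n$ this large, both estimates should land within a couple of hundredths of the theoretical values.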
So, you are told that the conditional distribution of $Y$ given $X$ is uniform on $(0,X)$. The fact that $X$ is itself a random variable changes nothing with respect to the expectation and variance of the conditional distribution. So we must have $$\operatorname{E}[Y \mid X] = \frac{X}{2}, \quad \operatorname{Var}[Y \mid X] = \frac{X^2}{12}.$$ All we did was replace $b$ with $X$, resulting in functions of the random variable $X$ rather than some fixed constant. Then, to compute the unconditional expectation and variance of $Y$, we compute the moments of the respective functions of $X$, in accordance with the laws you already know.
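Carrying the computation through (a sketch of the remaining arithmetic, using $\operatorname{E}[X] = \tfrac{1}{2}$, $\operatorname{E}[X^2] = \tfrac{1}{3}$, and $\operatorname{Var}[X] = \tfrac{1}{12}$ for $X \sim \operatorname{Uniform}(0,1)$):

$$\operatorname{E}[Y] = \operatorname{E}\!\left[\frac{X}{2}\right] = \frac{1}{4}, \\ \operatorname{Var}[Y] = \operatorname{E}\!\left[\frac{X^2}{12}\right] + \operatorname{Var}\!\left[\frac{X}{2}\right] = \frac{1}{36} + \frac{1}{48} = \frac{7}{144}.$$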