Suppose that $X_1, X_2 \sim U(0,10)$ are independent continuous random variables. How would I derive the probability density function of $X_1 + X_2$? Clearly we have $$ f_{X_1}(x) = f_{X_2}(x) = 0.1 \quad \text{for } 0 \leq x \leq 10 $$ (and $0$ otherwise), and so $$ f_{X_1+X_2}(x) = \int_{-\infty}^\infty f_{X_1}(s) \hspace{1mm} f_{X_2}(x-s) \hspace{2mm} ds $$ (if anyone knows what this rule is called, if it has a name, please let me know).
Now, since $f_{X_1}(x) = 0.1$ if $0 \leq x \leq 10$ and is $0$ otherwise, this integral can be simplified to $$ f_{X_1+X_2}(x) = 0.1 \int_0^{10} f_{X_2}(x-s) \hspace{2mm} ds $$
This is where I struggle to see what should happen next. Would anyone be able to help me to understand how to proceed from here?
Cheers.
Let $Y=X_1+X_2$. (The rule you applied is the convolution formula for the density of a sum of independent random variables.)
$f_Y(y)=0.1 \int_0^{10} f_{X_2}(y-s) ds$
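As a quick sanity check (a sketch I'm adding, not part of the original derivation), the convolution can be approximated numerically on a grid. For two $U(0,10)$ densities, the resulting density of $Y$ should peak at $y=10$ with value $0.1$:

```python
import numpy as np

# Grid over the support of Y = X1 + X2, which is [0, 20].
ds = 0.01
s = np.arange(0, 20, ds)

def f_X(x):
    # Density of U(0, 10): 0.1 on [0, 10], 0 otherwise.
    return np.where((x >= 0) & (x <= 10), 0.1, 0.0)

def f_Y(y):
    # Riemann-sum approximation of the convolution
    # integral of f_X(s) * f_X(y - s) over s.
    return np.sum(f_X(s) * f_X(y - s)) * ds

# Peak of the triangle at y = 10, linear ramps on either side.
print(f_Y(10.0))  # approximately 0.1
print(f_Y(5.0))   # approximately 0.05
```

This is only a numerical approximation (the grid step `ds` controls the accuracy), but it is a useful way to confirm the piecewise answer worked out below by hand.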
We need to determine the region where the integrand is nonzero: $$0<y<20, \quad 0<s<10, \quad 0<y-s<10.$$
Split into two cases, $y\leq 10$ and $10<y<20$: \begin{eqnarray} f_Y(y)&=&\left\{ \begin{array}{cc} 0.1 \int_0^{y} f_{X_2}(y-s)\, ds & 0\leq y\leq 10 \\ 0.1 \int_{y-10}^{10} f_{X_2}(y-s)\, ds & 10<y<20 \\ 0 & \text{otherwise} \end{array} \right. \\ &=&\left\{ \begin{array}{cc} 0.1 \int_0^{y} 0.1\, ds & 0\leq y\leq 10 \\ 0.1 \int_{y-10}^{10} 0.1\, ds & 10<y<20 \\ 0 & \text{otherwise} \end{array} \right. \end{eqnarray}
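Evaluating these integrals (each integrand is the constant $0.01$, so each integral is $0.01$ times the length of the interval of integration) gives the well-known triangular density:

```latex
f_Y(y) =
\begin{cases}
0.01\,y & 0 \leq y \leq 10, \\
0.01\,(20 - y) & 10 < y < 20, \\
0 & \text{otherwise.}
\end{cases}
```

As a check, the two pieces agree at $y=10$ (both give $0.1$), and the total area is twice that of a triangle with base $10$ and height $0.1$, i.e. $2 \cdot \tfrac{1}{2} \cdot 10 \cdot 0.1 = 1$.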