Convolution formula, trouble with limits


Good day. I have been working on this for some time and keep stumbling. I think I have a grip on the concept, but my trouble seems to lie in the limits of integration. So here goes:

Given $X$ and $Y$, independent uniformly distributed random variables: $$f_X(x) = 1_{(0,1)}(x), \qquad f_Y(y) = 1_{(1,2)}(y),$$ with $$W = X + Y,$$ so that substituting $y = w - x$ gives $f_Y(w-x)$. My guess is $$f_W(w) = 1.$$

I drew a sketch to help, and it seems that integrating over $x$, my limits would be $0$ and $(w-y)$; integrating over $y$, my limits would be $1$ and $(w-x)$. Since the convolution formula $$f_W(w) = \int_{-\infty}^{\infty} f_X(x)\,f_Y(w-x)\,dx$$ is entirely in terms of $x$, I would integrate between $0$ and $(w-y)$.

I understand I am starting with marginal pdfs and need the joint pdf. I'm just not quite grasping the mechanics of the math. I should have four pieces to this entire thing, that is, four areas where $w$ is equal to $1$ and $0$ everywhere else. Or I'm very wrong. Suggestions, please?

**Edit:** @gt6989b If $X$ is uniform over $(0,1)$ and $Y$ is uniform over $(1,2)$, would my limits be different? I'm fairly certain that since this is supposed to be a joint distribution, $Y$ has to be taken into account.

**Edit Two:** Given the pdf (from below) $$f_W(w) = \int_{0}^{1} 1_{(w-2,w-1)}(x)\,dx,$$ I need to find $E(W)$ and $\operatorname{Var}(W)$.

There are two answers below.

You are saying $X,Y \sim \mathcal{U}(0,1)$ and $W = X+Y$, and you are looking for the pdf of $W$.

Note first that intuitively, if $x,y\in(0,1)$, then $x+y \in (0,2)$, so we expect $0 < W < 2$. More formally, $$ \begin{split} F_W(w) &= \mathbb{P}[W<w] = \mathbb{P}[X+Y<w]\\ &= \int_0^1 \lim_{\epsilon \to 0^+} \frac{\mathbb{P}[X+Y<w,\ |X-x|<\epsilon]}{2\epsilon} \ dx\\ &= \int_0^1 \lim_{\epsilon \to 0^+} \left( \mathbb{P}[Y<w-x] \cdot \frac{\mathbb{P}[|X-x|<\epsilon]}{2\epsilon} \right) dx \quad \text{(by independence)}\\ &= \int_0^1 \mathbb{P}[Y<w-x] \cdot \left( \lim_{\epsilon \to 0^+} \frac{\mathbb{P}[|X-x|<\epsilon]}{2\epsilon}\right) dx\\ &= \int_0^1 F_Y(w-x) f_X(x) \ dx, \end{split} $$ which is exactly your convolution: differentiating in $w$ gives $f_W(w) = \int_0^1 f_Y(w-x) f_X(x)\,dx$.
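As a numerical sanity check on this formula (a sketch, assuming both variables are uniform on $(0,1)$ as in this answer; the helper names `F_Y` and `F_W` are mine), we can approximate $F_W(w)=\int_0^1 F_Y(w-x)f_X(x)\,dx$ with a Riemann sum. By symmetry of $X+Y$ about $1$, we should find $F_W(1)\approx\tfrac12$:

```python
# Numerical check of F_W(w) = ∫_0^1 F_Y(w - x) f_X(x) dx
# for X, Y ~ U(0,1): f_X(x) = 1 on (0,1), and the cdf of Y is
# F_Y(t) = min(max(t, 0), 1).

def F_Y(t):
    """CDF of a U(0,1) random variable."""
    return min(max(t, 0.0), 1.0)

def F_W(w, n=100_000):
    """Riemann-sum (midpoint) approximation of the convolution integral."""
    dx = 1.0 / n
    return sum(F_Y(w - (i + 0.5) * dx) * dx for i in range(n))

print(F_W(1.0))  # by symmetry, should be close to 0.5
print(F_W(2.0))  # total mass: should be close to 1.0
```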

A second answer:

First, a brief rederivation of the convolution rule: because $X$ has a density, we can abuse notation to write \begin{align*} \mathbb{P}[W\leq w]&=\int_{\mathbb{R}}{\mathbb{P}[X+Y\leq w|X=x]\,d\mathbb{P}[X=x]} \\ &=\int_{\mathbb{R}}{\mathbb{P}[Y\leq w-x]\,d\mathbb{P}[X=x]} \end{align*} Letting $\Phi_A$ be the c.d.f. of a random variable $A$, we have $$\Phi_W(w)=\int_{\mathbb{R}}{\Phi_Y(w-x)\,d\Phi_X(x)}$$ Assuming our random variables have densities, we may differentiate under the integral sign by Leibniz's rule; letting $f_A$ be the density of $A$, we have $$f_W(w)=\int_{\mathbb{R}}{f_Y(w-x)f_X(x)\,dx}=(f_Y*f_X)(w)$$

Second, let's just plug in. Because MathJax doesn't do blackboard-bold "1"s very well, I'm going to use $\chi$ for characteristic functions. We have $$f_W(w)=\int_{\mathbb{R}}{\chi_{(1,2)}(w-x)\chi_{(0,1)}(x)\,dx}$$ The integrand vanishes when $x\notin(0,1)$, so we have $$f_W(w)=\int_0^1{\chi_{(1,2)}(w-x)\,dx}=\int_0^1{\chi_{(w-2,w-1)}(x)\,dx}$$
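The remaining integral is just the length of the overlap $(w-2,w-1)\cap(0,1)$, which is easy to spot-check numerically (a sketch; the helper name `f_W` is mine):

```python
def f_W(w):
    """Density of W = X + Y for X ~ U(0,1), Y ~ U(1,2):
    the length of the interval (w-2, w-1) ∩ (0, 1)."""
    lo = max(w - 2.0, 0.0)
    hi = min(w - 1.0, 1.0)
    return max(hi - lo, 0.0)

# Spot checks against the piecewise formula derived below:
print(f_W(1.5))            # on [1,2]: w - 1 = 0.5
print(f_W(2.0))            # peak of the triangle: 1.0
print(f_W(2.5))            # on [2,3]: 3 - w = 0.5
print(f_W(0.9), f_W(3.1))  # outside [1,3]: 0
```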

Now you have to break the problem into cases. It helps to draw pictures; I'm going to sweep the interval $w+(-2,-1) = (w-2,\,w-1)$ left-to-right as $w$ increases, paying attention to when its endpoints cross the endpoints of $(0,1)$.

If $w\leq1$, then the integrand always vanishes on the domain of integration. If $1\leq w\leq2$, then $(w-2,w-1)\cap(0,1)=(0,w-1)$, so $$f_W(w)=\int_0^{w-1}{dx}=w-1$$ If $2\leq w\leq 3$, then $(w-2,w-1)\cap(0,1)=(w-2,1)$, so $$f_W(w)=\int_{w-2}^1{dx}=3-w$$ And if $3\leq w$, then the integrand vanishes again.

Putting it all together, $$f_W(w)=\begin{cases} 0&w\notin[1,3] \\ w-1&w\in[1,2] \\ 3-w&w\in[2,3] \end{cases}$$
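Regarding the second edit (the mean and variance of $W$): by linearity, $E(W)=E(X)+E(Y)=\tfrac12+\tfrac32=2$, and by independence, $\operatorname{Var}(W)=\operatorname{Var}(X)+\operatorname{Var}(Y)=\tfrac1{12}+\tfrac1{12}=\tfrac16$. A quick Monte Carlo sketch (my own check, not part of the original answer) agrees:

```python
import random

# Simulate W = X + Y with X ~ U(0,1) and Y ~ U(1,2).
random.seed(0)
n = 200_000
samples = [random.random() + (1.0 + random.random()) for _ in range(n)]

mean = sum(samples) / n
var = sum((s - mean) ** 2 for s in samples) / n

print(mean)  # theory: E(W) = 2
print(var)   # theory: Var(W) = 1/6 ≈ 0.1667
```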