I would like to show:
Let $X,Y$ be independent RVs. If there exists $c \in \mathbb R$ s.t. $P(X+Y=c)=1$, then $X$ and $Y$ are constant a.s.
What I tried:
Since $X,Y$ are independent,
$$1=P(X+Y=c)=\iint 1_{\{x+y=c\}} \, d \mu_x \, d \mu_y = \int P(X=c-y) \, d \mu_y,$$
but I get nowhere from here.
If $X$ and $Y$ are square integrable (*), then we may consider $\operatorname{Var}(X+Y)$:
$$0 = \operatorname{Var}(X+Y) = \operatorname{Var}(X)+\operatorname{Var}(Y) \implies \operatorname{Var}(X)=\operatorname{Var}(Y)=0,$$
so $X$ and $Y$ are constant a.s.
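As a quick numerical sanity check of the variance identity used here (my own illustration, not part of the proof; the distributions and sample size are arbitrary choices):

```python
import numpy as np

# Sanity check: for independent X, Y, Var(X+Y) = Var(X) + Var(Y).
# The distributions below are arbitrary illustrative choices.
rng = np.random.default_rng(0)
n = 1_000_000
X = rng.exponential(scale=2.0, size=n)      # Var(X) = 4
Y = rng.normal(loc=1.0, scale=3.0, size=n)  # Var(Y) = 9

lhs = np.var(X + Y)
rhs = np.var(X) + np.var(Y)
print(lhs, rhs)  # both close to 13
```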
Otherwise:
Two integrals for two random variables!
$$1=P(X+Y=c)=\iint 1_{\{x+y=c\}} \, d \mu_x \, d \mu_y $$
$$=\int \left( \int 1_{\{x=c-y\}} \, d \mu_x \right) d \mu_y $$
$$=\int \mu_x(x=c-y) \, d \mu_y. $$
Since $0 \le \mu_x(x=c-y) \le 1$ and the integral equals $1$, we must have $\mu_x(x=c-y)=1$ for $\mu_y$-a.e. $y$. Fix such a $y_0$ and set $d := c-y_0 \in \operatorname{Range}(X)$; then
$$\mu_x(x=d)=1,$$
i.e. $X=d$ a.s., and hence $Y=c-X=c-d$ a.s.:
$$\to 1 =\mu_x(x=d) = \mu_y(y=c-d)$$
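A small discrete sanity check of this conclusion (my own illustration, not part of the proof): for independent discrete $X, Y$ with pmfs $p_X, p_Y$, independence gives $P(X+Y=c)=\sum_d p_X(d)\,p_Y(c-d)$, and the sum can only reach $1$ when both laws are point masses.

```python
from fractions import Fraction as F

def prob_sum_equals(p_x, p_y, c):
    """P(X+Y=c) for independent discrete X, Y given as {value: prob} dicts."""
    return sum(px * p_y.get(c - x, F(0)) for x, px in p_x.items())

# Two fair coins, target c = 1: non-degenerate laws stay strictly below 1.
p_x = {0: F(1, 2), 1: F(1, 2)}
p_y = {0: F(1, 2), 1: F(1, 2)}
print(prob_sum_equals(p_x, p_y, 1))  # 1/2

# Point masses X = 3, Y = 4 do achieve P(X+Y=7) = 1.
print(prob_sum_equals({3: F(1)}, {4: F(1)}, 7))  # 1
```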
(*) Hmmm... I guess if $Z=c$ a.s. then $E[Z]$, $E[|Z|]$, $E[Z^2]$, and $\operatorname{Var}(Z)$ are all finite.
But if $\exists$ independent $X, Y$ s.t. $Z=X+Y$, does that mean that $X$ and $Y$ are square integrable? I was thinking $\infty - \infty$, but I guess that's undefined. Note that expanding $E[(X+Y)^2]$ term by term already presupposes the finiteness we are after, so the clean route is Fubini: by independence,
$$c^2 = E[(X+Y)^2] = \iint (x+y)^2 \, d\mu_x \, d\mu_y < \infty,$$
so for $\mu_y$-a.e. fixed $y$ we have $\int (x+y)^2 \, d\mu_x < \infty$; since $x^2 \le 2(x+y)^2 + 2y^2$, this gives $E[X^2] < \infty$, and symmetrically $E[Y^2] < \infty$. Then every term in
$$c^2 = E[(X+Y)^2] = E[X^2] + 2E[X]E[Y] + E[Y^2]$$
is finite, where $E[XY]=E[X]E[Y]$ by independence.
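And a numerical check of the moment expansion above (again my own illustration; the distributions are arbitrary choices):

```python
import numpy as np

# Check E[(X+Y)^2] = E[X^2] + 2 E[X] E[Y] + E[Y^2] for independent X, Y.
# The distributions below are arbitrary illustrative choices.
rng = np.random.default_rng(1)
n = 1_000_000
X = rng.uniform(-1.0, 3.0, size=n)     # E[X] = 1,   E[X^2] = 7/3
Y = rng.exponential(scale=1.5, size=n)  # E[Y] = 1.5, E[Y^2] = 4.5

lhs = np.mean((X + Y) ** 2)
rhs = np.mean(X ** 2) + 2 * np.mean(X) * np.mean(Y) + np.mean(Y ** 2)
print(lhs, rhs)  # both close to 7/3 + 3 + 4.5 ≈ 9.833
```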