We said that two random variables $X,Y$ are independent iff we have that for $Z = X+Y$:
$$P_Z(B)=\int_{\mathbb{R}}P_X(B-s)\,dP_Y(s) = \int_{\mathbb{R}}P_Y(B-s)\,dP_X(s).$$ But I still don't get this notation. What does $B-s$ mean? It looks as if we are subtracting a number from a set, which does not make much sense. Can anybody here make this rigorous?
I would say the definition given by Lipschitz is more general. $P_Z$ denotes the law of $Z$, i.e. the probability measure that $Z$ induces on $\mathbb{R}$.
Assume $X,Y:\Omega \rightarrow \mathbb{R}$. When $X$ and $Y$ are absolutely continuous with respect to the usual Lebesgue measure, i.e. we can write $P_X(B) = \int_B f_X(x)\,dx$ for every Borel set $B\subset \mathbb{R}$, and similarly for $Y$, then independence of $X$ and $Y$ implies that the density of the sum is the convolution of the densities: $$f_Z(z)=\int_{\mathbb{R}} f_X(z-x)\, f_Y(x)\,dx.$$ (The converse does not hold in general: the density of the sum being this convolution does not force $X$ and $Y$ to be independent.)
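As a quick numerical sanity check (my own addition, not part of the answer above), one can approximate the convolution integral on a grid. For two independent standard normals the sum is $N(0,2)$, so the convolution of two $N(0,1)$ densities evaluated at a point should match the $N(0,2)$ density there:

```python
import numpy as np

# Sanity check: for independent X, Y ~ N(0, 1), the density of Z = X + Y
# is f_Z(z) = ∫ f_X(z - x) f_Y(x) dx, which should equal the N(0, 2) density.
def normal_pdf(x, var=1.0):
    return np.exp(-x**2 / (2 * var)) / np.sqrt(2 * np.pi * var)

x = np.linspace(-10, 10, 2001)   # integration grid (tails are negligible beyond ±10)
dx = x[1] - x[0]
z = 1.3                          # an arbitrary evaluation point

# Riemann-sum approximation of the convolution integral at z
f_Z = np.sum(normal_pdf(z - x) * normal_pdf(x)) * dx

print(f_Z)                       # numerical convolution
print(normal_pdf(z, var=2.0))    # exact N(0, 2) density, for comparison
```

The two printed values agree to many decimal places, since the integrand is smooth and decays rapidly.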
When we have two general random variables $X,Y$ with respective laws $P_X$, $P_Y$ and $Z:=X+Y$, independence of $X$ and $Y$ implies that the law of the sum is the convolution of the two measures, which is exactly what you wrote: take a Borel set $B\subset \mathbb{R}$; then the measure of this set, i.e. $P_Z(B):=P(\{\omega\in \Omega: \ Z(\omega)\in B\})$, is given by $$P_Z(B) = \int_{\mathbb{R}} P_X(B-s)\,P_Y(ds).$$ Here the notation "set minus point" means $B-s := \{x-s: \ x\in B\}$, which is again a set, and $P_X(B-s)$ is just the measure of this set; think of it as the set $B$ translated by $s$ units. For example, if $B=[a,b]\subset \mathbb{R}$, then $B-s=[a-s,b-s]$ and $P_X(B-s)=P(X\in [a-s,b-s])$.
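For discrete laws the integral becomes a sum over the atoms of $P_Y$, which makes the $B-s$ notation easy to check by hand. A small sketch (my own illustration, using two fair dice so that $Z$ is their total):

```python
from fractions import Fraction

# X, Y independent fair dice, Z = X + Y.
# Laws represented as dictionaries: value -> probability.
die = {k: Fraction(1, 6) for k in range(1, 7)}
P_X = P_Y = die

B = {4, 5}  # an event for Z

def P(law, A):
    """Measure of the set A under the given discrete law."""
    return sum(p for v, p in law.items() if v in A)

# Convolution formula: P_Z(B) = sum_s P_X(B - s) * P_Y({s}),
# where B - s = {x - s : x in B} is the translated set.
P_Z_B = sum(P(P_X, {x - s for x in B}) * p_s for s, p_s in P_Y.items())

# Direct enumeration over the joint distribution, for comparison
direct = sum(P_X[x] * P_Y[y] for x in P_X for y in P_Y if x + y in B)

print(P_Z_B, direct)  # the two values coincide
```

Here $P(Z\in\{4,5\})$ counts the $3+4=7$ of the $36$ equally likely outcomes, and the convolution sum reproduces exactly that, translating $B$ by each possible value $s$ of $Y$.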