Let $X, Y$ be two independent random variables such that $X \sim \mathrm{Bernoulli}(p)$ and $Y \sim \mathrm{Binomial}(n,p)$.
Let $S = X + Y$; then $S \sim \mathrm{Binomial}(n+1,p)$.
How do you find the joint distribution of $(X,S)$?
Here's my attempt (please check it for mistakes):
$$\begin{align} \mathbb{P}(X=x,S=s) & = \mathbb{P}(X=x,Y=s-x)\\ &= \mathbb{P}(X=x,Y=y) \; \color{red}{\text{is this step correct?}} \\ & = \mathbb{P}(X=x)\mathbb{P}(Y=y) \; \color{red}{\text{because they're independent}}\\ &= p^x(1-p)^{1-x} {{n}\choose{y}}p^{y}(1-p)^{n-y} \\ &=p^x(1-p)^{1-x} {{n}\choose{s-x}}p^{s-x}(1-p)^{n-s+x} \\ &= {{n}\choose{s-x}}p^{s}(1-p)^{n-s+1} \end{align}$$
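The final expression can be sanity-checked numerically: it should agree with the direct product $\mathbb{P}(X=x)\,\mathbb{P}(Y=s-x)$ for every valid $(x,s)$, and the joint probabilities should sum to $1$. A quick sketch in Python (the values of $n$ and $p$ are arbitrary choices for the check):

```python
from math import comb

def joint_pmf(x, s, n, p):
    # Closed form from the derivation: C(n, s-x) p^s (1-p)^(n+1-s),
    # valid when x is in {0,1} and 0 <= s-x <= n; zero otherwise.
    if x not in (0, 1) or not (0 <= s - x <= n):
        return 0.0
    return comb(n, s - x) * p**s * (1 - p)**(n + 1 - s)

def joint_direct(x, s, n, p):
    # Direct product P(X=x) * P(Y=s-x), using independence of X and Y.
    y = s - x
    if x not in (0, 1) or not (0 <= y <= n):
        return 0.0
    px = p**x * (1 - p)**(1 - x)
    py = comb(n, y) * p**y * (1 - p)**(n - y)
    return px * py

n, p = 7, 0.3
total = 0.0
for x in (0, 1):
    for s in range(0, n + 2):
        a, b = joint_pmf(x, s, n, p), joint_direct(x, s, n, p)
        assert abs(a - b) < 1e-12  # closed form matches the product form
        total += a
assert abs(total - 1.0) < 1e-12  # probabilities sum to 1
print("closed form matches product form; total probability =", round(total, 6))
```

Both checks pass, which supports the algebra above (though of course it is not a proof).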
I'm also interested in other methods of solving this. Any comments or hints will be appreciated.
EDIT: What's bothering me the most is the equality $\mathbb{P}(X=x,S=s) = \mathbb{P}(X=x,Y=y)$.
The steps you question are quite correct. You are temporarily substituting $y = s-x$, which is perfectly fine, and the independence of $X$ and $Y$ indeed lets you use the product rule.
However, please don't forget to indicate the support of the joint probability mass function:$$(X,S)\in\{(x,s): x\in\{0,1\},\; s\in\{x, x+1,\dots,n+x\}\}$$
An alternative approach from first principles: $S$ is the count of successes among $n+1$ independent and identically distributed Bernoulli trials (with success rate $p$), and $X$ is the indicator that the last trial is a success. So $\{X=x,S=s\}$ is the event of $s-x$ successes among the first $n$ trials and $x$ successes on the last, hence $n+1-s$ failures among the $n+1$ trials. Noting that $x\in\{0,1\}$ and $s-x\in\{0,\dots,n\}$, we have:$$\mathsf P(X=x,S=s)=\binom{n}{s-x}p^s(1-p)^{n+1-s}\mathbf 1_{x\in\{0,1\},\,s-x\in\{0,\dots,n\}}$$
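As a further consistency check on this closed form: summing it over $x\in\{0,1\}$ should recover the $\mathrm{Binomial}(n+1,p)$ pmf of $S$, via Pascal's rule $\binom{n}{s}+\binom{n}{s-1}=\binom{n+1}{s}$. A short Python verification (the values of $n$ and $p$ are arbitrary):

```python
from math import comb

def joint_pmf(x, s, n, p):
    # P(X=x, S=s) = C(n, s-x) p^s (1-p)^(n+1-s) on its support, else 0.
    if x not in (0, 1) or not (0 <= s - x <= n):
        return 0.0
    return comb(n, s - x) * p**s * (1 - p)**(n + 1 - s)

n, p = 10, 0.4
for s in range(n + 2):
    marginal = joint_pmf(0, s, n, p) + joint_pmf(1, s, n, p)
    binom = comb(n + 1, s) * p**s * (1 - p)**(n + 1 - s)
    # Pascal's rule C(n,s) + C(n,s-1) = C(n+1,s) makes these equal.
    assert abs(marginal - binom) < 1e-12
print("marginal of S matches Binomial(n+1, p)")
```

This also confirms the claim in the question that $S \sim \mathrm{Binomial}(n+1,p)$.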