Let $N, X_1, X_2,\dots$ be independent random variables where $N\sim \mathrm{Poisson}(\lambda)$ and $X_i\sim \mathrm{Bernoulli}(p)$ for $i = 1, 2,\dots$, with $\lambda>0$ and $0<p<1$. The compound Poisson variable $X$ is given by \begin{equation} X=\sum_{j=1}^{N}X_j. \end{equation}
Assignment: Find the distribution for $X$.
Attempted Solution: My reasoning is as follows.
Each of the variables $X_j$ may assume the values $x=1$ or $x=0$ with respective probabilities $p$ and $1-p$. This way the variable $X$ counts the number of "successful" attempts among the $N$ trials.
I think the probability $P(X=k)=P\left(\sum_{j=1}^{N}X_j=k\right)$ should be the probability $P\left(\sum_{j=1}^{n}X_j=k\right)$ multiplied by $P(N=n)$. That is, it is the probability of getting $k$ successes in $n$ attempts, multiplied by the probability of having those $n$ trials.
So this leads me to \begin{align} P(X=k) &= P\left(\{N=n\}\cap \left\{\sum_{j=1}^{n}X_j=k\right\}\right)\\ &= P(N=n)\cdot P\left(\sum_{j=1}^{n}X_j = k\right) \quad \text{because they are independent.} \end{align}
Then I use the fact that a sum of $n$ independent Bernoulli($p$) trials follows a Binomial$(n,p)$ distribution. Thus \begin{align} P(X=k) &= \frac{e^{-\lambda}\lambda^n}{n!} \cdot \binom{n}{k}p^k(1-p)^{n-k}\\ &= \frac{e^{-\lambda}\lambda^n}{n!} \cdot \frac{n!}{k!(n-k)!} p^k(1-p)^{n-k}\\ &= \frac{e^{-\lambda}p^k\lambda^n}{k!} \cdot \frac{(1-p)^{n-k}}{(n-k)!} \end{align} where I used the fact that $N\sim \mathrm{Poisson}(\lambda)$ to find $P(N=n)$.
Thus concluding $$P_X(k) = \begin{cases} \frac{e^{-\lambda}p^k\lambda^n}{k!}\cdot\frac{(1-p)^{n-k}}{(n-k)!} & \text{for }k=0,1,2,\dots\\ 0 & \text{otherwise} \end{cases}$$ I am, however, convinced I've gotten something wrong, namely that $n$ still shows up in the expression, though I can't quite tell what I'm missing.
You were on the right track! But you have to consider all cases $n\geq k$. Let $A_n=\{N=n\}$ for all $n\geq 0$. Observe that the $A_n$'s are disjoint and their union is $\Omega$. Also, observe that $P(\{X=k\}\cap A_n)=0$ when $n<k$. Now fix $k\geq 0$ and write
\begin{align} P(X=k)&=P\bigl(\bigcup_{n=0}^{\infty}\{X=k\}\cap A_n\bigr)=\sum_{n=0}^{\infty}P(\{X=k\}\cap A_n)\\ &=\sum_{n=k}^{\infty}P(X=k\ |\ N=n)\cdot P(N=n) \end{align}
Now $P(X=k\ |\ N=n)={n\choose k}p^k(1-p)^{n-k}$ and $P(N=n)=e^{-\lambda}\frac{\lambda^n}{n!}$. Putting all together, for $k\geq 0$
$$P(X=k)=\sum_{n=k}^{\infty}{n\choose k}p^k(1-p)^{n-k}e^{-\lambda}\frac{\lambda^n}{n!}$$
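This series actually has a closed form. Substituting $m=n-k$ and recognizing the exponential series,
\begin{align} P(X=k) &= \frac{e^{-\lambda}(\lambda p)^k}{k!}\sum_{n=k}^{\infty}\frac{\bigl(\lambda(1-p)\bigr)^{n-k}}{(n-k)!}\\ &= \frac{e^{-\lambda}(\lambda p)^k}{k!}\, e^{\lambda(1-p)} = e^{-\lambda p}\,\frac{(\lambda p)^k}{k!}, \end{align}
so $X\sim \mathrm{Poisson}(\lambda p)$. This is the well-known thinning property of the Poisson distribution.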
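As a quick numerical sanity check (a small Python sketch with ad hoc function names, not part of the derivation itself), one can truncate the series above and compare it against the $\mathrm{Poisson}(\lambda p)$ pmf, which the series equals by the thinning property of the Poisson distribution:

```python
import math

def compound_pmf(k, lam, p, terms=100):
    """Truncation of the series
    P(X=k) = sum_{n>=k} C(n,k) p^k (1-p)^(n-k) e^{-lam} lam^n / n!.
    (terms=100 keeps n small enough that the float conversions stay in range.)"""
    return sum(
        math.comb(n, k) * p**k * (1 - p)**(n - k)
        * math.exp(-lam) * lam**n / math.factorial(n)
        for n in range(k, k + terms)
    )

def poisson_pmf(k, mu):
    """Poisson pmf: e^{-mu} mu^k / k!."""
    return math.exp(-mu) * mu**k / math.factorial(k)

lam, p = 3.0, 0.4
for k in range(6):
    # Thinning property: the series should match a Poisson(lam * p) pmf.
    assert abs(compound_pmf(k, lam, p) - poisson_pmf(k, lam * p)) < 1e-12
```

The truncation at 100 terms is far beyond where the summands become negligible, so the agreement is limited only by floating-point rounding.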