We are given random variables $X_k=I_k \cdot B_k$, where $I_k$ is Bernoulli with parameter $p$ and $B_k$ is geometric with parameter $q$, independent of each other. Hence, for $i \geq 1$, $$P(X_k=i)= p \cdot (1-q)^{i-1} \cdot q,$$ and $P(X_k=0)=1-p$.
Consider $$Y_k =c+k-\sum_{j=1}^k X_j$$ for $k \in \mathbb{N}$ and $c\geq 1$, and the stopping time $$\tau=\inf\{ n\geq 1: Y_n \leq 0\}.$$
I want to prove: $$P(\tau=k, -Y_\tau=j)=P(\tau=k) \cdot (1-q)^j q$$
My approach: rewrite
$$P( -Y_{\tau}=j \mid \tau=k) \cdot P(\tau=k) = P( -Y_k=j) \cdot P(\tau=k).$$
Consider the first factor. On the event $\{\tau=k\}$, the definition of $\tau$ tells us that $$Y_{k-1}>0 \quad\text{and}\quad Y_k\leq 0,$$ and since $Y_k=Y_{k-1}+1-X_k$, this yields $$X_k > Y_{k-1}>0$$
This implies:
\begin{align*}P(-Y_{k-1} + X_k-1=j \mid X_k > Y_{k-1}>0) &= P(X_k =1+j+Y_{k-1}\mid X_k > Y_{k-1}>0) \\&=P(X_k=j+1 \mid X_k>0) = (1-q)^j q\end{align*}
Can I argue like that?
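As a numerical sanity check of the memorylessness step above (the parameters $p$, $q$ below are arbitrary choices, not from the problem):

```python
# Check that P(X = m+1+j | X > m) = q*(1-q)**j for the mixed variable
# with P(X=0) = 1-p and P(X=i) = p*q*(1-q)**(i-1) for i >= 1.
p, q = 0.3, 0.4  # arbitrary test parameters

def pmf(i):
    return 1 - p if i == 0 else p * q * (1 - q) ** (i - 1)

for m in (0, 1, 5):                       # m plays the role of Y_{k-1}
    tail = p * (1 - q) ** m               # P(X > m)
    for j in range(4):
        cond = pmf(m + 1 + j) / tail      # P(X = m+1+j | X > m)
        assert abs(cond - q * (1 - q) ** j) < 1e-12
```

So, conditionally on being strictly positive (and above any threshold $m$), the overshoot of $X_k$ beyond $m$ is geometric with parameter $q$, shifted to start at $1$.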
Edit: What I actually need is $$P( -Y_{k}=j \mid \tau=k) \overset{!}{=}P ( -Y_{k}=j ).$$ The events are not independent in general, but does conditioning on $\{\tau=k\}$ really give no further information about the overshoot?
Here is the proof: Although not directly stated, I am going to assume $j\in\mathbb{N}$. First of all, as you have correctly observed: $$\textbf{P}(-Y_\tau=j,\tau=k)=\textbf{P}(-Y_k=j,\tau=k).$$ Now we prove the claim by induction on $k\in\mathbb{N}$.
Let $k=1$. In this case, $$\textbf{P}(-Y_k=j,\tau=k)=\textbf{P}(Y_1=-j,\tau=1)=\textbf{P}(Y_1=-j),$$ since $Y_1=-j$ implies $\tau=1$. Since \begin{align*}\textbf{P}(\tau=1)&=\textbf{P}(Y_1\leq0)=\textbf{P}(c+1-X_1\leq0)=\textbf{P}(X_1\geq c+1)\\&=1-\textbf{P}(X_1\leq c)=1-\sum^c_{i=0}\textbf{P}(X_1=i)\\&=1-(1-p)-\sum^c_{i=1}\textbf{P}(X_1=i)\\&=p-pq(1-(1-q)^c)/(1-(1-q))=p(1-q)^c\end{align*} and \begin{align*}\textbf{P}(Y_1=-j)=\textbf{P}(c+1-X_1=-j)=\textbf{P}(X_1=c+1+j)=pq(1-q)^{c+j},\end{align*} the claim follows for $k=1$.
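The base case can be double-checked numerically (the values of $p$, $q$, $c$ below are arbitrary test choices):

```python
# Verify P(tau=1) = p(1-q)^c and P(Y_1=-j)/P(tau=1) = q(1-q)^j.
p, q, c = 0.3, 0.4, 3  # arbitrary test parameters

def pmf(i):
    # P(X_1 = i): mixed Bernoulli(p) / geometric(q) distribution
    return 1 - p if i == 0 else p * q * (1 - q) ** (i - 1)

# P(tau = 1) = P(X_1 >= c+1), truncating the tail sum far out
p_tau1 = sum(pmf(i) for i in range(c + 1, 600))
assert abs(p_tau1 - p * (1 - q) ** c) < 1e-12

for j in range(5):
    # P(Y_1 = -j) = P(X_1 = c+1+j); the ratio should be q(1-q)^j
    assert abs(pmf(c + 1 + j) / p_tau1 - q * (1 - q) ** j) < 1e-9
```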
Now we prove the induction step $k\Rightarrow k+1$. Since the structure of random walks is inherently recursive, conditioning on the first step works (let the index of the probability measure denote the starting value $c$ of the random walk):
\begin{align*}\textbf{P}_c(\tau=k+1)&=\sum^{c}_{i=0}\textbf{P}_c(\tau=k+1|X_1=i)\textbf{P}_c(X_1=i)=\sum^{c}_{i=0}\textbf{P}_{c-i+1}(\tau=k)\textbf{P}_c(X_1=i)\\&=\sum^{c}_{i=0}\textbf{P}_{c-i+1}(\tau=k,Y_k=-j)\textbf{P}_c(X_1=i)(1-q)^{-j}/q\\&=\sum^{c}_{i=0}\textbf{P}_{c}(\tau=k+1,Y_{k+1}=-j|X_1=i)\textbf{P}_c(X_1=i)(1-q)^{-j}/q\\&=\textbf{P}_c(\tau=k+1,Y_{k+1}=-j)(1-q)^{-j}/q.\end{align*}
The first equality follows from the law of total probability (terms with $i>c$ vanish, since $X_1>c$ forces $\tau=1\neq k+1$), the second uses the fact that $(Y_n)_{n\in\mathbb{N}}$ is a homogeneous Markov chain, the third uses the induction hypothesis (which holds for every starting value, here $c-i+1\geq1$), and the fourth repeats the first two steps in reverse. Multiplying both sides by $(1-q)^j q$ yields the claim for $k+1$.
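The full claim can also be checked by simulation. Below is a quick Monte Carlo sketch; the parameters are arbitrary, chosen with $p>q$ so that $\textbf{E}[X_k]=p/q>1$ and $\tau$ is almost surely finite:

```python
import random

random.seed(0)
p, q, c = 0.8, 0.4, 2  # arbitrary; p > q makes the walk drift downward

def sample_X():
    # X = I * B with I ~ Bernoulli(p), B ~ Geometric(q)
    if random.random() >= p:
        return 0
    i = 1
    while random.random() >= q:
        i += 1
    return i

N = 200_000
tau_counts, joint_counts = {}, {}
for _ in range(N):
    y, k = c, 0
    while y > 0:                      # run until Y_k <= 0, i.e. k = tau
        k += 1
        y += 1 - sample_X()
    tau_counts[k] = tau_counts.get(k, 0) + 1
    joint_counts[(k, -y)] = joint_counts.get((k, -y), 0) + 1

# Empirically, P(-Y_tau = j | tau = k) should be close to q(1-q)^j
for k in (1, 2, 3):
    for j in (0, 1, 2):
        est = joint_counts.get((k, j), 0) / tau_counts[k]
        assert abs(est - q * (1 - q) ** j) < 0.02
```

The conditional overshoot frequencies match $q(1-q)^j$ for each stopping epoch $k$, which is exactly the independence of $\tau$ and $-Y_\tau$ asserted in the claim.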