Suppose you go to a bus stop. The inter-arrival time between successive buses is ${\rm Exp}(\lambda)$. You arrive at time zero and leave at time $t$. Let $Y$ denote the number of buses you saw. Find the distribution of $Y$.
I tried to find the probability $P(Y=n)$. Let $t_k$ be the inter-arrival time between the $(k-1)$th and $k$th buses. I figured that $Y$ equals $n$ when $\sum_{k=1}^{n-1}t_k \lt t$ and $\sum_{k=1}^{n}t_k \ge t$. Thus,
$$ P(Y=n) = P \left(\sum_{k=1}^{n-1}t_k \lt t, \sum_{k=1}^{n}t_k \ge t \right) $$
Let $\tau_n = \sum_{k=1}^{n}t_k$. Here I am making an assumption I am not sure of: that $\{\tau_{n-1} \lt t\}$ and $\{\tau_{n} \ge t\}$ are independent events. Using this, I concluded that: $$ P(Y=n) = P (\tau_{n-1} \lt t)\cdot P(\tau_{n} \ge t ) $$
Using properties of moment generating functions (MGFs), one can show that ${\rm MGF}(\tau_n) = \dfrac{1}{(1-\frac{z}{\lambda})^n}$ for $z \lt \lambda$, so $\tau_n \sim {\rm Gamma}(n,\lambda)$.
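As a quick empirical sanity check on the claim $\tau_n \sim {\rm Gamma}(n,\lambda)$, one can compare the simulated mean and variance of a sum of exponentials against the Gamma moments $n/\lambda$ and $n/\lambda^2$. A minimal sketch (the parameters $\lambda=2$, $n=5$ are arbitrary choices for illustration):

```python
import random

random.seed(3)
lam, n, trials = 2.0, 5, 200_000

# tau_n = sum of n iid Exp(lam) inter-arrival times
samples = [sum(random.expovariate(lam) for _ in range(n)) for _ in range(trials)]

mean = sum(samples) / trials
var = sum((s - mean) ** 2 for s in samples) / trials

print(mean)  # should be close to the Gamma(n, lam) mean n/lam = 2.5
print(var)   # should be close to the Gamma(n, lam) variance n/lam^2 = 1.25
```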
So I concluded that: $$ P(Y=n) = \left( 1 - \frac{1}{\Gamma(n)}\gamma(n,t\lambda) \right)\cdot \left(\frac{1}{\Gamma(n-1)}\gamma(n-1,t\lambda) \right)$$
Is this solution correct? I am worried about my concepts, as well as my assumption of independence as said above.
If $\{\tau_{n-1}<t\}$ and $\{\tau_n\geq t\}$ were indeed independent events, then $\{\tau_{n-1}<t\}^{\complement}=\{\tau_{n-1}\geq t\}$ and $\{\tau_n\geq t\}$ would also have to be independent events, which is evidently not the case because $\tau_{n-1}\geq t\implies\tau_n\geq t$.
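The failure of independence is easy to see numerically: the joint probability $P(\tau_{n-1}<t,\ \tau_n\geq t)$ differs substantially from the product $P(\tau_{n-1}<t)\cdot P(\tau_n\geq t)$. A Monte Carlo sketch (the parameters $\lambda=1$, $t=2$, $n=3$ are arbitrary illustrative choices):

```python
import random

random.seed(1)
lam, t, n, trials = 1.0, 2.0, 3, 200_000
joint = a_count = b_count = 0

for _ in range(trials):
    arr = [random.expovariate(lam) for _ in range(n)]
    tau_prev = sum(arr[:n - 1])   # tau_{n-1}
    tau_n = tau_prev + arr[-1]    # tau_n
    a = tau_prev < t              # event {tau_{n-1} < t}
    b = tau_n >= t                # event {tau_n >= t}
    joint += a and b
    a_count += a
    b_count += b

p_joint = joint / trials
p_product = (a_count / trials) * (b_count / trials)
print(p_joint, p_product)  # clearly different if the events are dependent
```

For these parameters the joint probability is $2e^{-2}\approx 0.27$ (exactly two arrivals in $[0,2]$), while the product is around $0.40$, so the independence assumption indeed breaks.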
You are dealing in this situation with a homogeneous Poisson point process having rate $\lambda$, so that: $$P(Y=n)=e^{-\lambda t}\frac{(\lambda t)^n}{n!}\tag1$$
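One can verify $(1)$ empirically by simulating the arrival process directly: keep adding ${\rm Exp}(\lambda)$ inter-arrival times until the clock passes $t$, and count the buses seen. A sketch (the parameters $\lambda=2$, $t=3$ are arbitrary illustrative choices):

```python
import math
import random

def simulate_bus_count(lam: float, t: float) -> int:
    """Count arrivals in [0, t] when inter-arrival times are Exp(lam)."""
    count, clock = 0, 0.0
    while True:
        clock += random.expovariate(lam)  # time of next bus
        if clock > t:
            return count
        count += 1

random.seed(0)
lam, t, trials = 2.0, 3.0, 200_000
counts = [simulate_bus_count(lam, t) for _ in range(trials)]

for n in range(5):
    empirical = counts.count(n) / trials
    poisson = math.exp(-lam * t) * (lam * t) ** n / math.factorial(n)
    print(n, round(empirical, 4), round(poisson, 4))
```

The empirical frequencies should match the Poisson$(\lambda t)$ pmf up to Monte Carlo noise.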
Note that: $$P(Y\leq n)=P(t_1+\cdots+t_n+t_{n+1}>t)$$where the $t_i$ are iid $\lambda$-exponential distributed random variables.
This can be exploited to find what is stated in $(1)$.
E.g. $P(Y=0)=P(Y\leq0)=P(t_1>t)=e^{-\lambda t}$.
Edit
I will handle here the special case $\lambda=1$ for convenience.
There is no essential difference with the general case.
With induction we will prove that in the special case $\lambda=1$ we have:$$P\left(t_{1}+\cdots+t_{n}>t\right)=e^{-t}\sum_{k=0}^{n-1}\frac{t^{k}}{k!}\tag1$$
The base case $n=1$ is evident: $P(t_1>t)=e^{-t}$ for an exponential random variable $t_1$ with parameter $\lambda=1$.
Now $(1)$ is our induction hypothesis and we calculate:$$\begin{aligned}P\left(t_{1}+\cdots+t_{n}+t_{n+1}>t\right) & =\int_{0}^{\infty}P\left(t_{1}+\cdots+t_{n}+t_{n+1}>t\mid t_{n+1}=x\right)e^{-x}dx\\ & =\int_{0}^{t}P\left(t_{1}+\cdots+t_{n}+x>t\mid t_{n+1}=x\right)e^{-x}dx+\int_{t}^{\infty}P\left(t_{1}+\cdots+t_{n}+x>t\mid t_{n+1}=x\right)e^{-x}dx\\ & =\int_{0}^{t}P\left(t_{1}+\cdots+t_{n}>t-x\right)e^{-x}dx+\int_{t}^{\infty}e^{-x}dx\\ & =\sum_{k=0}^{n-1}\int_{0}^{t}e^{-\left(t-x\right)}\frac{\left(t-x\right)^{k}}{k!}e^{-x}dx+\int_{t}^{\infty}e^{-x}dx\\ & =e^{-t}\sum_{k=0}^{n-1}\int_{0}^{t}\frac{\left(t-x\right)^{k}}{k!}dx+\int_{t}^{\infty}e^{-x}dx\\ & =e^{-t}\sum_{k=0}^{n-1}\left[-\frac{\left(t-x\right)^{k+1}}{\left(k+1\right)!}\right]_{0}^{t}+\left[-e^{-x}\right]_{t}^{\infty}\\ & =e^{-t}\sum_{k=1}^{n}\frac{t^{k}}{k!}+e^{-t}\\ & =e^{-t}\sum_{k=0}^{n}\frac{t^{k}}{k!} \end{aligned} $$
The third equality is based on independence of the $t_i$.
The fourth equality rests on the induction hypothesis $(1)$.
This leads to: $$P\left(Y\leq n\right)=P\left(t_{1}+\cdots+t_{n}+t_{n+1}>t\right)=e^{-t}\sum_{k=0}^{n}\frac{t^{k}}{k!}$$ and $$P\left(Y=n\right)=P\left(Y\leq n\right)-P\left(Y\leq n-1\right)=e^{-t}\frac{t^{n}}{n!}$$ in the special case $\lambda=1$.
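The closed form for the tail of the sum can itself be checked by simulation in the special case $\lambda=1$. A sketch (the parameters $n=4$, $t=3.5$ are arbitrary illustrative choices):

```python
import math
import random

def tail_formula(n: int, t: float) -> float:
    """e^{-t} * sum_{k=0}^{n} t^k/k!, i.e. P(t_1 + ... + t_{n+1} > t) for lambda = 1."""
    return math.exp(-t) * sum(t ** k / math.factorial(k) for k in range(n + 1))

random.seed(2)
n, t, trials = 4, 3.5, 200_000

# Monte Carlo estimate of P(t_1 + ... + t_{n+1} > t)
hits = sum(sum(random.expovariate(1.0) for _ in range(n + 1)) > t
           for _ in range(trials))

print(hits / trials, tail_formula(n, t))  # the two values should agree closely
```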
In the general case we get $$P\left(Y=n\right)=e^{-\lambda t}\frac{\left(\lambda t\right)^{n}}{n!}$$