I have a problem involving the combination of a continuous random variable and a discrete random variable, where the continuous variable also serves as a parameter to the distribution of the discrete variable.
Let's denote the continuous variable by $T$, with a given probability density function $f:\mathbb{R}\to[0,\infty)$ (a density, unlike a probability, may exceed $1$). Furthermore, let's denote the family of discrete random variables by $X_t$, with a given family of probability mass functions $g_t:\mathbb{Z}\to[0,1]$, indexed by the parameter $t\in\mathbb{R}$.
I now have the following event $A$: \begin{equation} A=\bigcup_{t\in[t_1,t_2]}\left(\{T=t\}\cap \{X_t=k\}\right), \end{equation} where $t_1,t_2\in\mathbb{R}$ and $k\in\mathbb{Z}$ are given and $t_2>t_1$. Also, the events $\{T=t\}$ and $\{X_t=k\}$ are given to be independent for all $t\in[t_1,t_2]$.
What I want to know is how to calculate the probability of $A$. My intuition tells me that \begin{equation} P(A)=\int_{t_1}^{t_2}f(t)g_t(k)\,dt. \end{equation}
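As a numerical sanity check on this formula, I tried a small simulation with concrete distributions chosen purely for illustration (they are not part of the actual problem): $T\sim\text{Exponential}(1)$, so $f(t)=e^{-t}$ for $t\ge 0$, and $X_t\sim\text{Poisson}(t)$, so $g_t(k)=e^{-t}t^k/k!$.

```python
import math
import numpy as np

# Hypothetical concrete choices (for illustration only, not part of the question):
#   T ~ Exponential(1):  f(t) = e^{-t} for t >= 0
#   X_t ~ Poisson(t):    g_t(k) = e^{-t} t^k / k!
t1, t2, k = 0.5, 2.0, 2

# Monte Carlo estimate of P(A) = P(t1 <= T <= t2 and X_T = k)
rng = np.random.default_rng(0)
n = 1_000_000
T = rng.exponential(1.0, size=n)   # draw T from the density f
X = rng.poisson(T)                 # draw X_t, using the realized t as the Poisson mean
p_mc = np.mean((T >= t1) & (T <= t2) & (X == k))

# Numerical value of the conjectured integral  ∫_{t1}^{t2} f(t) g_t(k) dt
ts = np.linspace(t1, t2, 100_001)
integrand = np.exp(-ts) * np.exp(-ts) * ts**k / math.factorial(k)
dt = ts[1] - ts[0]
p_int = float(np.sum((integrand[:-1] + integrand[1:]) * dt / 2))  # trapezoid rule

print(p_mc, p_int)  # the two estimates agree to about three decimal places (≈ 0.085)
```

The agreement between the Monte Carlo estimate and the integral supports the conjectured formula, at least for this example, but of course it is not a proof.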
To get there, I did the following:
- I interpreted $P(T=t)$ as the infinitesimal probability $dP(T\le t)=\frac{dP(T\le t)}{dt}\,dt=f(t)\,dt$.
- I used the fact that $\{T=t\}$ and $\{X_t=k\}$ are independent, so $P(\{T=t\}\cap \{X_t=k\})=P(T=t)P(X_t=k)$.
- I used the fact that all the events in the union in the definition of $A$ are mutually exclusive, so we may sum, or integrate, the probabilities.
The first of these steps is not rigorous at all. It is at best an intuitive idea (although I am far from certain whether it can even be considered correct).
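For what it's worth, here is how I imagine the computation would look if one conditions on $T$ instead of multiplying "infinitesimal" probabilities (I suspect this is closer to the standard formalism, but I am not sure I am using it correctly): \begin{equation} P(A)=P(T\in[t_1,t_2],\,X_T=k)=\int_{t_1}^{t_2}P(X_t=k\mid T=t)\,f(t)\,dt=\int_{t_1}^{t_2}g_t(k)\,f(t)\,dt, \end{equation} where the middle equality would come from the law of total probability, and the last one from the given independence, so that conditioning on $T=t$ does not change the distribution of $X_t$.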
My question, essentially, is: how do I (somewhat) rigorously derive an explicit formula for $P(A)$? I should note that I have only very little experience with formal probability theory, so please excuse any incorrect use of terms or concepts.
Thank you!