Suppose $X_1, X_2, \ldots, X_n$ are i.i.d. normal random variables with mean $\mu$ and variance $1$. However, we observe only the $Y_i$'s, where $Y_i = \max(0, X_i)$. I would like to know how to write the likelihood function for $\mu$ given the $Y_i$'s. Since $Y_i$ has both discrete and continuous components, I tried to write it in the following way:
$$L(\mu \mid Y_1,Y_2,\ldots, Y_n) = \frac{1}{(2\pi)^\frac{m}{2}}\exp\left(-\frac{1}{2}\sum_{i=1}^m (Y_i-\mu)^2 \right) \left(\Phi(-\mu)\right)^{n-m}$$
where $Y_1, Y_2, \ldots, Y_m > 0$, $Y_{m+1}=Y_{m+2}=\cdots=Y_n=0$, and $\Phi$ is the CDF of the standard normal.
I am not sure if this is the correct way of writing it. I would be grateful if anyone could direct me to references on how to write likelihood functions when the distribution of the data has both discrete and continuous components.
Let's assign a measure $\nu$ to the Borel subsets of the half-open interval $[0,\infty)$ by specifying that the measure of every open interval is its length and $\nu(\{0\})=1$; the measures of all other Borel sets are then accordingly determined. (Calling the measure $\nu$ avoids a clash with $m$, the number of positive observations.) Let $f$ be a probability density with respect to the measure $\nu$, so that
\begin{align}
& \int_{[0,\infty)} f(x)\,d\nu(x) = \int_{(0,\infty)} f(x)\,d\nu(x) + \int_{\{0\}} f(x)\,d\nu(x) \\[10pt]
= {} & \int_{(0,\infty)} f(x)\,d\nu(x) + f(0)\,\nu(\{0\}) = \int_{(0,\infty)} f(x)\,d\nu(x) + f(0).
\end{align}
For a random variable $X$ having this distribution, we have
$$ \Pr(X=0) = \int_{\{0\}} f(x)\,d\nu(x) = f(0)\,\nu(\{0\}) = f(0). $$
Now let
$$ f_\mu(x) = \begin{cases} \displaystyle \varphi(x-\mu) & \text{if } x\ne 0, \\[10pt] \Phi(-\mu) & \text{if } x=0, \end{cases} $$
where $\Phi$ and $\varphi=\Phi'$ are the standard normal c.d.f. and p.d.f. respectively. This is the density of the observations you describe, and you can define the likelihood function $\mu\mapsto L(\mu)$ accordingly; it agrees with the expression in the question.

Let us see what would happen if we had used a different measure with respect to which the desired probability distribution has a density. What if we had set $\nu(\{0\})=1/2$ and left the rest of the definition as above? Then we would have
$$ L(\mu) = \frac{1}{(2\pi)^{m/2}}\exp\left(-\frac{1}{2}\sum_{i=1}^m (Y_i-\mu)^2 \right) \left(2\Phi(-\mu)\right)^{n-m}, $$
with an additional factor of $2^{n-m}$. But this is still proportional to
$$ \mu \mapsto \exp\left(-\frac{1}{2}\sum_{i=1}^m (Y_i-\mu)^2 \right) \left(\Phi(-\mu)\right)^{n-m}. $$
With likelihood functions, only the proportionality class matters, so the seeming arbitrariness in the choice of the dominating measure does no harm.
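As a concrete numerical check (my own sketch, not part of the original answer; the function and variable names are mine), one can maximize this log-likelihood with NumPy/SciPy and verify that rescaling the point mass at $0$ (i.e. picking a different dominating measure) only shifts the log-likelihood by a constant and leaves the maximizer unchanged:

```python
# Sketch: numerical MLE for mu under the censored-normal likelihood,
# and a check that rescaling the point mass at 0 does not move the MLE.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

def log_likelihood(mu, y, mass_scale=1.0):
    """Log-likelihood of mu given y_i = max(0, x_i), x_i ~ N(mu, 1).

    mass_scale multiplies the point mass at 0; mass_scale=2.0 mimics
    choosing the measure of {0} to be 1/2, which only adds the constant
    (n - m) * log(2) to the log-likelihood.
    """
    pos = y[y > 0]                # uncensored observations: phi(y_i - mu)
    n_zero = int((y == 0).sum())  # censored observations: Phi(-mu) each
    return (norm.logpdf(pos, loc=mu).sum()
            + n_zero * (norm.logcdf(-mu) + np.log(mass_scale)))

# Simulate censored data with true mu = 0.5, then maximize over mu.
rng = np.random.default_rng(0)
y = np.maximum(0.0, rng.normal(loc=0.5, size=20_000))

fit1 = minimize_scalar(lambda mu: -log_likelihood(mu, y),
                       bounds=(-3, 3), method="bounded")
fit2 = minimize_scalar(lambda mu: -log_likelihood(mu, y, mass_scale=2.0),
                       bounds=(-3, 3), method="bounded")
```

Both fits recover a value near the true $\mu = 0.5$, and they agree with each other because the two objectives differ by a constant.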