Let $U_1, \dots, U_n$ be an i.i.d. sample from an exponential distribution with parameter $\lambda$. Let $$Y_i = \min(U_i, c),$$ where $c > 0$ is a known constant. Find the likelihood for $\lambda$.
To do this I tried to find the distribution of $Y_i$, and I did it as follows: $$P(Y_i \ge y )= \begin{cases} 1,& {\rm{if }}\; y \leq c & \\ 1-e ^{-\lambda y }, &{\rm{if }}\; y > c \end{cases} $$ Then I tried to compute the density of $Y_i$, but I'm not sure it is correct; I found that it is $$\frac{1}{c}\mathbf{1}_{\{y \leq c\} } (y) + \lambda e^{-\lambda y } \mathbf{1}_{\{y > c \}}(y). $$ Is this correct? And how can I continue from here?
It might help to be slightly more careful. Write the minimum as $$\min(U_i,c) = c\,\mathbb{1}_{\{U_i\geq c\}} + U_i\, \mathbb{1}_{\{U_i < c\}}. $$ Thus $Y_i = c$ with probability $P(U_i \geq c) = e^{-\lambda c}$, and $Y_i = U_i < c$ with probability $1-e^{-\lambda c}$. Now we can ask what $F(y)= P(Y_i\leq y)$ is. If $y\geq c$ then clearly $F(y)=1$, while for $0 < y < c$ we can compute by partitioning on $U_i$: $$F(y) = P(Y_i \leq y,\, U_i < c) + P(Y_i \leq y,\, U_i \geq c) = P(U_i \leq y) + 0 = 1-e^{-\lambda y},$$ where the second term vanishes because $U_i \geq c$ forces $Y_i = c > y$. So the distribution of $Y_i$ is $$F(y) = \begin{cases} 1, &y\geq c\\ 1-e^{-\lambda y}, &0<y< c\\ 0, & y \leq 0, \end{cases}$$ and since the distribution jumps at $y=c$ there is no density with respect to Lebesgue measure (there is one in the measure-theoretic sense, with respect to a dominating measure that puts an atom at $c$, but we won't go there). The likelihood contribution of a single observation can still be taken to be $$L_i(\lambda\mid y) = \begin{cases} \lambda e^{-\lambda y}, & 0< y < c\\ e^{-\lambda c}, & y=c\\ 0, &y >c. \end{cases} $$ Now use independence and take the product: $$ L(\lambda\mid y_1,\ldots,y_n) = \prod_{i=1}^n L_i(\lambda\mid y_i).$$ Writing $d = \#\{i : y_i < c\}$ for the number of uncensored observations, this simplifies to $$L(\lambda\mid y_1,\ldots,y_n) = \lambda^{d}\, e^{-\lambda \sum_{i=1}^n y_i},$$ since each uncensored observation contributes $\lambda e^{-\lambda y_i}$ and each censored one contributes $e^{-\lambda c}$ (and $y_i = c$ for those).
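As a numerical sanity check (not part of the derivation), here is a short Python sketch. The function names `censored_loglik`, the true rate `lam_true = 2.0`, and the cutoff `c = 1.0` are all illustrative choices; the point is that the closed-form maximizer of the likelihood above, $\hat\lambda = d / \sum_i y_i$ (set the derivative of $d\log\lambda - \lambda\sum_i y_i$ to zero), recovers the true rate on simulated data.

```python
import math
import random

def censored_loglik(lam, ys, c):
    """Log-likelihood of the censored sample: an uncensored y < c
    contributes log(lam) - lam*y, a censored y == c contributes -lam*c."""
    ll = 0.0
    for y in ys:
        if y < c:
            ll += math.log(lam) - lam * y
        else:  # observation was censored at c
            ll += -lam * c
    return ll

random.seed(0)
lam_true, c, n = 2.0, 1.0, 100_000
# simulate Y_i = min(U_i, c) with U_i ~ Exp(lam_true)
ys = [min(random.expovariate(lam_true), c) for _ in range(n)]

# closed-form MLE: lambda_hat = d / sum(y_i), d = number of uncensored points
d = sum(1 for y in ys if y < c)
lam_hat = d / sum(ys)
print(lam_hat)  # close to the true value 2.0
```

Checking `censored_loglik` at `lam_hat` against nearby values of $\lambda$ confirms it sits at the maximum.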