Let $X = (X_1, X_2, \dots, X_n)^T$ such that $X_i \overset{iid}{\sim} N\left(\theta,1\right)$.
$$Y_i=\begin{cases}1, & X_i>0\\ 0, & \text{otherwise} \end{cases}$$
Let $\psi=P(Y_1 = 1)$. Find the maximum likelihood estimator (MLE) $\hat\psi$ of $\psi$.
I'm having trouble understanding what $Y_i$ represents here. Is it related to the mean of $X$? If so, would I use $\frac{1}{n}\sum_{i=1}^nX_i$ to estimate the answer?
Since $Y_i=\mathbb{1}_{X_i>0}$ and $X_i-\theta=Z\sim N(0,1)$,
$$\psi=\mathbb{P}[X_1>0]=\mathbb{P}[Z>-\theta]=\mathbb{P}[Z<\theta]=\Phi(\theta)$$
$\hat{\psi}_{ML}$ can then be derived from $\hat{\theta}_{ML}=\bar{X}_n$ and the invariance property of the MLE:
$$\hat{\psi}_{ML}=\Phi(\hat{\theta}_{ML})=\Phi(\bar{X}_n)$$
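For completeness, here is a quick sketch of why $\hat{\theta}_{ML}$ is the sample mean. The log-likelihood of a $N(\theta,1)$ sample is

$$\ell(\theta)=-\frac{n}{2}\log(2\pi)-\frac{1}{2}\sum_{i=1}^n (X_i-\theta)^2,
\qquad
\frac{\partial \ell}{\partial \theta}=\sum_{i=1}^n (X_i-\theta)=0
\;\Longrightarrow\;
\hat{\theta}_{ML}=\frac{1}{n}\sum_{i=1}^n X_i=\bar{X}_n$$

and the second derivative $-n<0$ confirms this is a maximum.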
Example:
You have the following random sample
$$\{2.95,\ 1.19,\ 2.60,\ 1.10\}$$
The sample mean is $\bar{X}_4 = 1.96$, so your estimate is
$$\hat{\psi}=\Phi(1.96)\approx 0.975$$
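A minimal numeric check of the worked example, using only the standard library (the standard normal CDF is written via `math.erf`, since $\Phi(x) = \tfrac{1}{2}\left(1+\operatorname{erf}(x/\sqrt{2})\right)$):

```python
import math

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

sample = [2.95, 1.19, 2.60, 1.10]
theta_hat = sum(sample) / len(sample)  # MLE of theta: the sample mean
psi_hat = phi(theta_hat)               # MLE of psi by invariance

print(theta_hat)           # 1.96
print(round(psi_hat, 3))   # 0.975
```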