Let $X$ be a random variable following $N(\theta,1)$.
$\hat{\theta}$ is a maximum likelihood estimator of $\theta$, and $b(\theta)=E[\hat{\theta}]-\theta$.
I need to prove $b(\theta) \geq 0$ and express a function $f(\theta)$ in terms of $\theta$ and $b(\theta)$, where $$f(\theta)=E[(X-\theta)^2]-E[(\hat{\theta}-\theta)^2]-E[(X-\hat{\theta})^2]$$
We also have
$$\phi (x) = \frac{1}{\sqrt{2\pi}} \exp\{-\frac{x^2}{2}\}, \Phi(x)=\displaystyle \int_{-\infty}^{x} \phi (y) dy$$
I got $\hat{\theta}=\frac{1}{n}\displaystyle \sum_{i=1}^n x_i$, but the problem gives neither $n$ nor the observations $X_i$ (or their realizations $x_i$).
If I am correct, $b(\theta)$ will be $0$, and I don't think that is the conclusion they want.
For $f(\theta)$,
I expanded the expectations as $$E(X^2)-2\theta E(X)+\theta^2-E(\hat{\theta}^2)+2\theta E(\hat{\theta})-\theta^2-E(X^2)+2E(X\hat{\theta})-E(\hat{\theta}^2),$$
but I could not figure out how to deal with some of the terms, especially $E(X\hat{\theta})$ and $E(\hat{\theta}^2)$.
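To get a feel for these terms, one thing I tried was estimating them by simulation (a rough sketch: the values of $n$ and $\theta$ are arbitrary choices of mine, and I am guessing that $X$ means a single observation $X_1$, since the problem doesn't say):

```python
import numpy as np

rng = np.random.default_rng(1)
theta, n, trials = 2.0, 4, 500_000  # arbitrary choices for the experiment

# Each row is one sample X_1, ..., X_n from N(theta, 1).
X = rng.normal(loc=theta, scale=1.0, size=(trials, n))
theta_hat = X.mean(axis=1)  # the MLE (sample mean) for each sample
x1 = X[:, 0]                # treating "X" as the single observation X_1

# Monte Carlo estimates of the troublesome terms.
e_x_theta_hat = (x1 * theta_hat).mean()   # ~ theta^2 + 1/n, since Cov(X_1, theta_hat) = 1/n
e_theta_hat_sq = (theta_hat ** 2).mean()  # ~ theta^2 + 1/n, since Var(theta_hat) = 1/n

print(e_x_theta_hat, e_theta_hat_sq)  # both close to 4.25 here
```

Both estimates land near $\theta^2 + 1/n$, which made me suspect that $E(X\hat{\theta})$ and $E(\hat{\theta}^2)$ have simple closed forms, but I don't see how to justify this in general.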
In short, your answer is correct: the MLE for $\theta$ is unbiased, so $b(\theta)=0$, which indeed satisfies $b(\theta)\geq 0$.
Remark 1. The fact that the question does not explicitly state $n$ (and, correspondingly, the observations $X_1, \ldots, X_n$) can be interpreted in one of two ways: (i) $n$ is arbitrary, or (ii) $n=1$.
Remark 2. The MLE requires at least one sample from the distribution: it is, by definition, the estimator which maximizes the likelihood. It doesn't make sense to talk about the likelihood without any samples.
Proof. Given $X_{1},\ldots,X_{n}\sim N(\theta,1)$, the likelihood is $$ \mathcal{L}(\theta)=\prod_{i}\phi(X_{i}-\theta)\propto\exp\left(-\frac{1}{2}\sum_{i}\left(X_{i}-\theta\right)^{2}\right). $$ Taking logarithms, $$ \log\mathcal{L}(\theta)=-\frac{1}{2}\sum_{i}\left(X_{i}-\theta\right)^{2}+\text{const.}=-\frac{1}{2}\left(n\theta^{2}-2\theta\sum_{i}X_{i}\right)+\text{const.} $$ Differentiating with respect to $\theta$, $$ \frac{d}{d\theta}\left[\log\mathcal{L}(\theta)\right]=-n\theta+\sum_{i}X_{i}. $$ Setting this to zero and solving for $\theta$ yields the maximum likelihood estimator $$ \hat{\theta}=\frac{1}{n}\sum_{i}X_{i}. $$ This estimator is unbiased, since $\mathbb{E}\hat{\theta}=\frac{1}{n}\sum_{i}\mathbb{E}X_{i}=\theta$, and hence $$ b(\theta)=\mathbb{E}\hat{\theta}-\theta=0. $$
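As a quick numerical sanity check of the unbiasedness claim, here is a small Monte Carlo sketch (the particular $\theta$, $n$, and number of trials are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, trials = 1.5, 5, 200_000  # arbitrary choices for the check

# Draw `trials` independent samples of size n from N(theta, 1)
# and form the MLE (the sample mean) for each one.
X = rng.normal(loc=theta, scale=1.0, size=(trials, n))
theta_hat = X.mean(axis=1)

# Empirical bias b(theta) = E[theta_hat] - theta; should be near 0.
bias = theta_hat.mean() - theta
print(f"empirical bias: {bias:+.4f}")
```

The empirical bias hovers around zero (within Monte Carlo error of order $1/\sqrt{n\cdot\text{trials}}$), consistent with $b(\theta)=0$.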