Let $X_{x,1}$ and $X_{x,2}$ be two unbiased estimators for a real parameter $X$. Let us define $X_{x}=\alpha X_{x,1}+\beta X_{x,2}$ with $\alpha,\beta\in \mathbb R$.
For which values of $\alpha$ and $\beta$ is the estimator $X_{x}$ unbiased?
For which value(s) of $\alpha$ and $\beta$ is the estimator $X_{x}$ found above of minimal variance? We assume that $X_{x,1}$ and $X_{x,2}$ are independent and that $\mathrm{Var}(X_{x,1})=\mathrm{Var}(X_{x,2})=\sigma^2$.
The notation here is pretty suboptimal (using $X$ both for the parameter and in the estimator names), but the idea is as follows: for $X_x$ to be unbiased for some (fixed) parameter $X$, we require that $$ X=E(X_x)=E(\alpha X_{x,1}+\beta X_{x,2})=\alpha E(X_{x,1})+\beta E(X_{x,2})=(\alpha+\beta)X. $$ The last equality uses the unbiasedness of $X_{x,1}$ and $X_{x,2}$. So the answer is: if $X=0$, then $X_x$ is unbiased for $X$ for all $\alpha$ and $\beta$; on the other hand, if $X\neq 0$, then $X_x$ is unbiased for $X$ if and only if $\alpha+\beta=1$.
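As a quick sanity check (an illustrative sketch, not part of the argument above), we can simulate two unbiased estimators of a parameter — here simply two independent $N(X,1)$ observations, with $X=3$ chosen arbitrarily — and verify that the combination $\alpha X_{x,1}+\beta X_{x,2}$ has the expected mean $(\alpha+\beta)X$:

```python
import random

random.seed(0)
X = 3.0            # true parameter (nonzero, so alpha + beta = 1 is required)
trials = 200_000

def combined_mean(alpha, beta):
    # Each trial draws two independent unbiased estimates of X
    # (single N(X, 1) observations) and averages the combination
    # alpha * X_{x,1} + beta * X_{x,2} over many trials.
    total = 0.0
    for _ in range(trials):
        e1 = random.gauss(X, 1.0)  # plays the role of X_{x,1}
        e2 = random.gauss(X, 1.0)  # plays the role of X_{x,2}
        total += alpha * e1 + beta * e2
    return total / trials

# alpha + beta = 1: sample mean should be close to X = 3 (unbiased)
print(combined_mean(0.3, 0.7))
# alpha + beta = 2: sample mean should be close to 2 * X = 6 (biased)
print(combined_mean(1.0, 1.0))
```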
For simplicity, I will assume away the probably unintended case of $X=0$. Then, by independence, $$ \operatorname{Var}(X_x)=\alpha^2\operatorname{Var}(X_{x,1})+\beta^2\operatorname{Var}(X_{x,2})=(\alpha^2+\beta^2)\sigma^2. $$ Given $\alpha+\beta=1$, can you minimize $$ \alpha^2+\beta^2=\frac{1}{2}(\alpha+\beta)^2+\frac{1}{2}(\alpha-\beta)^2? $$
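If you want to check the hint numerically (a small sketch, not a substitute for the one-line algebraic argument), substitute the constraint $\beta=1-\alpha$ and scan the resulting one-variable objective $\alpha^2+(1-\alpha)^2$ over a grid:

```python
# Numerically confirm that alpha^2 + (1 - alpha)^2, i.e. Var(X_x) / sigma^2
# under the unbiasedness constraint alpha + beta = 1, is minimized at alpha = 1/2.
def objective(alpha):
    beta = 1.0 - alpha           # unbiasedness constraint
    return alpha**2 + beta**2    # Var(X_x) in units of sigma^2

# Scan a fine grid of alpha values in [-1, 2].
grid = [i / 1000.0 for i in range(-1000, 2001)]
best = min(grid, key=objective)
print(best, objective(best))     # prints 0.5 0.5
```

This matches the identity above: the $(\alpha+\beta)^2$ term is fixed at $1$, so the variance is smallest exactly when $\alpha=\beta$.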