Observations $y_1,y_2,y_3$ were taken on the random variables $Y_1,Y_2,Y_3$, where
$Y_1=\theta+e_1$
$Y_2=2\theta-\phi+e_2$
$Y_3=\theta+2\phi+e_3$
$E(e_i)=0$, $\operatorname{var}(e_i)=\sigma^2$, $\operatorname{cov}(e_i,e_j)=0$ for $i\ne j$. Find estimates for $\theta,\phi$.
I don't understand why I need the last line (i.e., why I need the covariance and variance of the $e_i$) in this problem.
What I did is form $$Q=\sum_i \bigl(y_i-E(Y_i)\bigr)^2$$ and minimise it by solving $\partial Q/\partial\theta=0$ and $\partial Q/\partial\phi=0$.
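As a concrete numerical check (using made-up observed values, since the problem gives no data), minimising $Q$ is just ordinary least squares on the design matrix built from the three mean equations:

```python
import numpy as np

# Design matrix: rows are the coefficients of (theta, phi) in
# E(Y_1) = theta, E(Y_2) = 2*theta - phi, E(Y_3) = theta + 2*phi
X = np.array([[1.0,  0.0],
              [2.0, -1.0],
              [1.0,  2.0]])

# Hypothetical observations y_1, y_2, y_3 (illustration only)
y = np.array([1.2, 1.9, 3.1])

# Minimising Q = sum_i (y_i - E(Y_i))^2 is ordinary least squares
theta_hat, phi_hat = np.linalg.lstsq(X, y, rcond=None)[0]

# Closed form from the normal equations X^T X b = X^T y
# (here X^T X = diag(6, 5), so the system decouples):
#   theta_hat = (y1 + 2*y2 + y3)/6
#   phi_hat   = (2*y3 - y2)/5
```

The closed-form expressions follow because $X^\top X$ happens to be diagonal for this model, so the two normal equations solve independently.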
Don't I need to find second derivatives to prove it's a minimum?
1) You do have to check the "second derivative", which in your case is the Hessian matrix with respect to $\theta$ and $\phi$, and verify that it is positive definite. In general, whenever the OLS estimator can be computed (i.e., the design matrix has full column rank), this Hessian is necessarily positive definite.
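For this particular model the check is easy to do explicitly: writing $Q=\lVert y - Xb\rVert^2$ with $b=(\theta,\phi)^\top$, the Hessian is $2X^\top X$, which does not depend on the data. A quick sketch:

```python
import numpy as np

# Design matrix of the model (rows: coefficients of theta, phi in E(Y_i))
X = np.array([[1.0,  0.0],
              [2.0, -1.0],
              [1.0,  2.0]])

# For Q = ||y - X b||^2, the gradient is -2 X^T (y - X b),
# so the Hessian w.r.t. (theta, phi) is 2 X^T X, independent of y.
H = 2 * X.T @ X   # equals [[12, 0], [0, 10]] for this model

# Positive definite iff all eigenvalues are strictly positive
eigvals = np.linalg.eigvalsh(H)
print(eigvals)    # both positive, so the critical point is a minimum
```

Since both eigenvalues are positive, the stationary point found from the first-order conditions is indeed the global minimiser of $Q$.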
2) You need the finite-variance assumption if you want your estimators to be consistent. More generally, if you want your OLS estimators to be efficient (BLUE, by the Gauss–Markov theorem), then you do need the assumptions of constant variance and uncorrelated errors. Otherwise, you can still compute the OLS estimates, but they will no longer have the smallest variance among linear unbiased estimators, and the usual formulas for their variances will be invalid.