Not sure where to start here.
A few things I know:
$$ E(Y_i) = E(\beta_0 + \beta_1X_i + \epsilon_i)$$
$$ E(Y_i) = \beta_0 + \beta_1X_i + E(\epsilon_i) $$
$$ E(Y_i) = \beta_0 + \beta_1X_i\,,$$ since $ E(\epsilon_i) = 0 $.
Then using the definition of Covariance in terms of expectation,
\begin{align} \operatorname{Cov}(Y_i, Y_j)&= E(Y_iY_j) - E(Y_i)E(Y_j) \\&= E\big[(\beta_0 + \beta_1X_i + \epsilon_i)(\beta_0 + \beta_1X_j + \epsilon_j)\big] - E(\beta_0 + \beta_1X_i + \epsilon_i)\,E(\beta_0 + \beta_1X_j + \epsilon_j) \\&= E\big[(\beta_0 + \beta_1X_i + \epsilon_i)(\beta_0 + \beta_1X_j + \epsilon_j)\big] - (\beta_0 + \beta_1X_i)(\beta_0 + \beta_1X_j) \end{align}
Not sure what to do from here. I don't know if this is the right or wrong way to do this proof. Am I missing an important assumption needed to finish it?
Recall that the covariance operator is invariant under shifts by constants, that is, $$ \operatorname{Cov}(a+X, c+Y) = \operatorname{Cov}(X,Y). $$ Thus, if you assume that the $X_i$ are constants, you have that
$$ \operatorname{Cov}(Y_i, Y_j) = \operatorname{Cov}(\epsilon_i, \epsilon_j) = 0, $$ where the last equality uses the standard assumption that the error terms are uncorrelated for $i \neq j$.
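As a numerical sanity check (not a proof), you can simulate the model with fixed regressors and independent errors and verify that the sample covariance between two responses is near zero. The values of $\beta_0$, $\beta_1$, and the $x_i$ below are arbitrary choices for illustration:

```python
# Simulate Y_i = b0 + b1 * x_i + eps_i many times with FIXED x_i and
# independent errors, then estimate Cov(Y_1, Y_2) across replications.
import numpy as np

rng = np.random.default_rng(0)
b0, b1 = 2.0, 0.5                 # arbitrary illustrative coefficients
x = np.array([1.0, 3.0])          # fixed (non-random) regressor values
n_reps = 200_000

eps = rng.normal(0.0, 1.0, size=(n_reps, 2))  # uncorrelated error draws
Y = b0 + b1 * x + eps                          # each row is one (Y_1, Y_2) draw

cov_hat = np.cov(Y[:, 0], Y[:, 1])[0, 1]
print(f"sample Cov(Y_1, Y_2) = {cov_hat:.4f}")  # should be close to 0
```

The sample covariance fluctuates around zero at roughly the $1/\sqrt{n}$ scale, consistent with $\operatorname{Cov}(Y_i, Y_j) = \operatorname{Cov}(\epsilon_i, \epsilon_j) = 0$.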