Suppose $Y_1, Y_2, \ldots, Y_n$ are random variables, $\varepsilon_1, \varepsilon_2, \ldots, \varepsilon_n$ are i.i.d. with mean $0$ and unknown variance $\sigma^2$, and $\hat\beta_0$, $\hat\beta_1$, $\hat\sigma^2$ are the maximum likelihood estimators.
Given $Y_i = \beta_0 + \beta_1 x_i + \varepsilon_i$,
how would you prove
$E\left[(\hat\beta_0 + \hat\beta_1)\,\hat\sigma^2\right] = (\beta_0 + \beta_1)\,\sigma^2$, i.e. that $(\hat\beta_0 + \hat\beta_1)\,\hat\sigma^2$ is an unbiased estimator of $(\beta_0 + \beta_1)\,\sigma^2$?
Is the following correct?
$E\left[(\hat\beta_0 + \hat\beta_1)\,\hat\sigma^2\right] = E\left[E\left[(\hat\beta_0 + \hat\beta_1)\,\hat\sigma^2 \mid \hat\sigma^2\right]\right] = E[\hat\sigma^2]\cdot E[\hat\beta_0 + \hat\beta_1] = (\beta_0 + \beta_1)\,\sigma^2$
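As a numerical sanity check (separate from any proof), here is a minimal Monte Carlo sketch in Python. It assumes normally distributed errors, a fixed design $x_1,\ldots,x_n$, and example values for $\beta_0$, $\beta_1$, $\sigma^2$, $n$ that are not part of the question; it uses the ML variance estimate $\mathrm{SSE}/n$ and simply prints the simulated mean of $(\hat\beta_0+\hat\beta_1)\,\hat\sigma^2$ next to $(\beta_0+\beta_1)\,\sigma^2$ so the two can be compared.

```python
import numpy as np

# Monte Carlo comparison of E[(b0_hat + b1_hat) * sigma2_hat] with (beta0 + beta1) * sigma^2.
# Example values below (beta0, beta1, sigma, n) are assumptions, not taken from the question.
rng = np.random.default_rng(0)
beta0, beta1, sigma, n, reps = 1.0, 2.0, 1.0, 20, 100_000
x = np.linspace(0.0, 1.0, n)                  # fixed design points

products = np.empty(reps)
for r in range(reps):
    eps = rng.normal(0.0, sigma, size=n)      # i.i.d. errors, mean 0, variance sigma^2
    y = beta0 + beta1 * x + eps
    # Least-squares (= ML under normality) estimates of slope and intercept
    b1_hat = np.cov(x, y, bias=True)[0, 1] / np.var(x)
    b0_hat = y.mean() - b1_hat * x.mean()
    resid = y - (b0_hat + b1_hat * x)
    sigma2_hat = np.sum(resid**2) / n         # ML variance estimate (SSE / n)
    products[r] = (b0_hat + b1_hat) * sigma2_hat

print("Simulated mean of (b0_hat + b1_hat) * sigma2_hat:", products.mean())
print("Claimed value (beta0 + beta1) * sigma^2:         ", (beta0 + beta1) * sigma**2)
```

If $\hat\sigma^2$ is taken to be $\mathrm{SSE}/(n-2)$ instead of the ML estimate $\mathrm{SSE}/n$, the simulated mean can be expected to sit closer to the claimed value; the sketch is only meant to show how the two sides can be compared numerically under these assumptions.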