Write the likelihood in terms of the unknown parameter $\beta$, $f_x$ and $g$


Consider $n$ i.i.d. pairs of random variables $(X_i, Y_i)$, $i = 1, \dots, n$, where $X_i \in \mathbb{R}^p$ ($p \ge 1$) and $Y_i \in \mathbb{R}$. For each $i$, write $Y_i = X_i'\beta + \varepsilon_i$, where $\mathbb{E}[\varepsilon_i] = 0$, $\operatorname{cov}(X_i, \varepsilon_i) = 0$, and $\beta$ is an unknown vector that we want to estimate. Assume that for all $x \in \mathbb{R}^p$, $\varepsilon_1$ has a conditional density given $X_1 = x$, denoted by $f_x$, and that $X_1$ has a density, which we denote by $g$.
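One step that is implicit in the setup can be made explicit (this is just a change of variables, not an extra assumption): conditional on $X_i = x$, the map $\varepsilon_i \mapsto x'\beta + \varepsilon_i$ is a deterministic shift, so the conditional density of $Y_i$ is a shifted copy of $f_x$:

$$f(Y_i = y \mid X_i = x) = f(\varepsilon_i = y - x'\beta \mid X_i = x) = f_x(y - x'\beta).$$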

$Y_i = X_i'\beta + \varepsilon_i$;

Since $X_1$ has density $g$, I marginalize over $x$ (an integral over $\mathbb{R}^p$, not a sum, because $X_i$ is continuous):

$$f_{Y_i}(y) = \int_{\mathbb{R}^p} f(Y_i = y \mid X_i = x)\, g(x)\, dx = \int_{\mathbb{R}^p} f(\varepsilon_i = y - x'\beta \mid X_i = x)\, g(x)\, dx = \int_{\mathbb{R}^p} f_x(y - x'\beta)\, g(x)\, dx.$$
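As a numerical sanity check of the representation $f_{Y}(y) = \int f_x(y - x'\beta)\, g(x)\, dx$, here is a minimal sketch in a hypothetical Gaussian special case. Everything specific below is my assumption, not part of the exercise: $p = 1$, $X \sim N(0,1)$ with density $g$, $\varepsilon$ independent of $X$ with $\varepsilon \sim N(0, \sigma^2)$ (so $f_x$ does not depend on $x$). Then $Y = \beta X + \varepsilon \sim N(0, \beta^2 + \sigma^2)$, and the integral can be checked against that closed form:

```python
import numpy as np

# Hypothetical Gaussian special case (my assumption, not part of the exercise):
# p = 1, X ~ N(0, 1) with density g, eps | X = x ~ N(0, sigma^2),
# so f_x does not actually depend on x here.
beta, sigma = 2.0, 0.5

def g(x):
    # Standard normal density of X
    return np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

def f_x(e):
    # Conditional density of eps given X = x (here: N(0, sigma^2), same for all x)
    return np.exp(-e**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

def marginal_density(y, n_grid=20001):
    # f_Y(y) = integral of f_x(y - x * beta) * g(x) dx, on a grid over [-10, 10];
    # the integrand decays to ~0 at the endpoints, so a simple Riemann sum is accurate
    x = np.linspace(-10.0, 10.0, n_grid)
    dx = x[1] - x[0]
    return np.sum(f_x(y - x * beta) * g(x)) * dx

def closed_form(y):
    # Y = beta * X + eps ~ N(0, beta^2 + sigma^2)
    v = beta**2 + sigma**2
    return np.exp(-y**2 / (2 * v)) / np.sqrt(2 * np.pi * v)

for y in [-1.0, 0.0, 2.5]:
    assert abs(marginal_density(y) - closed_form(y)) < 1e-6
```

The check passing for this special case does not prove anything about the general problem, but it confirms that the integral (rather than a sum over coordinates) is the right marginalization.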

Where can I go from here? Can I assume that $\varepsilon_i$ and $X_i$ are independent? I think not, since zero covariance implies independence only in special cases (e.g. jointly Gaussian variables), and the distribution of $\varepsilon_i$ can be anything. Multiplying the marginals over the $n$ observations gives $$\prod_{i=1}^n f_{Y_i}(Y_i) = \prod_{i=1}^n \int_{\mathbb{R}^p} f_x(Y_i - x'\beta)\, g(x)\, dx.$$
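For comparison, here is a sketch of what the likelihood looks like if one uses the joint density of the observed pairs $(X_i, Y_i)$ rather than the marginal of $Y_i$ alone; that this is what the exercise intends is my assumption, since both $X_i$ and $Y_i$ are observed:

$$L(\beta) = \prod_{i=1}^n f_{(X_i, Y_i)}(X_i, Y_i) = \prod_{i=1}^n f(Y_i \mid X_i)\, g(X_i) = \prod_{i=1}^n f_{X_i}(Y_i - X_i'\beta)\, g(X_i),$$

where $f_{X_i}(\cdot)$ denotes the conditional density $f_x$ evaluated at $x = X_i$. Note that no independence between $\varepsilon_i$ and $X_i$ is needed here, because $f_x$ is already defined as a conditional density.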

Can you give me a hint?