The response $Y$ in a regression is a Bernoulli r.v.; express the log-likelihood function and the partial derivatives


Any help with these questions would be much appreciated, thank you.

Assume that the response variable $Y$ in a regression problem is a Bernoulli random variable, that is $Y \sim \mathrm{Be}(\pi(\beta'x))$, where $\pi(\beta'x)$ is a logistic function, $\beta'x=\beta_0+\beta_1x_1+\beta_2x_2+\dots+\beta_px_p$ and $x=(1,x_1,x_2,\dots,x_p)$, i.e., $Y$ follows a logistic regression model.
Let $(x_1,y_1),(x_2,y_2),\dots,(x_n,y_n)$ be a data set of independent samples, where $y_i \in \{0,1\}$ and $x_i=(1,x_{i1},x_{i2},\dots,x_{ip})$, $i=1,2,\dots,n$.

a) Show that for all real $\beta$, the log likelihood function $l(\beta)$ can be written as
$l(\beta)=\sum_{i=1}^n y_i\beta'x_i-\sum_{i=1}^n \ln(1+\exp(\beta'x_i))$.

b) Find the partial derivatives $\frac{\partial}{\partial\beta_0} l(\beta)$ and $\frac{\partial^2}{\partial\beta_0^2} l(\beta)$ in the form they would appear in a recursive algorithm like Newton-Raphson for finding the maximum likelihood estimate.



  1. $y_i \sim \mathcal{B}er(p_i)$, with $p_i \equiv p(x_i'\beta) = (1 + \exp\{-x_i'\beta\})^{-1}$. Then $$ \ln p(x_i'\beta) =\ln \left(\frac{e^{x_i'\beta}}{1 +e^{x_i'\beta}} \right) = x_i'\beta - \ln (1+e^{x_i'\beta}) $$ and $$ \ln\left(1- p(x_i'\beta)\right) = \ln\left(\frac{1}{1+e^{x_i'\beta}}\right) = -\ln(1+e^{x_i'\beta}). $$ The likelihood is $$ \mathcal{L}(\beta)=\prod_{i=1}^n p(x_i'\beta)^{y_i}(1-p(x_i'\beta))^{1-y_i}, $$ so $$ l(\beta) = \sum y_i \ln p(x_i'\beta) + \sum (1- y_i)\ln( 1- p(x_i'\beta)) $$ $$ = \sum y_i x_i'\beta - \sum y_i \ln(1 + e^{x_i'\beta}) - \sum (1-y_i)\ln(1 + e^{x_i'\beta}) = \sum y_i x_i'\beta - \sum \ln(1 + e^{x_i'\beta}). $$
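As a numerical sanity check of the algebra above, here is a minimal Python sketch comparing the two forms of $l(\beta)$ on random data (the function names `loglik_direct` and `loglik_simplified` are just illustrative, not from the original):

```python
import math
import random

def loglik_direct(beta, X, y):
    # l(beta) = sum y_i ln p_i + (1 - y_i) ln(1 - p_i), with p_i = 1/(1 + exp(-x_i'beta))
    total = 0.0
    for xi, yi in zip(X, y):
        eta = sum(b * x for b, x in zip(beta, xi))   # x_i'beta
        p = 1.0 / (1.0 + math.exp(-eta))             # logistic function
        total += yi * math.log(p) + (1 - yi) * math.log(1 - p)
    return total

def loglik_simplified(beta, X, y):
    # The simplified form: l(beta) = sum y_i x_i'beta - sum ln(1 + exp(x_i'beta))
    total = 0.0
    for xi, yi in zip(X, y):
        eta = sum(b * x for b, x in zip(beta, xi))
        total += yi * eta - math.log(1.0 + math.exp(eta))
    return total

random.seed(0)
X = [(1.0, random.gauss(0, 1)) for _ in range(20)]   # rows x_i = (1, x_i1)
y = [random.randint(0, 1) for _ in range(20)]
beta = (0.3, -0.7)
print(abs(loglik_direct(beta, X, y) - loglik_simplified(beta, X, y)) < 1e-10)  # True
```

The two functions agree to floating-point precision, as they must, since the second is an algebraic rearrangement of the first.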

  2. Since $x_{i0}=1$, show that $$ \frac{\partial}{\partial\beta_0}\, p(x_i'\beta) = p_i(1-p_i) = \frac{1}{1+e^{x_i'\beta}}\times\frac{e^{x_i'\beta}}{1+e^{x_i'\beta}}, $$ from which $$ \frac{\partial}{\partial\beta_0}\, l(\beta) = \sum_{i=1}^n (y_i - p_i), \qquad \frac{\partial^2}{\partial\beta_0^2}\, l(\beta) = -\sum_{i=1}^n p_i(1-p_i), $$ so the Newton-Raphson update for $\beta_0$ is $\beta_0^{(t+1)} = \beta_0^{(t)} - \left(\frac{\partial l}{\partial\beta_0}\right)\big/\left(\frac{\partial^2 l}{\partial\beta_0^2}\right)$, evaluated at the current iterate.
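The update above can be sketched as a one-dimensional Newton-Raphson iteration in $\beta_0$ (holding $\beta_1,\dots,\beta_p$ fixed). This is a minimal illustration, not the full multivariate algorithm; the helper name `newton_beta0` is hypothetical:

```python
import math

def newton_beta0(X, y, beta, iters=25):
    """Newton-Raphson in beta_0 only, holding beta_1..beta_p fixed.
    Uses dl/dbeta0 = sum(y_i - p_i) and d2l/dbeta0^2 = -sum p_i (1 - p_i)."""
    beta = list(beta)
    for _ in range(iters):
        grad, hess = 0.0, 0.0
        for xi, yi in zip(X, y):
            eta = sum(b * x for b, x in zip(beta, xi))   # x_i'beta
            p = 1.0 / (1.0 + math.exp(-eta))             # p_i
            grad += yi - p            # first partial; x_{i0} = 1
            hess -= p * (1.0 - p)     # second partial (always negative)
        beta[0] -= grad / hess        # Newton step: beta0 - l'/l''
    return beta[0]

# Intercept-only example: the MLE satisfies mean(p_i) = mean(y_i),
# so beta0 converges to logit(y-bar) = ln(0.75/0.25) = ln 3.
X = [(1.0,)] * 4
y = [1, 1, 1, 0]
print(round(newton_beta0(X, y, [0.0]), 4))  # 1.0986
```

Because $l(\beta)$ is concave in $\beta_0$ (the second derivative is strictly negative), the iteration converges quickly from any starting point.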