Convexity and likelihood

I have the following function:

$-\log L(\beta \mid t, X) = -\sum_{n=1}^{N} \left\{ t_n \log(y_n) + (1 - t_n) \log(1 - y_n) \right\}$

where $y_n = (1 + \exp(-x_n'\beta))^{-1}$, $\beta$ and $x_n$ are both $k \times 1$ vectors, $t$ is a vector of zeros and ones (the binary outcomes $t_n$), and $X$ is an $N \times k$ matrix of stacked $x_n'$ vectors. Basically it is an application of the maximum likelihood method, where $x_n$ represents the data and $y_n$ represents the probability of success of a binary variable.
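For concreteness, the objective above can be sketched numerically as follows (the design matrix, labels, and $\beta$ in the example are made up for illustration):

```python
import numpy as np

def neg_log_likelihood(beta, X, t):
    """Negative log-likelihood -log L(beta | t, X) of the logistic model."""
    y = 1.0 / (1.0 + np.exp(-X @ beta))  # y_n = (1 + exp(-x_n' beta))^{-1}
    return -np.sum(t * np.log(y) + (1 - t) * np.log(1 - y))

# Example: N = 3 observations, k = 2 regressors (hypothetical data)
X = np.array([[1.0, 2.0], [0.5, -1.0], [2.0, 0.0]])
t = np.array([1.0, 0.0, 1.0])

# At beta = 0 every y_n = 0.5, so the objective equals N * log(2)
print(neg_log_likelihood(np.zeros(2), X, t))
```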

I need to show that this function (which I minimize with respect to $\beta$) is convex, so that the extremum is a minimum. I know the rules about the Hessian matrix, but I have trouble applying them. The Hessian is given by

$H = \sum_{n=1}^{N} y_n (1 - y_n) \, x_n x_n' = X'RX$

where $R$ is a diagonal matrix with entries $R_{nn}=y_n \cdot (1-y_n)$
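Since $0 < y_n < 1$, every diagonal entry of $R$ is positive, so for any vector $z$ one has $z'Hz = z'X'RXz = \sum_n y_n(1-y_n)(x_n'z)^2 \ge 0$, i.e. $H$ is positive semi-definite. A quick numerical sanity check of this (with a made-up design matrix and $\beta$) could look like:

```python
import numpy as np

def hessian(beta, X):
    """Hessian H = X' R X of the negative log-likelihood, R_nn = y_n (1 - y_n)."""
    y = 1.0 / (1.0 + np.exp(-X @ beta))  # y_n = (1 + exp(-x_n' beta))^{-1}
    R = np.diag(y * (1.0 - y))           # diagonal weight matrix R
    return X.T @ R @ X

# Hypothetical data and parameter vector
X = np.array([[1.0, 2.0], [0.5, -1.0], [2.0, 0.0]])
beta = np.array([0.3, -0.7])

H = hessian(beta, X)
eigvals = np.linalg.eigvalsh(H)  # all eigenvalues should be non-negative (H is PSD)
print(eigvals)
```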