I am currently working on a modified version of this paper. In Section V-B, I am stuck on the item-parameter estimation of the log-likelihood function for the Item Response Theory model.
I will briefly summarize the problem as stated in the paper.
Given the log-likelihood function,
$$L=\sum_{g=1}^{K}\left[r_{ig}\log P_i(\theta_g)+(f_g-r_{ig})\log\bigl(1-P_i(\theta_g)\bigr)\right]$$
where $$P_{ij}=P_i(\theta_j)=\frac{1}{1+e^{-\alpha_i(\theta_j-\beta_i)}},$$
find the item parameters $\xi_i=(\alpha_i,\beta_i)$ that maximize the log-likelihood function.
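To make sure I understand the objective correctly, here is how I am currently evaluating the log-likelihood for a single item in Python (a minimal sketch; the function and variable names `icc`, `log_likelihood`, `theta`, `r`, `f` are my own, not from the paper):

```python
import numpy as np

def icc(theta, alpha, beta):
    """Two-parameter logistic item characteristic curve P_i(theta)."""
    return 1.0 / (1.0 + np.exp(-alpha * (theta - beta)))

def log_likelihood(alpha, beta, theta, r, f):
    """Log-likelihood L for one item over K ability points.

    theta : (K,) ability values theta_g
    r     : (K,) correct-response counts r_ig
    f     : (K,) total counts f_g
    """
    p = icc(theta, alpha, beta)
    return np.sum(r * np.log(p) + (f - r) * np.log(1.0 - p))
```

Note that for fixed $i$ this is a scalar-valued function of only the two parameters $(\alpha_i, \beta_i)$.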
For this the Newton-Raphson method is used, which, given the partial derivatives
$L_1=\frac{\partial L}{\partial \alpha_i}$ , $L_2=\frac{\partial L}{\partial \beta_i}$ , $L_{11}=\frac{\partial^2 L}{\partial \alpha_i^2}$ , $L_{22}=\frac{\partial^2 L}{\partial \beta_i^2}$ , $L_{12}=L_{21}=\frac{\partial^2 L}{\partial \alpha_i\,\partial \beta_i}$
estimates the parameters iteratively as follows:
$$\left[ \begin{matrix} \hat{\alpha_i}\\ \hat{\beta_i}\\ \end{matrix} \right]_{t+1} = \left[ \begin{matrix} \hat{\alpha_i}\\ \hat{\beta_i}\\ \end{matrix} \right]_{t} - \left[ \begin{matrix} L_{11}&L_{12}\\ L_{21}&L_{22}\\ \end{matrix} \right]_{t}^{-1} \times \left[ \begin{matrix} L_{1}\\ L_{2}\\ \end{matrix} \right]_{t} $$
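For reference, here is how I currently imagine one such update step in Python. The closed-form expressions for $L_1$, $L_2$, $L_{11}$, $L_{12}$, $L_{22}$ below are my own derivation from the 2PL log-likelihood above (they may contain mistakes, which is partly why I am asking):

```python
import numpy as np

def newton_step(alpha, beta, theta, r, f):
    """One Newton-Raphson update for xi_i = (alpha_i, beta_i) of a single item.

    theta, r, f : (K,) arrays of ability points theta_g, correct counts r_ig,
    and totals f_g. Derivative formulas are my own derivation from the 2PL
    log-likelihood.
    """
    p = 1.0 / (1.0 + np.exp(-alpha * (theta - beta)))  # P_i(theta_g)
    e = r - f * p                 # residuals r_g - f_g * P_g
    w = f * p * (1.0 - p)         # weights f_g * P_g * (1 - P_g)
    d = theta - beta
    L1 = np.sum(e * d)                       # dL/d(alpha)
    L2 = -alpha * np.sum(e)                  # dL/d(beta)
    L11 = -np.sum(w * d**2)                  # d2L/d(alpha)^2
    L22 = -alpha**2 * np.sum(w)              # d2L/d(beta)^2
    L12 = alpha * np.sum(w * d) - np.sum(e)  # mixed partial
    H = np.array([[L11, L12], [L12, L22]])
    g = np.array([L1, L2])
    # Solve H @ delta = g rather than forming the inverse explicitly.
    delta = np.linalg.solve(H, g)
    return alpha - delta[0], beta - delta[1]
```

Written this way, each item's update is just a 2x2 linear solve, which is part of what makes me wonder whether the per-item formulation is equivalent to the joint one.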
Now, evaluated across all items, each of the above partial derivatives would be some $N$-dimensional vector. How does one go about evaluating the inverse, and subsequently the multiplication, of such a matrix? Moreover, is solving the equation for each $i^{\text{th}}$ item separately equivalent to solving the entire system in one go?
Lastly (assuming I am permitted to ask this here), are there any packages available in Python/Matlab/Java/C++ that would help implement the above solution conveniently?
Thanks in advance.