I am trying to find the determinant of an $N^2\times N^2$ matrix of the form $$\boldsymbol{A} = \begin{pmatrix} \boldsymbol{A}_0^N & I_N & \cdots & I_N \\ I_N & \boldsymbol{A}_1^N & \cdots & I_N \\ \vdots & \vdots & \ddots & \vdots \\ I_N & I_N & \cdots & \boldsymbol{A}_{N-1}^N \end{pmatrix},$$ where the diagonal blocks $\boldsymbol{A}_0^N,\boldsymbol{A}_1^N,\dots,\boldsymbol{A}_{N-1}^N$ are $N\times N$ matrices and every off-diagonal block is the $N\times N$ identity $I_N$. I saw the paper "Determinants of block tridiagonal matrices" by Luca Guido Molinari, where he lets $\Psi$ be a vector such that $$\boldsymbol{A}\Psi=0,$$ with components $\psi_k\in\mathbb{C}^N$ for $k=0,1,2,\dots,N-1$. In Lemma 1 of the paper he derives two boundary terms and one term which he transforms into a transfer matrix. In my case I get a system of equations of the form $$\boldsymbol{A}_0^N\psi_0+\psi_1+\cdots+\psi_{N-1}=0$$ $$\psi_0+\boldsymbol{A}_1^N\psi_1+\psi_2+\cdots+\psi_{N-1}=0$$ $$\vdots$$ $$\psi_0+\psi_1+\psi_2+\cdots+\boldsymbol{A}_{N-1}^N\psi_{N-1}=0.$$ I'm not sure whether this can be turned into a transfer-matrix form. The only compact form I can find is $$\boldsymbol{A}_k^N\psi_k+\sum_{j=0,\,j\neq k}^{N-1}\psi_j=0$$ for each $k$. I don't know if this is the right way to approach the problem. I have tried row manipulation and induction, but still no luck; any pointers?
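For experimenting, here is a small numerical sketch (NumPy assumed; variable names like `blocks` are my own) that builds $\boldsymbol{A}$ from random $N\times N$ diagonal blocks, with identity blocks everywhere else, and computes its determinant by brute force as a reference value:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 4
blocks = [rng.standard_normal((N, N)) for _ in range(N)]  # the A_j^N

# Start with I_N in every off-diagonal block and zero blocks on the diagonal...
A = np.kron(np.ones((N, N)) - np.eye(N), np.eye(N))
# ...then place A_j^N in the j-th diagonal block.
for j, Aj in enumerate(blocks):
    A[j*N:(j+1)*N, j*N:(j+1)*N] = Aj

det_direct = np.linalg.det(A)
```

Any closed-form answer can be checked against `det_direct` for small $N$.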
EDIT: Let $\tilde{I}^N_j=diag(0,\dots,0,1,0,\dots,0)$ be the $N\times N$ matrix whose only non-zero entry is a $1$ in the $(j+1)$-th diagonal position, and let $$M_N = \begin{pmatrix} 0 & 1 & \cdots & 1 \\ 1 & 0 & \cdots & 1 \\ \vdots & \vdots & \ddots & \vdots \\ 1 & 1 & \cdots & 0 \end{pmatrix}$$ be the $N\times N$ matrix with zeros on the diagonal and ones everywhere else. Then $$\boldsymbol{A}=\sum_{j=0}^{N-1}\tilde{I}^N_j\otimes\boldsymbol{A}^N_j+M_N\otimes I_N.$$ Now let $P$ be such that $D=diag(\lambda_1,\dots,\lambda_N)=PM_NP^{-1}$. Then $$\det(\boldsymbol{A})=\det\left((P\otimes I_N)\Big(\sum_{j=0}^{N-1}\tilde{I}^N_j\otimes\boldsymbol{A}^N_j+M_N\otimes I_N\Big)(P^{-1}\otimes I_N)\right)=\det\left(\sum_{j=0}^{N-1}P\tilde{I}^N_jP^{-1}\otimes\boldsymbol{A}^N_j+D\otimes I_N\right).$$ To find $P$ and $P^{-1}$ we need the eigenvalues and eigenvectors of $M_N$.
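The decomposition and the similarity step can be sanity-checked numerically (NumPy assumed; since $M_N$ is symmetric, `eigh` returns an orthogonal eigenvector matrix with $P^{-1}=P^T$, which here plays the role of the explicit $P$ constructed by hand):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 4
blocks = [rng.standard_normal((N, N)) for _ in range(N)]

M = np.ones((N, N)) - np.eye(N)          # M_N
A = np.kron(M, np.eye(N))
for j, Aj in enumerate(blocks):
    Itilde = np.zeros((N, N))
    Itilde[j, j] = 1.0                   # \tilde{I}^N_j
    A += np.kron(Itilde, Aj)

evals, P = np.linalg.eigh(M)             # M = P diag(evals) P^T
D = np.diag(evals)

# Conjugating by P^T (x) I_N leaves the determinant unchanged:
L = np.kron(D, np.eye(N))
for j, Aj in enumerate(blocks):
    Itilde = np.zeros((N, N))
    Itilde[j, j] = 1.0
    L += np.kron(P.T @ Itilde @ P, Aj)
```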
We need $$\det(M_N-\lambda I_N)= \begin{vmatrix} -\lambda & 1 & \cdots & 1 \\ 1 & -\lambda & \cdots & 1 \\ \vdots & \vdots & \ddots & \vdots \\ 1 & 1 & \cdots & -\lambda \end{vmatrix}.$$ Adding every row below row 1 to row 1 (i.e. $r_1\to r_1+r_2+\cdots+r_N$) gives $$\det(M_N-\lambda I_N)= \begin{vmatrix} -\lambda+N-1 & -\lambda+N-1 & \cdots & -\lambda+N-1 \\ 1 & -\lambda & \cdots & 1 \\ \vdots & \vdots & \ddots & \vdots \\ 1 & 1 & \cdots & -\lambda \end{vmatrix}.$$ Now subtract column 1 from column 2, column 1 from column 3, and so on: $$\det(M_N-\lambda I_N)= \begin{vmatrix} -\lambda+N-1 & 0 & \cdots & 0 \\ 1 & -\lambda-1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 1 & 0 & \cdots & -\lambda-1 \end{vmatrix}.$$ Therefore $$\det(M_N-\lambda I_N)=(-\lambda+N-1)(-\lambda-1)^{N-1},$$ so we have one eigenvalue $\lambda=N-1$ and $N-1$ eigenvalues $\lambda=-1$. As a check, $Tr(M_N)=0$ and (using the same row and column manipulations) $\det(M_N)=(-1)^{N-1}(N-1)$, which agree with $Tr(M_N)=\sum_{l}\lambda_l=(N-1)+\sum_{j=1}^{N-1}(-1)=N-1-(N-1)=0$ and $\det(M_N)=\prod_{l}\lambda_l=(N-1)(-1)^{N-1}$. Now we compute the eigenvectors. For $\lambda=N-1$ one finds the eigenvector $\boldsymbol{v}_1=(1,\cdots,1)^{T}$. Because of the degeneracy we have to be careful with $\lambda=-1$: row reduction of $M_N+I_N$ leaves only one non-zero row, so an eigenvector $\boldsymbol{v}=(x_1,\cdots,x_N)^{T}$ need only satisfy $x_1+x_2+\cdots +x_N=0$. Thus we may pick $\boldsymbol{v}_2=(1,-1,0,\cdots,0)^T$, $\boldsymbol{v}_3=(1,0,-1,0,\cdots,0)^T$,..., $\boldsymbol{v}_N=(1,0,0,\cdots, 0,-1)^T$. Therefore $P=(\boldsymbol{v}_1\hspace{2mm}\boldsymbol{v}_2\cdots\hspace{2mm}\boldsymbol{v}_N)$ is the $N\times N$ matrix whose columns are the eigenvectors $\boldsymbol{v}_i$.
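The spectrum claimed above is easy to confirm numerically (NumPy assumed): one eigenvalue $N-1$, an $(N-1)$-fold eigenvalue $-1$, and matching trace and determinant.

```python
import numpy as np

N = 6
M = np.ones((N, N)) - np.eye(N)   # M_N
evals = np.sort(np.linalg.eigvalsh(M))
```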
$P^{-1}$ is found by row reduction to be $$P^{-1}=\frac{1}{N}\begin{pmatrix} \boldsymbol{u}_1 \\ \boldsymbol{u}_2 \\ \vdots \\ \boldsymbol{u}_N \end{pmatrix},$$ where $\boldsymbol{u}_1=(1,\dots,1)$, $\boldsymbol{u}_2=(1,1-N,1,\dots,1)$, $\boldsymbol{u}_3=(1,1,1-N,1,\dots,1)$, ..., $\boldsymbol{u}_N=(1,\dots,1,1-N)$; that is, for $j>1$ the row $\boldsymbol{u}_j$ has $1-N$ in the $j$-th position and ones everywhere else. Now $$P\tilde{I}^N_jP^{-1}=\frac{1}{N}\boldsymbol{v}_{j+1}\boldsymbol{u}_{j+1},$$ so for example $$P\tilde{I}^N_1P^{-1}=\frac{1}{N}\boldsymbol{v}_2\boldsymbol{u}_2=\frac{1}{N}\begin{pmatrix} \boldsymbol{u}_2 \\ -\boldsymbol{u}_2 \\ \boldsymbol{0}\\ \vdots \\ \boldsymbol{0} \end{pmatrix}=\frac{1}{N}\begin{pmatrix} 1 & 1-N & 1 & \cdots & 1 \\ -1 & N-1 & -1 & \cdots & -1 \\ 0 & 0 & 0 & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & \cdots & 0 \end{pmatrix},$$ where $\boldsymbol{0}$ is a row of $N$ zeros.
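A numerical check of these explicit formulas (NumPy assumed; `V` holds the eigenvector columns, `U` the rows $\boldsymbol{u}_j$). It verifies that $P^{-1}P=I$, that $PM_NP^{-1}=D$ for this particular choice of eigenvectors, and the rank-one identity $P\tilde{I}^N_jP^{-1}=\frac{1}{N}\boldsymbol{v}_{j+1}\boldsymbol{u}_{j+1}$:

```python
import numpy as np

N = 5
e = np.eye(N)
# Columns v_1 = (1,...,1)^T and v_k = e_1 - e_k for k = 2,...,N:
V = np.column_stack([np.ones(N)] + [e[:, 0] - e[:, k] for k in range(1, N)])
# Rows u_1 = (1,...,1) and u_j with 1-N in the j-th position otherwise ones:
U = np.ones((N, N))
for j in range(1, N):
    U[j, j] = 1 - N
Pinv = U / N

M = np.ones((N, N)) - np.eye(N)                  # M_N
D = np.diag([N - 1.0] + [-1.0] * (N - 1))
```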
Therefore $$\boldsymbol{L}=\sum_{j=0}^{N-1}\frac{1}{N}\boldsymbol{v}_{j+1}\boldsymbol{u}_{j+1}\otimes\boldsymbol{A}^N_j+D\otimes I_N,$$ with $\det(\boldsymbol{A})=\det(\boldsymbol{L})$. This is as far as I have got: I still need to evaluate the sum defining $\boldsymbol{L}$ and then take the determinant. I can find the eigenvalues of each $\boldsymbol{A}_j^N$ if that helps.