I have an n x m matrix X and an n x n symmetric (invertible) matrix V -- V has to be n x n for the product below to be conformable. I can find the determinant of X'X and the determinant of V in the usual ways. What I want to know, but can't remember enough matrix algebra to work out, is the following: is the determinant of X'(V^-1)X (I use V^-1 to denote the inverse of V) always equal to the determinant of X'X divided by the determinant of V? Can someone show me a proof using matrix algebra? I have a feeling there is one if we express X and V in terms of their usual triple-product decompositions (e.g., the SVD of X and the eigendecomposition of V).
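One special case I can handle myself: when X is square (n = m), the identity follows from the multiplicativity of the determinant, det(X'V^-1X) = det(X') det(V^-1) det(X) = det(X)^2 / det(V) = det(X'X) / det(V). A quick NumPy sanity check of that square case (the matrices below are arbitrary examples I generated):

```python
import numpy as np

rng = np.random.default_rng(0)

n = 4
X = rng.standard_normal((n, n))    # square X: the n = m special case
A = rng.standard_normal((n, n))
V = A @ A.T + n * np.eye(n)        # symmetric positive definite V

lhs = np.linalg.det(X.T @ np.linalg.inv(V) @ X)
rhs = np.linalg.det(X.T @ X) / np.linalg.det(V)

print(np.isclose(lhs, rhs))       # agrees in the square case
```

It's the rectangular case (n > m), which is the one that matters for regression, that I can't see how to handle.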
The background is that X is a set of predictors in a multiple linear regression, Y = XB + e, where Y is the response vector, B is the vector of regression coefficients, and e is a vector of normally distributed errors. The solution for B is B = (X'X)^-1X'Y. If X'X doesn't have an inverse (i.e., its determinant is 0), then the regression model cannot be uniquely estimated, as there is at least one exact linear dependency among the predictors. The usual setting for multiple linear regression assumes that the errors in e are independent and identically distributed. I wondered whether the statement above about the inverse of X'X still holds when the observations of the outcome variable are correlated. In that circumstance the (generalized least squares) solution is B = (X'V^-1X)^-1X'V^-1Y, so we have to be able to invert X'V^-1X, where V is the symmetric covariance matrix of the errors, i.e., V = E(ee'). I wondered if including V makes any difference: if X'X doesn't have an inverse, will X'V^-1X ever have an inverse, remembering that V is symmetric? I think the answer is 'no' -- including V doesn't change things -- but that hinges on being able to prove the statement about the determinant that I outlined above.
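To check my intuition numerically, I tried a small example (NumPy; the specific numbers are arbitrary, with the second column of X deliberately made an exact multiple of the first so that X'X is singular):

```python
import numpy as np

rng = np.random.default_rng(1)

n, m = 5, 3
X = rng.standard_normal((n, m))
X[:, 1] = 2.0 * X[:, 0]            # exact linear dependency among predictors

A = rng.standard_normal((n, n))
V = A @ A.T + n * np.eye(n)        # symmetric positive definite V (n x n)

XtX = X.T @ X
XtVinvX = X.T @ np.linalg.inv(V) @ X

# both m x m Gram matrices come out rank deficient (rank 2 < m = 3),
# so neither is invertible
print(np.linalg.matrix_rank(XtX), np.linalg.matrix_rank(XtVinvX))
```

In every example I tried, the two matrices had the same rank, consistent with my guess that V doesn't rescue a rank-deficient X'X -- but I'd like a proof rather than examples.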
Thank you!