Zero Eigenvalue in covariance matrix


I have an $n \times n$ covariance matrix $\Sigma$ of jointly normal random variables $X_1, \ldots, X_n$, and I need to find $\mathbb{P}(X_n > c \mid X_1 = x_1, \ldots, X_{n-1} = x_{n-1})$. This requires inverting the covariance matrix (in fact, only its leading $(n-1) \times (n-1)$ block). The problem is that one of the eigenvalues (say the $i^{\rm th}$) is approximately zero, i.e., $\lambda_i \approx 0$.
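For reference, the standard conditioning formula (assuming mean vector $\mu$ and partitioning $\Sigma$ so that $\Sigma_{11}$ is the $(n-1)\times(n-1)$ block for $X_1,\ldots,X_{n-1}$, $\Sigma_{22}$ the scalar variance of $X_n$, and $\Sigma_{21} = \Sigma_{12}^\top$ the cross-covariances) is:

$$X_n \mid (X_1,\ldots,X_{n-1}) = x \;\sim\; \mathcal{N}\!\left(\mu_n + \Sigma_{21}\Sigma_{11}^{-1}(x - \mu_{1:n-1}),\; \Sigma_{22} - \Sigma_{21}\Sigma_{11}^{-1}\Sigma_{12}\right),$$

so only $\Sigma_{11}$ needs to be inverted, and the near-singularity matters only if it lives in that block.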

So I just removed the row and column of $\Sigma$ (and the corresponding observation $x_i$) associated with that $\lambda_i$. I know the proper approach would be an eigendecomposition, keeping only the eigenvalues and eigenvectors away from zero (i.e., dropping the direction corresponding to the $i^{\rm th}$ eigenvalue), but that requires additional processing.
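As a point of comparison, here is a minimal sketch of the conditioning computation that sidesteps the near-zero eigenvalue with a pseudoinverse rather than deleting a row/column. The covariance matrix, means, observed values, and threshold `c` below are all made-up illustrative data, not from the question:

```python
import numpy as np
from scipy.stats import norm

# Hypothetical example: a 4x4 covariance matrix that is nearly singular
# (rank ~3, so one eigenvalue is ~0), built from a rank-3 factor.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))
Sigma = A @ A.T + 1e-10 * np.eye(4)

mu = np.zeros(4)                    # assumed zero means
x_obs = np.array([0.5, -0.2, 0.1])  # observed X_1, ..., X_{n-1}
c = 1.0                             # threshold in P(X_n > c | ...)

# Partition Sigma into blocks for (X_1..X_{n-1}) and X_n.
S11 = Sigma[:-1, :-1]
S12 = Sigma[:-1, -1]
S22 = Sigma[-1, -1]

# Moore-Penrose pseudoinverse: directions with eigenvalues below the
# rcond cutoff are zeroed out instead of blowing up the inverse.
S11_pinv = np.linalg.pinv(S11, rcond=1e-8)

cond_mean = mu[-1] + S12 @ S11_pinv @ (x_obs - mu[:-1])
cond_var = S22 - S12 @ S11_pinv @ S12
cond_var = max(cond_var, 0.0)  # clip tiny negative round-off

if cond_var > 0:
    p = norm.sf(c, loc=cond_mean, scale=np.sqrt(cond_var))
else:
    # Degenerate case: X_n is determined by the conditioning variables.
    p = float(cond_mean > c)
```

The pseudoinverse is equivalent to the eigendecomposition-and-truncate approach mentioned above (it discards the near-null eigendirections), whereas deleting a row and column discards a whole variable, which is a different and generally cruder approximation.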

So I am wondering whether what I did is an acceptable approximation.

Thanks