A standard result taught in mathematical statistics courses is that the multivariate Gaussian only has a density if the covariance matrix is nonsingular: i.e. if $X \sim N_p(\mu, \Sigma)$ and $\Sigma$ is nonsingular, then the (Lebesgue) density is given by $$ f(x) = (2\pi)^{-p/2} |\Sigma|^{-1/2} \exp\left(-\tfrac{1}{2} (x-\mu)^T \Sigma^{-1} (x-\mu)\right). $$ Given an i.i.d. sample from a univariate Gaussian, $X_1,\dots,X_n \sim N(\mu, 1)$, and taking $\bar{X}$ to be the sample average, it is straightforward to show that $(X_1,\dots,X_n)|\bar{X} \sim N(\bar{X} 1_n, I_n - n^{-1}1_n1_n^T)$, where $1_n$ is the $n$-dimensional vector of ones and $I_n$ is the $n\times n$ identity matrix. Since $(I_n - n^{-1}1_n1_n^T)1_n = 0$, the covariance matrix is rank deficient, and so $(X_1,\dots,X_n)|\bar{X}$ does not have a Lebesgue density.
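As a quick numerical sanity check of that rank deficiency (a sketch using NumPy; the choice $n = 5$ is arbitrary):

```python
import numpy as np

n = 5
ones = np.ones((n, 1))

# Conditional covariance of (X_1,...,X_n) given the sample mean:
# Sigma = I_n - (1/n) 1_n 1_n^T  (the centering projection matrix)
Sigma = np.eye(n) - ones @ ones.T / n

# Rank is n - 1, not n, so Sigma is singular and no Lebesgue density exists
print(np.linalg.matrix_rank(Sigma))   # n - 1

# 1_n spans the null space: Sigma @ 1_n = 0
print(Sigma @ np.ones(n))             # vector of zeros (up to rounding)
```

The null direction $1_n$ reflects the fact that, conditional on $\bar{X}$, the sample is constrained to the hyperplane $\sum_i x_i = n\bar{X}$, which has Lebesgue measure zero in $\mathbb{R}^n$.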
My question is: more generally (in non-Gaussian settings, say), how can I verify whether a Lebesgue density exists for a given random vector? Is it sufficient to compute the covariance matrix and verify that it is full rank? Or are there cases where the covariance is full rank but the Lebesgue density still does not exist?