Given a continuous multivariate pdf in analytical form (i.e. in function form), how can one sample from the corresponding distribution? In other words, what are the ways of coming up with random (or pseudo-random) realizations from the distribution with probability matching the one corresponding to the given pdf? What is the general idea/principle behind the sampling procedure?
Referring to related references would be helpful too.
This is a broad topic. Here are a few examples to initiate some thinking on your part.
(1) To generate a random sample $x$ from a univariate distribution with CDF $F$, first generate a uniformly distributed random number $u \sim U(0,1)$ and take $x = F^{-1}(u)$. Note that $x$ has the desired distribution since
$$\mathbb{P}(x \leqslant a) = \mathbb{P}(F^{-1}(u) \leqslant a) = \mathbb{P}(u \leqslant F(a)) = F(a)$$
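As a concrete sketch of (1), here is inverse-CDF sampling for the exponential distribution with rate $\lambda$, chosen for illustration because its CDF $F(x) = 1 - e^{-\lambda x}$ has the closed-form inverse $F^{-1}(u) = -\ln(1-u)/\lambda$ (the specific distribution is my example, not part of the original answer):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_exponential(lam, size):
    """Inverse-CDF sampling: x = F^{-1}(u) with u ~ U(0, 1)."""
    u = rng.uniform(size=size)       # u ~ U(0, 1)
    return -np.log(1.0 - u) / lam    # F^{-1}(u) for Exp(lam)

x = sample_exponential(lam=2.0, size=100_000)
print(x.mean())  # close to 1 / lam = 0.5
```

The same recipe works for any univariate distribution whose CDF you can invert, numerically if not analytically.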
(2) To generate a random vector with a multivariate normal distribution, i.e., $\mathbf{x} \sim N(\mathbf{\mu}, \Sigma)$, first generate a vector $\mathbf{z}$ with components that are independent and distributed $z_j \sim N(0,1)$. Find the Cholesky decomposition of the covariance matrix $\Sigma = LL^T$ and take $\mathbf{x} = \mu + L\mathbf{z}$.
This imposes the desired covariance structure since
$$\mathbb{E}((\mathbf{x} - \mathbf{\mu})(\mathbf{x} - \mathbf{\mu})^T) = \mathbb{E}(L\mathbf{z}(L\mathbf{z})^T)= \mathbb{E}(L\mathbf{z}\mathbf{z}^T L^T) = L \mathbb{E}(\mathbf{z}\mathbf{z}^T)L^T = LL^T = \Sigma$$
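In code, step (2) looks like the following sketch (the particular $\mu$ and $\Sigma$ are arbitrary illustrative values):

```python
import numpy as np

rng = np.random.default_rng(1)

mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])

L = np.linalg.cholesky(Sigma)           # Sigma = L @ L.T
z = rng.standard_normal((100_000, 2))   # rows of independent z_j ~ N(0, 1)
x = mu + z @ L.T                        # x = mu + L z, one row per draw

print(x.mean(axis=0))   # close to mu
print(np.cov(x.T))      # close to Sigma
```

Note the vectorized form: each row of `z` is transformed by $L$, since $(L\mathbf{z})^T = \mathbf{z}^T L^T$.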
(3) More generally, when no direct transformation is available, consider a Markov chain Monte Carlo method such as Gibbs sampling, which draws each component in turn from its full conditional distribution given the current values of the others.
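A minimal Gibbs sampler, using a standard bivariate normal with correlation $\rho$ as the target since both full conditionals are known in closed form, $x_1 \mid x_2 \sim N(\rho x_2,\, 1-\rho^2)$ and symmetrically for $x_2 \mid x_1$ (the target here is chosen purely for illustration, as its conditionals are easy to verify):

```python
import numpy as np

rng = np.random.default_rng(2)

rho = 0.8
n_samples, burn_in = 50_000, 1_000
x1, x2 = 0.0, 0.0                       # arbitrary starting point
samples = np.empty((n_samples, 2))

for i in range(burn_in + n_samples):
    # Draw each coordinate from its full conditional in turn.
    x1 = rho * x2 + np.sqrt(1 - rho**2) * rng.standard_normal()
    x2 = rho * x1 + np.sqrt(1 - rho**2) * rng.standard_normal()
    if i >= burn_in:
        samples[i - burn_in] = (x1, x2)

print(np.corrcoef(samples.T)[0, 1])     # close to rho
```

In practice one discards an initial burn-in (as above) and remembers that successive draws are correlated, not independent.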
Also, you could explore the notion of a copula if the marginal distributions are accessible.
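A Gaussian-copula sketch of that last idea: generate correlated normals, map them through the normal CDF to correlated uniforms (the copula), then push each uniform through the inverse CDF of its desired marginal. The marginals here, an exponential and a uniform, are my illustrative choices since both have closed-form inverse CDFs:

```python
import math
import numpy as np

rng = np.random.default_rng(3)

# Correlated standard normals via Cholesky, as in step (2).
rho = 0.7
L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))
z = rng.standard_normal((100_000, 2)) @ L.T

# Standard normal CDF, applied elementwise.
phi = np.vectorize(lambda t: 0.5 * (1.0 + math.erf(t / math.sqrt(2.0))))
u = phi(z)                          # correlated U(0, 1) pair: the copula

x1 = -np.log(1.0 - u[:, 0]) / 2.0   # Exp(rate=2) marginal via inverse CDF
x2 = 3.0 * u[:, 1]                  # U(0, 3) marginal

print(np.corrcoef(x1, x2)[0, 1])    # positive dependence carried over
```

The copula separates the dependence structure from the marginals, so the same correlated-uniform pair can feed any marginals whose inverse CDFs you can evaluate.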