Let $X_k$ ($1 \leqslant k \leqslant n$) be real valued continuous random variables with joint probability distribution function $f_{X_1,\cdots,X_n}$ given by $$f_{X_1,\cdots,X_n}(x_1, \cdots ,x_n) = g_1(x_1) \cdots g_n(x_n)$$ where $g_1, \cdots, g_n$ are some functions.
Does anyone know how to show that $g_k$ is the probability density of $X_k$ ($1 \leqslant k \leqslant n$)?
I am getting confused because the problem calls $f_{X_1,\cdots,X_n}$ the joint probability distribution function, while we usually treat $f_{X_1,\cdots,X_n}$ as the joint density function.
As usual, we can find the marginal density of $X_1$ by integrating out the remaining $n-1$ variables: \begin{align*} f_{X_1}(x_1) &= \int_{-\infty}^{\infty} \dots \int_{-\infty}^\infty g_1(x_1)\dots g_n(x_n) \: dx_2\dots dx_n \\ &= g_1(x_1) \prod_{i=2}^n\int_{-\infty}^\infty g_i(x_i) \: dx_i \end{align*} Similarly, for $k=2,\dots,n$, we get $$f_{X_k}(x_k) = g_k(x_k) \prod_{i\neq k} \int_{-\infty}^\infty g_i(x_i) \: dx_i.$$ Note that since $f_{X_1,\cdots,X_n}$ integrates to $1$ over $\mathbb{R}^n$, the full product $\prod_{i=1}^n \int_{-\infty}^\infty g_i(x_i) \: dx_i$ equals $1$, but the individual factors need not. If in addition we impose the natural condition that $\int_{-\infty}^\infty g_i(x_i) \: dx_i = 1$ for $i=1,\dots,n$, then the product over $i \neq k$ equals $1$, and it follows that $f_{X_k}=g_k$.
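For a concrete sanity check, here is a small numerical sketch (my own illustrative example, not from the question): take $n=3$ and $g_i(x) = \lambda_i e^{-\lambda_i x}$ on $[0,\infty)$, the $\mathrm{Exp}(\lambda_i)$ densities, so each factor integrates to $1$. A midpoint Riemann sum confirms the normalizations and that the marginal formula collapses to $g_1$:

```python
import math

# Illustrative choice: g_i(x) = lambda_i * exp(-lambda_i * x) for x >= 0,
# i.e. Exp(lambda_i) densities, each of which integrates to 1.
rates = [1.0, 2.0, 0.5]  # hypothetical lambda_1, lambda_2, lambda_3

def g(i, x):
    lam = rates[i]
    return lam * math.exp(-lam * x) if x >= 0 else 0.0

def integral(i, upper=60.0, n=100_000):
    # Midpoint Riemann-sum approximation of \int_0^upper g_i(x) dx;
    # the tail beyond `upper` is negligible for these rates.
    h = upper / n
    return sum(g(i, (j + 0.5) * h) for j in range(n)) * h

# Each g_i integrates to (approximately) 1 ...
norms = [integral(i) for i in range(3)]

# ... so the marginal f_{X_1}(x) = g_1(x) * prod_{i>=2} \int g_i reduces to g_1(x).
x = 0.7
marginal = g(0, x) * norms[1] * norms[2]
print(norms)                      # each entry close to 1
print(abs(marginal - g(0, x)))    # close to 0
```

If the $g_i$ were not normalized, `norms` would differ from $1$ and the computed marginal would be a rescaled version of $g_1$, which is exactly why the extra condition is needed.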