I'm new to statistics, so please don't blast me if you think my question is stupid :(
Imagine I take $n$ random variables (IID and continuous). Is there a theorem that assures that, as $n$ goes to infinity, the distribution of the values of these random variables follows exactly the probability density function?
Almost. You have to translate your statement into probabilities and limits.
For example, if your IID random variables $X_1, X_2, \dots$ share the cumulative distribution function $F(x)$, then the probability $p_{x_0,\delta}$ that any one of them takes a value in the interval $(x_0-\delta , x_0+\delta]$ around a given $x_0$ is $p_{x_0,\delta}=\Pr(x_0-\delta \lt X_n \le x_0+\delta) = F(x_0+\delta)-F(x_0-\delta)$.
Now let $I_{n}$ be the indicator random variable taking the value $1$ when $x_0-\delta \lt X_n \le x_0+\delta$ and the value $0$ otherwise. Then the laws of large numbers say that $\frac1n \sum_{j=1}^n I_j$ converges to $p_{x_0,\delta}$ as $n$ increases: in probability by the weak law, and almost surely by the strong law.
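A quick simulation sketch of this convergence (the standard normal distribution, the point $x_0=0$, the half-width $\delta=0.5$, and the sample size are my own illustrative choices, not anything forced by the argument):

```python
import math
import random

def normal_cdf(x):
    # CDF of the standard normal, written via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

random.seed(0)
x0, delta = 0.0, 0.5

# p_{x0,delta} = F(x0 + delta) - F(x0 - delta)
p_true = normal_cdf(x0 + delta) - normal_cdf(x0 - delta)

# (1/n) * sum of indicators I_j = fraction of samples in (x0 - delta, x0 + delta]
n = 200_000
hits = sum(1 for _ in range(n)
           if x0 - delta < random.gauss(0.0, 1.0) <= x0 + delta)
p_hat = hits / n

print(p_true, p_hat)
```

By the law of large numbers the printed empirical fraction should sit within a few multiples of $\sqrt{p(1-p)/n}\approx 0.001$ of the true probability.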
If the common distribution has a density function $f(x)$, then $p_{x_0,\delta}=\int_{x_0-\delta}^{x_0+\delta} f(x)\, dx$. If the density function is continuous at $x_0$, then it is the derivative of the cumulative distribution function there, and $f(x_0) =\lim_{\delta\to 0} \frac{p_{x_0,\delta}}{2\delta}$.
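Combining the two limits, you can recover the density value itself from samples: estimate $p_{x_0,\delta}$ by the empirical fraction, divide by $2\delta$, and shrink $\delta$. A sketch, again with my own illustrative choices (standard normal, $x_0=0$ where $f(x_0)=1/\sqrt{2\pi}\approx 0.3989$):

```python
import math
import random

random.seed(1)
x0 = 0.0
f_true = 1.0 / math.sqrt(2.0 * math.pi)  # standard normal density at x0 = 0

n = 1_000_000
samples = [random.gauss(0.0, 1.0) for _ in range(n)]

# For each delta, p_hat / (2*delta) estimates f(x0); smaller delta reduces
# the bias of the ratio, while large n keeps the empirical fraction accurate.
for delta in (0.5, 0.1, 0.05):
    hits = sum(1 for x in samples if x0 - delta < x <= x0 + delta)
    f_hat = hits / n / (2.0 * delta)
    print(delta, f_hat)
```

In practice $\delta$ cannot shrink too fast relative to $n$, or too few samples land in the interval; this trade-off is exactly the bandwidth choice in histogram and kernel density estimation.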