First, an explanation for those who are unfamiliar with the XdY dice format.
$$ XdY \sim \sum_{i=1}^X\text{Uniform}\{1, 2, ..., Y\} $$
It is the sum of $X$ independent discrete uniform random variables on $\{1, 2, \dots, Y\}$. Equivalently, it's the total of a $Y$-sided die rolled $X$ times.
For $X = 1$ or $2$ the distribution is fairly straightforward, but as $X$ increases, calculating the convolutions gets more and more unwieldy.
Is there a closed form for this family of distributions?
Short answer: not really.
Longer answer:
Unfortunately, there is no "simple closed form" expression for say, the probability that $XdY$ equals $Z$. Barring this, there are a couple of questions you can ask instead:
How hard is it to compute the probability distribution exactly? That is, even if there is no closed form for the probability distribution of XdY, perhaps there is a simple sum or algorithm that we can use to evaluate it.
What does the probability distribution look like? That is, even if we can't compute the probability distribution exactly (or if it is tedious to do so), can we approximate it with a simple distribution?
Luckily, there are good answers to both questions!
Computing the probability distribution exactly
One good way to do this is to use generating functions. For convenience, instead of letting each die have the numbers $1$ through $Y$ on it, let's let each die have the numbers $0$ through $Y-1$ on it (to convert this back to the original setup, it suffices to just add $X$ to your total; this will be slightly nicer to work with mathematically).
The generating function corresponding to the possible outcomes from the roll of one die is given by $F(x) = 1 + x + x^2 + \dots + x^{Y-1} = \frac{x^Y - 1}{x - 1}$. It then immediately follows from the properties of generating functions that the possible outcomes from rolling $X$ dice are described by the generating function $F(x)^X$. In particular, the number of ways to roll a total of $Z$ is given by the coefficient of $x^Z$ in $F(x)^X$; the probability you roll a $Z$ is this coefficient divided by $Y^X$.
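These coefficients are easy to compute numerically: raising $F(x)$ to the $X$th power is just $X$ repeated polynomial convolutions. A minimal Python sketch (function and variable names are my own):

```python
def dice_counts(X, Y):
    """Coefficients of F(x)^X, i.e. the number of ways to roll each
    total with X dice numbered 0..Y-1, via repeated convolution with
    F(x) = 1 + x + ... + x^(Y-1)."""
    counts = [1]  # generating function of "zero dice": the constant polynomial 1
    for _ in range(X):
        new = [0] * (len(counts) + Y - 1)
        for z, c in enumerate(counts):
            for face in range(Y):
                new[z + face] += c  # multiply by x^face and accumulate
        counts = new
    return counts

# Dice numbered 1..Y: a total of T corresponds to the shifted total T - X.
X, Y = 3, 6
counts = dice_counts(X, Y)
prob_10 = counts[10 - X] / Y**X  # Pr[3d6 = 10] = 27/216 = 0.125
```

This takes $O(X^2 Y^2)$ arithmetic operations, which is perfectly fine for tabletop-sized inputs.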
Can we get a more explicit form for this coefficient? Well, note that we can write $F(x)^X$ in the form
$$F(x)^X = (x^Y - 1)^X (x-1)^{-X}$$
Now, we can use the binomial theorem to evaluate the coefficients of $(x^Y - 1)^X$ and $(x-1)^{-X}$. In particular, we can rewrite this again as
$$F(x)^X = \left(\sum_{i=0}^{X}\binom{X}{i}x^{iY}(-1)^{X-i}\right)\left(\sum_{j=0}^{\infty}\binom{-X}{j}x^j(-1)^{X-j}\right)$$
The coefficient of $x^Z$ in this expression is equal to
$$[x^Z] F(x)^X = \sum_{i=0}^{X}\binom{X}{i}\binom{-X}{Z-iY}(-1)^{Z+i-iY}$$
and therefore the probability of obtaining a roll of $Z$ is
$$\mathrm{Pr}[XdY = Z] = \frac{1}{Y^X}\sum_{i=0}^{X}\binom{X}{i}\binom{-X}{Z-iY}(-1)^{Z+i-iY}$$
Further simplified: $$\mathrm{Pr}[XdY = Z] = \frac{1}{Y^X}\sum_{i=0}^{X}\binom{X}{i}\left(\kern-.3em\binom{X}{Z-iY}\kern-.3em\right)(-1)^{i}$$
Here $\left(\kern-.3em\binom{n}{k}\kern-.3em\right) = \binom{n+k-1}{k}$ is the multiset coefficient (taken to be $0$ when $k < 0$).
This is pretty much as simplified as you can make it: a sum involving only $X+1$ terms, each of which is straightforward to compute.
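The final formula translates directly into code. A hedged Python sketch (names are my own; the $Z \mapsto Z - X$ shift converts from dice numbered $1..Y$ back to the $0..Y-1$ convention used in the derivation):

```python
from math import comb

def multiset(n, k):
    """Multiset coefficient ((n over k)) = C(n+k-1, k), zero for k < 0."""
    return comb(n + k - 1, k) if k >= 0 else 0

def prob_exact(X, Y, Z):
    """Pr[XdY = Z] for dice numbered 1..Y, using the (X+1)-term sum."""
    z = Z - X  # shift to dice numbered 0..Y-1
    total = sum((-1)**i * comb(X, i) * multiset(X, z - i * Y)
                for i in range(X + 1))
    return total / Y**X
```

For example, `prob_exact(3, 6, 10)` recovers the familiar $\mathrm{Pr}[3d6 = 10] = 27/216$.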
Computing the probability distribution approximately
In many applications (particularly when $X$ is large), however, it might suffice to simply approximate the distribution (perhaps you simply want the probability you roll more than 4000 when rolling 1000 6-sided dice). In this case, as $X$ increases, the central limit theorem says that the distribution of $XdY$ approaches a normal distribution with the same mean and variance as $XdY$. More precisely, it says that if $\mu$ is the mean of $1dY$ (so $\mu = \frac{1+Y}{2}$) and if $\sigma^2$ is the variance of $1dY$ (so $\sigma^2 = \frac{Y^2-1}{12}$), then the random variable
$$\sqrt{X}\left(\frac{1}{X}XdY - \mu\right)$$
approaches a Gaussian random variable with mean $0$ and variance $\sigma^2$ as $X \rightarrow \infty$.