Here is a question from the book An Introduction to the Theory of Numbers by Ivan Niven.
Suppose that $\mathbb{L}$ contains $2n$ elements, and that $\mathbb{L}$ is partitioned into $n$ disjoint subsets each one containing exactly two elements of $\mathbb{L}$. Show that this can be done in precisely $$(2n-1)(2n-3)\cdots 3\cdot1 = \frac{(2n)!}{2^nn!}$$ ways.
My Question
I do not understand how he arrived at the expression $(2n-1)(2n-3)\cdots 3\cdot1$, since, to my understanding, choosing $n$ two-element subsets from a set of $2n$ elements gives $\prod_{k=0}^n {2n\choose 2}$. Furthermore, why does $$(2n-1)(2n-3)\cdots 3\cdot1 = \frac{(2n)!}{2^n n!}?$$
First you select a pair out of $2n$ (there are $\binom{2n}{2}$ possibilities).
Then you select a pair out of the remaining $2n-2$ (there are $\binom{2n-2}{2}$ possibilities).
Et cetera.
That gives $\binom{2n}2\binom{2n-2}2\cdots\binom42\binom22=\frac{(2n)!}{2!2!\cdots2!}=\frac{(2n)!}{2^n}$ possibilities.
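Written out with factorials, the product telescopes, with each $(2n-2k)!$ in a denominator cancelling against the numerator of the next factor:
$$\binom{2n}{2}\binom{2n-2}{2}\cdots\binom{2}{2}
= \frac{(2n)!}{2!\,(2n-2)!}\cdot\frac{(2n-2)!}{2!\,(2n-4)!}\cdots\frac{2!}{2!\,0!}
= \frac{(2n)!}{2^n}.$$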
But this way every partition has been counted $n!$ times, once for each of the $n!$ orders in which its $n$ pairs can be selected, so we must divide by $n!$ to correct for that.
This leads to $$\frac{(2n)!}{2^nn!}$$ partitions. As for the other form: $2^n n! = 2\cdot 4\cdot 6\cdots(2n)$ is the product of the even numbers up to $2n$, so dividing $(2n)!$ by it leaves exactly the odd factors, giving $\frac{(2n)!}{2^n n!}=(2n-1)(2n-3)\cdots 3\cdot 1$.
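As a sanity check, here is a short brute-force count (a sketch using only the standard library; the names `count_pairings` and `formula` are mine, not from the book). Pairing the first element with any of the remaining $2n-1$ elements and recursing also makes the factor pattern $(2n-1)(2n-3)\cdots 3\cdot 1$ visible:

```python
from math import factorial

def count_pairings(elements):
    """Count the partitions of `elements` into unordered two-element subsets."""
    if not elements:
        return 1  # the empty set has exactly one (empty) pairing
    first, rest = elements[0], elements[1:]
    total = 0
    # `first` can be paired with any of the remaining 2n-1 elements;
    # the rest are paired recursively, giving (2n-1)(2n-3)... overall.
    for i in range(len(rest)):
        remaining = rest[:i] + rest[i + 1:]
        total += count_pairings(remaining)
    return total

def formula(n):
    """The closed form (2n)! / (2^n n!)."""
    return factorial(2 * n) // (2 ** n * factorial(n))

# The two counts agree for small n: 1, 3, 15, 105, 945, ...
for n in range(1, 6):
    assert count_pairings(list(range(2 * n))) == formula(n)
```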