In the context of general relativity, I want to prove that $$ g^{ij} = \frac{1}{g(n-1)!}\epsilon^{i \beta_2 \ldots \beta_n} \epsilon^{j \alpha_2 \ldots \alpha_n}g_{\beta_2\alpha_2}\ldots g_{\beta_n\alpha_n} $$
where $g:=\det (g_{\mu\nu})$.
I start with \begin{equation} g = \epsilon^{\alpha_1\ldots\alpha_n}g_{1\alpha_1}\ldots g_{n\alpha_n} = \frac{1}{n!}\epsilon^{\beta_1\ldots\beta_n}\epsilon^{\alpha_1\ldots\alpha_n}g_{\beta_1\alpha_1}\ldots g_{\beta_n\alpha_n} \end{equation}
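This starting identity is easy to check numerically. Here is a minimal sketch (the helper names `eps` and `det_via_epsilon`, and the use of a random symmetric matrix as a stand-in metric, are my own choices, not part of the problem):

```python
import itertools
from math import factorial

import numpy as np

def eps(perm):
    """Levi-Civita symbol on distinct indices: the sign of the permutation."""
    sign = 1
    for i in range(len(perm)):
        for j in range(i + 1, len(perm)):
            if perm[i] > perm[j]:
                sign = -sign
    return sign

def det_via_epsilon(g):
    """det(g) = (1/n!) eps^{b1..bn} eps^{a1..an} g_{b1 a1} ... g_{bn an}."""
    n = g.shape[0]
    total = 0.0
    # Only permutations contribute: eps vanishes on repeated indices.
    for beta in itertools.permutations(range(n)):
        for alpha in itertools.permutations(range(n)):
            term = eps(beta) * eps(alpha)
            for b, a in zip(beta, alpha):
                term *= g[b, a]
            total += term
    return total / factorial(n)
```

Comparing against `np.linalg.det` for a random symmetric matrix confirms the factor of $1/n!$.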
So
\begin{align*} \partial_{\mu}g &= \frac{1}{n!}\epsilon^{\beta_1\ldots\beta_n}\epsilon^{\alpha_1\ldots\alpha_n} \left[ (\partial_\mu g_{\beta_1\alpha_1})\ldots g_{\beta_n\alpha_n} + \ldots + g_{\beta_1\alpha_1}\ldots (\partial_\mu g_{\beta_n\alpha_n}) \right] \\ &= \frac{1}{(n-1)!}\epsilon^{\beta_1\ldots\beta_n}\epsilon^{\alpha_1\ldots\alpha_n} (\partial_\mu g_{\beta_1\alpha_1})g_{\beta_2\alpha_2}\ldots g_{\beta_n\alpha_n} \end{align*}
where the second line follows because relabeling the dummy indices makes all $n$ terms equal.
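This derivative identity can also be checked numerically with central differences. A sketch, using a made-up $x$-dependent symmetric matrix in place of a metric (the function names and the sample `metric` are hypothetical):

```python
import itertools
from math import factorial

import numpy as np

def eps(perm):
    """Levi-Civita symbol on distinct indices: the sign of the permutation."""
    sign = 1
    for i in range(len(perm)):
        for j in range(i + 1, len(perm)):
            if perm[i] > perm[j]:
                sign = -sign
    return sign

def metric(x):
    """A hypothetical symmetric, x-dependent 3x3 matrix standing in for g_{ab}(x)."""
    return np.array([[1.0 + x**2, x, 0.0],
                     [x, 2.0, np.sin(x)],
                     [0.0, np.sin(x), 1.0 + np.cos(x)**2]])

def dgdx_via_epsilon(x, h=1e-6):
    """RHS: (1/(n-1)!) eps eps (d g_{b1 a1}) g_{b2 a2} ... g_{bn an}."""
    g = metric(x)
    dg = (metric(x + h) - metric(x - h)) / (2 * h)  # central difference
    n = g.shape[0]
    total = 0.0
    for beta in itertools.permutations(range(n)):
        for alpha in itertools.permutations(range(n)):
            term = eps(beta) * eps(alpha) * dg[beta[0], alpha[0]]
            for b, a in zip(beta[1:], alpha[1:]):
                term *= g[b, a]
            total += term
    return total / factorial(n - 1)
```

Differentiating `np.linalg.det(metric(x))` numerically gives the same value, as the identity predicts.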
Now
\begin{align*} g &= \epsilon^{\alpha_1\ldots\alpha_n}g_{1\alpha_1}\ldots g_{n\alpha_n} \\ &= g_{1\alpha_1} \epsilon^{\alpha_1\ldots\alpha_n}g_{2\alpha_2}\ldots g_{n\alpha_n} \\ &= \frac{1}{(n-1)!}g_{1\alpha_1} \epsilon^{\alpha_1\ldots\alpha_n}\epsilon^{\beta_2\ldots\beta_n}g_{\beta_2\alpha_2}\ldots g_{\beta_n\alpha_n} \\ \Rightarrow 1 &= \frac{1}{g(n-1)!}g_{1\alpha_1} \epsilon^{\alpha_1\ldots\alpha_n}\epsilon^{\beta_2\ldots\beta_n}g_{\beta_2\alpha_2}\ldots g_{\beta_n\alpha_n} \end{align*}
Now I would like to somehow act with the inverse metric on both sides to get the result, but it doesn't really work: the RHS carries the fixed index $1$ on $g$, not an arbitrary index $i$. Also, the second Levi-Civita symbol only has $n-1$ indices, not $n$, as it was supposed to.
There's a much easier way to prove this. Note that for any $n \times n$ matrix $A$ we have the identity
$$ \det(A) = \frac{1}{n!} \epsilon^{\alpha_1, \alpha_2,\ldots, \alpha_n} \epsilon^{\beta_1, \beta_2, \ldots, \beta_n} \prod_{i=1}^n A_{\alpha_i \beta_i} $$ where we sum over repeated indices. Now suppose $A$ is invertible and write the components of $A^{-1}$ as $A^{ij}$, so that $A^{ij} A_{jk}= \delta^{i}_{k} = A_{kj} A^{ji}$. Define the matrix $B$ with components $$ B^{ij} = \frac{1}{(n-1)!} \epsilon^{i, \alpha_2,\ldots, \alpha_n} \epsilon^{j, \beta_2, \ldots, \beta_n} \prod_{k=2}^n A_{\beta_k \alpha_k} $$ The components of $B$ are precisely those of the adjugate of $A$, so $\text{adj}(A)=B$: renaming the dummy index $j \to \beta_1$ and using the standard identities $\epsilon^{\beta_1 \ldots \beta_n} A_{\beta_1 \alpha_1} \ldots A_{\beta_n \alpha_n} = \det(A)\, \epsilon_{\alpha_1 \ldots \alpha_n}$ and $\epsilon^{i \alpha_2 \ldots \alpha_n} \epsilon_{l \alpha_2 \ldots \alpha_n} = (n-1)!\, \delta^i_l$ gives $B^{ij} A_{jl} = \det(A)\, \delta^i_l$. Because $\text{adj}(A) \cdot A = A \cdot \text{adj}(A) = \det(A) I$, we find that
$$ A^{ij} = \frac{1}{\det(A)(n-1)!} \epsilon^{i, \alpha_2,\ldots, \alpha_n} \epsilon^{j, \beta_2, \ldots, \beta_n} \prod_{k=2}^n A_{\beta_k \alpha_k}. $$
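This final formula is also easy to verify numerically. A minimal sketch (the helper names `eps` and `inverse_via_epsilon` and the random test matrix are my own, not part of the answer):

```python
import itertools
from math import factorial

import numpy as np

def eps(perm):
    """Levi-Civita symbol on distinct indices: the sign of the permutation."""
    sign = 1
    for i in range(len(perm)):
        for j in range(i + 1, len(perm)):
            if perm[i] > perm[j]:
                sign = -sign
    return sign

def inverse_via_epsilon(A):
    """A^{ij} = eps^{i a2..an} eps^{j b2..bn} A_{b2 a2}...A_{bn an} / (det(A) (n-1)!)."""
    n = A.shape[0]
    inv = np.zeros((n, n))
    # Sum over full permutations; the first entries play the roles of i and j.
    for pa in itertools.permutations(range(n)):
        for pb in itertools.permutations(range(n)):
            term = eps(pa) * eps(pb)
            for a, b in zip(pa[1:], pb[1:]):
                term *= A[b, a]
            inv[pa[0], pb[0]] += term
    return inv / (np.linalg.det(A) * factorial(n - 1))
```

For a generic (invertible) random matrix this agrees with `np.linalg.inv`, and for $n=2$ it reproduces the familiar $\frac{1}{\det A}\begin{pmatrix} d & -b \\ -c & a \end{pmatrix}$ by hand.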