The existence of such a function $g$ is guaranteed by the well-known
Theorem (implicit function). Let $\Omega$ be an open subset of $\mathbb R^{n+1}$, and let $f : \Omega \to \mathbb R$ be of class $\mathcal C^2$. Denote the $n$-dimensional variable by $x$ and the remaining variable by $y$. Let $(x_0, y_0)$ be a point in $\Omega$ such that
1. $f$ vanishes at $(x_0,y_0)$;
2. $\partial_y f(x_0,y_0) \neq 0$.
Then there exist two open sets $\Omega' \ni x_0$ and $\Omega'' \ni y_0$ such that
1. $\Omega' \times \Omega'' \subseteq \Omega$;
2. for all $x \in \Omega'$ there exists one and only one $y \in \Omega''$ such that $f(x,y) = 0$;
3. the function $g : \Omega' \to \Omega''$ that assigns to every $x$ this unique $y$ is also of class $\mathcal C^2$.
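To make the statement concrete, here is a quick symbolic check on a standard example; the choice $f(x,y) = x^2 + y^2 - 1$ and the code are illustrative, not part of the theorem:

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)
f = x**2 + y**2 - 1   # the unit circle, with (x0, y0) = (0, 1)

# Both hypotheses hold at (0, 1):
assert f.subs({x: 0, y: 1}) == 0              # f vanishes there
assert sp.diff(f, y).subs({x: 0, y: 1}) == 2  # d_y f = 2y is nonzero there

# Near (0, 1) the zero set of f is the graph of g(x) = sqrt(1 - x^2):
g = sp.sqrt(1 - x**2)
assert sp.simplify(f.subs(y, g)) == 0
```

Note that the hypothesis $\partial_y f(x_0,y_0) \neq 0$ is what fails at $(\pm 1, 0)$, where the circle cannot be written as a graph $y = g(x)$.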
First derivatives. Given the regularity that $g$ is proved to have, it is natural to ask how the first partials of $g$ can be expressed in terms of the partials of $f$. That can be done by noticing that, for all $x \in \Omega'$, by construction, $$f(x,g(x)) = 0 \tag{1}$$ Thus, if we define the function $\mathbf s : \Omega' \to \Omega$ such that $\mathbf s(x) = (x,g(x))$, we can use the chain rule to obtain $$D(f \circ \mathbf s)(x) = Df(\mathbf s(x)) D\mathbf s(x) = \mathbf 0 \tag{2}$$ where $D\phi$ denotes the Jacobian matrix of a function $\phi$ and $\mathbf 0$ is the $1 \times n$ zero matrix. Let us call $D_x f$ the restriction of the Jacobian of $f$ (which is a row vector) to the first $n$ variables. It is also easy to see that $$D\mathbf s(x) = \begin{bmatrix} & & \\ & \mathrm{Id}_n & \\ & & \\ \hline \partial_{1}g(x) & \cdots & \partial_{n} g(x) \end{bmatrix} $$ (notice we abbreviate the first $n$ partial derivative symbols to $\partial_1, \dots, \partial_n$), so that equation $(2)$ becomes $$D_x f(\mathbf s(x)) + \partial_y f(\mathbf s(x))Dg(x) = \mathbf 0 $$ We can solve for $Dg$ to find $$Dg(x) = - \frac{D_x f(\mathbf s(x))}{\partial_y f(\mathbf s(x))} = - \frac{D_x f(x,g(x))}{\partial_y f(x,g(x))} \tag{3a}$$ which makes sense thanks to the second hypothesis of the theorem and the continuity of $\partial_y f$, which keeps the denominator nonzero near $(x_0,y_0)$. In terms of single partial derivatives, $$\partial_i g (x) = - \frac{\partial_i f(x,g(x))}{\partial_y f(x,g(x))} \tag{3b}$$
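Formula (3b) can be sanity-checked symbolically on an example where $g$ is known in closed form; the unit sphere below is my illustration, not part of the derivation:

```python
import sympy as sp

x1, x2, y = sp.symbols('x1 x2 y', real=True)
f = x1**2 + x2**2 + y**2 - 1    # the unit sphere in R^3, so n = 2
g = sp.sqrt(1 - x1**2 - x2**2)  # explicit g on the upper hemisphere

# Formula (3b): d_i g(x) = - d_i f(x, g(x)) / d_y f(x, g(x))
for xi in (x1, x2):
    rhs = (-sp.diff(f, xi) / sp.diff(f, y)).subs(y, g)
    assert sp.simplify(sp.diff(g, xi) - rhs) == 0
```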
Second derivatives. The truly hard problem, though, is obtaining an expression for the second partial derivatives of $g$ (both mixed and pure). I've defined the function $\psi : \Omega' \times \Omega'' \to \mathbb R$ such that $$\psi(x,y) \doteq -\frac{1}{\partial_y f(x,y)} $$ so that $$D\psi = \frac{D\left(\partial_y f\right)}{\left(\partial_y f\right)^2} = \frac{1}{\left(\partial_y f\right)^2} \begin{bmatrix} \partial_1 \partial_y f & \cdots & \partial_n \partial_y f & \partial_y^2 f\end{bmatrix}$$ (note the plus sign: differentiating $-1/u$ yields $+Du/u^2$).
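The sign of $D\psi$ is easy to get wrong, so here is a quick symbolic check; `fy` is a placeholder I introduce for $\partial_y f$:

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)
fy = sp.Function('fy')(x, y)  # placeholder for the partial derivative d_y f
psi = -1 / fy                 # the definition of psi

# Differentiating -1/u gives +Du/u^2, so D(psi) carries a PLUS sign:
assert sp.simplify(sp.diff(psi, x) - sp.diff(fy, x) / fy**2) == 0
assert sp.simplify(sp.diff(psi, y) - sp.diff(fy, y) / fy**2) == 0
```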
This way, I can rewrite $Dg$ as $$Dg : \Omega' \to \mathbb R^{1 \times n} \qquad Dg = (\psi D_x f) \circ \mathbf s $$ so that, by the chain rule and the product rule, $$D^2 g(x) = D(\psi D_x f)(\mathbf s(x)) D\mathbf s(x) = [\psi D(D_x f) + (D_x f)^T D\psi ](\mathbf s(x))D\mathbf s(x) $$ (the transpose makes the outer product $(D_x f)^T D\psi$ an $n \times (n+1)$ matrix, matching the dimensions of $D(D_x f)$).
This expression "simplifies" to $$\begin{split} D^2g(x) &= \left(- \frac{1}{(\partial_y f)^2}\right)\Bigg(\partial_y f \begin{bmatrix} \partial_1^2f & \cdots & \partial_1\partial_n f \\ \vdots & \ddots & \vdots \\ \partial_n \partial_1 f & \cdots & \partial_n^2 f\end{bmatrix} + \partial_y f \begin{bmatrix} \partial_1 g \partial_1 \partial_y f & \cdots & \partial_n g \partial_1 \partial_y f \\ \vdots & \ddots &\vdots \\ \partial_1 g \partial_n \partial_y f & \cdots & \partial_n g \partial_n \partial_y f \end{bmatrix} \\ &\quad- \begin{bmatrix} \partial_1 f \partial_1 \partial_y f & \cdots & \partial_1 f \partial_n \partial_y f \\ \vdots & \ddots &\vdots \\ \partial_n f \partial_1 \partial_y f & \cdots & \partial_n f \partial_n \partial_y f \end{bmatrix} - \partial_y^2 f \begin{bmatrix} \partial_1 g\partial_1 f & \cdots & \partial_n g\partial_1 f \\ \vdots & \ddots & \vdots \\ \partial_1 g\partial_n f & \cdots & \partial_n g \partial_n f \end{bmatrix} \Bigg) \end{split}$$ where the whole right-hand side is evaluated at $\mathbf s(x)$.
Is this expression correct? If so, can it be further simplified?
It looks fine to me. Here is a slightly simpler way to do it. Starting from $f(x,g(x))=0$, differentiate with respect to $x_i$ as you did and get $$\partial_i f(x,g(x))+\partial_y f(x,g(x)) \partial_i g(x)=0.$$ Now differentiate with respect to $x_j$ to get $$\partial_j\partial_i f(x,g(x))+\partial_y\partial_i f(x,g(x))\partial_jg(x)+\partial_j\partial_y f(x,g(x)) \partial_i g(x)+\partial_y^2 f(x,g(x)) \partial_i g(x)\partial_j g(x)+\partial_y f(x,g(x)) \partial_j\partial_i g(x)=0.$$ Hence,$$-\frac1{\partial_y f(x,g(x))}\left[\partial_j\partial_i f(x,g(x))+\partial_y\partial_i f(x,g(x))\partial_jg(x)+\partial_j\partial_y f(x,g(x)) \partial_i g(x)+\partial_y^2 f(x,g(x)) \partial_i g(x)\partial_j g(x)\right]= \partial_j\partial_i g(x).$$ It's simpler to use the product rule than the quotient rule.
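This entrywise formula can be verified symbolically on an example with an explicit $g$; the unit sphere below is an illustration of mine, not from the derivation above:

```python
import sympy as sp

x1, x2, y = sp.symbols('x1 x2 y', real=True)
f = x1**2 + x2**2 + y**2 - 1    # the unit sphere, n = 2
g = sp.sqrt(1 - x1**2 - x2**2)  # explicit g on the upper hemisphere

def second_partial(xi, xj):
    """Right-hand side of the formula for d_j d_i g, evaluated on the graph."""
    gi, gj = sp.diff(g, xi), sp.diff(g, xj)
    bracket = (sp.diff(f, xj, xi) + sp.diff(f, y, xi) * gj
               + sp.diff(f, xj, y) * gi + sp.diff(f, y, 2) * gi * gj)
    return (-bracket / sp.diff(f, y)).subs(y, g)

# Compare against differentiating g directly, for every pair (i, j):
for xi in (x1, x2):
    for xj in (x1, x2):
        assert sp.simplify(sp.diff(g, xj, xi) - second_partial(xi, xj)) == 0
```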