Given a convex $L$-smooth function $g: \mathbb{R}^d \rightarrow \mathbb{R}$ and a differentiable convex function $h: \mathbb{R}^d \rightarrow \mathbb{R}$,
I want to find the optimality condition of the following update $$x_k = \arg \min_{x \in \mathbb{R}^d} \frac{L}{2} \|x - x_{k-1}\|^2 + \nabla g(x_{k-1})^T(x - x_{k-1}) + h(x)$$
I've differentiated the objective, set the gradient to zero, and got the following condition $$x_k = x_{k-1} - \frac{1}{L}[\nabla g(x_{k-1}) + \nabla h(x_k)]$$
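To convince myself the condition is right, I checked it numerically on a concrete instance where the subproblem has a closed-form minimizer (my own choice of $g$ and $h$, not part of the original question): $g(x) = \frac{1}{2}\|Ax - b\|^2$ and $h(x) = \frac{\mu}{2}\|x\|^2$.

```python
import numpy as np

# Hypothetical concrete instances (my choice, for checking only):
#   g(x) = 1/2 ||A x - b||^2   (convex, L-smooth with L = lambda_max(A^T A))
#   h(x) = mu/2 ||x||^2        (differentiable, convex)
rng = np.random.default_rng(0)
d = 5
A = rng.standard_normal((8, d))
b = rng.standard_normal(8)
mu = 0.7
L = np.linalg.eigvalsh(A.T @ A).max()

grad_g = lambda x: A.T @ (A @ x - b)
grad_h = lambda x: mu * x

# For this quadratic h the subproblem is solvable in closed form: setting
# L(x - x_prev) + grad_g(x_prev) + mu * x = 0 gives
#   x_k = (L x_prev - grad_g(x_prev)) / (L + mu)
x_prev = rng.standard_normal(d)
x_k = (L * x_prev - grad_g(x_prev)) / (L + mu)

# Verify the (implicit) optimality condition above:
#   x_k = x_prev - (1/L) [grad_g(x_prev) + grad_h(x_k)]
rhs = x_prev - (grad_g(x_prev) + grad_h(x_k)) / L
print(np.allclose(x_k, rhs))  # True
```

Note that even in this check, $\nabla h$ is evaluated at $x_k$ itself, which is exactly the implicitness my question is about.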
My question: is it possible to have $x_k$ on one side and $\nabla h(x_k)$ on the other? In other words, is there a way to get an explicit expression for $x_k$?