Let $f:\mathbb{R}^n\rightarrow\mathbb{R}$ and $g:\mathbb{R}^n\times\mathbb{R}^n\rightarrow\mathbb{R}$ be continuously differentiable functions, where $n\in\mathbb{N}_+$. Suppose further that there is some compact set $A\subseteq \mathbb{R}^n$ such that on $\mathbb{R}^n \setminus A$, $f$ is quasi-concave and $x\mapsto g(x,y)$ is quasi-concave for all $y$.
Denote $f_1(x)=\frac{\partial f(x)}{\partial x}$ and $g_1(x,y)=\frac{\partial g(x,y)}{\partial x}$.
Suppose that $x^*$ is the unique solution to the equation:
$$f_1(x^*)=g_1(x^*,x^*).$$
My question is the following: Is there a way of representing $x^*$ as the solution of a fixed-point and/or optimization problem, so as to avoid the evaluation of derivatives? (In my context, the derivatives could only be evaluated numerically.)
I imagine the answer would be something like:
$$x^*=\mathrm{\text{arg max-or-min}}_x {\left( f(x) - g(x, x^*) \right)}$$
which is an optimization problem wrapped in a fixed-point problem. Whether the operator is $\max$ or $\min$, the first-order conditions imply $f_1(x^*)=g_1(x^*,x^*)$ as required, but of course the second-order conditions impose undue restrictions in either case, so this does not quite work.
Are there restrictions on $f$ and $g$ that would make something like this work? Assuming that $f$ and $x\mapsto g(x,y)$ are globally strictly concave (for all $y$) would be OK in context, for example.
Edit: if it helps, you may also assume that $f(x^*)=g(x^*,x^*)$, producing an over-determined system.
Edit: Beginnings of an answer:
Since $x^*$ is the unique solution to the equation above, and $f$ and $g$ are differentiable, the function: $$x\mapsto f(x)-g(x,x^*)$$ has either a unique global maximum or a unique global minimum, and has no other stationary points. Thus, it is certainly the case that: $$x^*=\begin{cases} \mathrm{\text{arg max}}_x {\left( f(x) - g(x, x^*) \right)}, & \text{if }\mathrm{\text{max}}_x {\left( f(x) - g(x, x^*) \right)}\text{ exists} \\ \mathrm{\text{arg min}}_x {\left( f(x) - g(x, x^*) \right)}, & \text{if }\mathrm{\text{min}}_x {\left( f(x) - g(x, x^*) \right)}\text{ exists} \end{cases}.$$ Given that detecting unboundedness is usually computationally easy, the fact that this requires two optimizations is unproblematic.
Now, by the continuous differentiability of $g$, there exists (??) some open set $U\subseteq\mathbb{R}^n$ with $x^*\in U$ such that for all $y\in U$, the function: $$x\mapsto f(x)-g(x,y)$$ has either a unique global maximum or a unique global minimum, and has no other stationary points. Thus, we may define the function $h:U\rightarrow\mathbb{R}^n$ by: $$h(y)=\begin{cases} \mathrm{\text{arg max}}_x {\left( f(x) - g(x, y) \right)}, & \text{if }\mathrm{\text{max}}_x {\left( f(x) - g(x, y) \right)}\text{ exists} \\ \mathrm{\text{arg min}}_x {\left( f(x) - g(x, y) \right)}, & \text{if }\mathrm{\text{min}}_x {\left( f(x) - g(x, y) \right)}\text{ exists} \end{cases}.$$ $x^*$ is then the unique fixed point of $h$. Under further conditions on $f$ and $g$ (?? derived from the Banach fixed point theorem ??), $x^*$ may then be found by fixed point iteration.
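As a concrete illustration of this fixed-point construction, here is a minimal Python sketch. The functions $f$ and $g$ below are hypothetical examples (not from the question), chosen strictly concave and strictly convex in $x$ respectively, so that the maximum branch of $h$ always applies; the derivative-free Nelder–Mead method is used so that $f_1$ and $g_1$ are never evaluated:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical example functions (assumptions, not from the question):
# f is strictly concave and x -> g(x, y) is strictly convex, so
# x -> f(x) - g(x, y) has a unique global maximum for every y.
def f(x):
    return -np.sum(x**2)

def g(x, y):
    return 0.5 * np.sum((x - y)**2)

def h(y):
    """Maximize x -> f(x) - g(x, y) with a derivative-free method."""
    res = minimize(lambda x: -(f(x) - g(x, y)), y,
                   method="Nelder-Mead",
                   options={"xatol": 1e-10, "fatol": 1e-10})
    return res.x

# Fixed-point iteration y_{k+1} = h(y_k); for these example functions
# h(y) = y / 3, a contraction, so the iterates converge to x* = 0.
y = np.array([1.0, -2.0])
for _ in range(50):
    y_next = h(y)
    if np.linalg.norm(y_next - y) < 1e-8:
        y = y_next
        break
    y = y_next
```

Whether plain iteration converges for your actual $f$ and $g$ depends on $h$ being a contraction near $x^*$, which the sketch simply assumes.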
If $f$ is strictly concave and $g$ is strictly convex in $x$ (for all $y$), then the difference $$f(x)-g(x,y)$$ is strictly concave in $x$ (since $-g$ is concave if $g$ is convex, and the sum of concave functions is concave). Thus, the $x^*$ fulfilling $$f_1(x^*)=g_1(x^*,y)$$ is the solution to the maximization problem $$\max_x f(x)-g(x,y)$$ for a given $y$, since the maximization problem yields the first-order condition $0=f_1(x^*)-g_1(x^*,y)$, which is similar but not equivalent to your condition.
Similarly, you can flip the concavity/convexity of $f$ and $g$: if $f$ is strictly convex and $g$ is strictly concave in $x$ (for given $y$), then the objective of the maximization problem $$\max_x -f(x)+g(x,y)$$ is again strictly concave, so the first-order condition $f_1(x^*)=g_1(x^*,y)$ is necessary and sufficient for the maximum.
Finally, you can phrase both of these as minimization problems, just flip the signs in front of the $f$ and $g$ functions.
EDIT: In order to match your condition exactly, so that $y=x^*$, you indeed need to look at the maximization problem $$\max_x f(x)-g(x,x^*)$$ with $f$ strictly concave and $g$ strictly convex; however, this might not be attractive, since you need to fix $y=x^*$ before you have computed $x^*$.
In response to your first comment: if both $f$ and $g$ are concave in $x$, then you cannot obtain your condition from an optimization problem without further assumptions, because you need opposite signs in front of $f$ and $g$. If both functions are concave, flipping the sign of one makes it convex, and the sum of a convex function and a concave function is in general neither concave nor convex.
One additional assumption, informally, would be that either $$f(x)-g(x,y)$$ is concave in $x$ for all $y$, which is not implied by $f$ and $g$ both being concave, or that $$-f(x)+g(x,y)$$ is concave in $x$ for all $y$, so that the first-order condition is necessary and sufficient for the maximum. Thus, one of the two summands may be convex as long as the other is "sufficiently concave" for the sum to remain concave. If the functions are twice differentiable, this boils down to assuming $$f_{xx}(x)-g_{xx}(x,y)<0\text{ or }-f_{xx}(x)+g_{xx}(x,y)<0$$ (negative definiteness of the corresponding Hessians when $n>1$) for all $x$ and $y$. Then use the above formulation with $y$ fixed at $x^*$.
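Since the question states that derivatives are only available numerically, one might check such a second-order condition on a grid before choosing which problem to maximize. Here is a small Python sketch for hypothetical one-dimensional $f$ and $g$ (assumptions for illustration only), using central finite differences for the second derivatives:

```python
import numpy as np

# Hypothetical 1-D functions (assumptions, not from the post):
# here f''(x) = -2 and g_xx(x, y) = -y * sin(x), so
# f'' - g_xx = -2 + y * sin(x) < 0 whenever |y| < 2.
def f(x):
    return -x**2

def g(x, y):
    return y * np.sin(x)

def second_diff(func, x, h=1e-4):
    """Central finite-difference estimate of the second derivative."""
    return (func(x + h) - 2.0 * func(x) + func(x - h)) / h**2

# Check f_xx(x) - g_xx(x, y) < 0 on a grid of (x, y) pairs; if it
# holds, maximize x -> f(x) - g(x, y), while if the flipped
# inequality holds instead, maximize x -> -f(x) + g(x, y).
concave_ok = all(
    second_diff(f, x) - second_diff(lambda t: g(t, y), x) < 0.0
    for x in np.linspace(-3.0, 3.0, 25)
    for y in np.linspace(-1.5, 1.5, 25)
)
```

A grid check of this kind is of course only a heuristic, not a proof of global concavity.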