I am aware that if I'm trying to solve for $x$ in the problem
$y \in [\lambda I + N_X](x)$
where $y$ is a known vector, and $N_X$ is the normal cone given by
$N_X(x) = \{u : \langle u, x - z\rangle\geq0,~\forall z\in X\}$
then the solution is given in terms of the projection of $y$ onto $X$. Also, if $A$ is an invertible matrix, there's a trick that makes it possible to compute the solution of $y \in [\lambda A + N_X](x)$ as well. But what if you replace the linear operator with something like the gradient of a convex differentiable function? Namely, consider the problem of finding the solution of
$y \in [\lambda \nabla f + N_X](x)$
Is there a way to solve this for $x$, giving the solution in terms of the projection of a vector onto the set $X$?
Suppose that $X$ is convex. Then the inclusion is the first-order necessary optimality condition of
$$ \min_{x\in X} \lambda f(x) - x^Ty. $$
Indeed, if $x^*$ is a solution, then it satisfies
$$ (\lambda \nabla f (x^*) - y)^T(x-x^*)\ge0\quad \forall x\in X, $$
which is equivalent to
$$ y-\lambda \nabla f (x^*) \in N_X(x^*), $$
which is the inclusion you are after. Since $f$ is convex, this condition is also sufficient, so solving the inclusion is the same as solving the minimization problem.
The case of the projection of $y$ onto $X$ is the special case $f(x)=\frac12\|x\|^2$.
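To illustrate, here is a minimal numerical sketch (the ball constraint, the step size, and the function names are my own choices, not part of the question): projected gradient descent applied to $\min_{x\in X}\lambda f(x) - x^Ty$ with $X$ the unit ball, checked against the special case $f(x)=\frac12\|x\|^2$, where the solution should be the projection of $y/\lambda$ onto $X$.

```python
import numpy as np

def proj_ball(x, radius=1.0):
    # Euclidean projection onto X = {x : ||x|| <= radius}
    n = np.linalg.norm(x)
    return x if n <= radius else radius * x / n

def solve_inclusion(grad_f, y, lam, proj, x0, step=0.1, iters=2000):
    # Projected gradient descent on min_{x in X} lam*f(x) - x^T y,
    # whose optimality condition is exactly y in [lam*grad_f + N_X](x).
    x = x0
    for _ in range(iters):
        x = proj(x - step * (lam * grad_f(x) - y))
    return x

# Special case f(x) = ||x||^2 / 2, so grad_f(x) = x; the fixed point
# should agree with the projection of y/lam onto the ball.
y = np.array([3.0, -1.0, 2.0])
lam = 2.0
x_star = solve_inclusion(lambda x: x, y, lam, proj_ball, np.zeros(3))
print(np.allclose(x_star, proj_ball(y / lam), atol=1e-6))  # True
```

For a general convex $f$ there is no closed-form answer in terms of a single projection, but the same iteration converges whenever $\nabla f$ is Lipschitz and the step size is small enough.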