Let $f(x,y)$ be a continuously differentiable function from $\mathbb{R}^2$ to $\mathbb{R}$. Suppose that for every $y$ the function $g_y(x)=f(x,y)$ is strictly convex. Define $$ h(y) = \arg\min_{x\leq b} f(x,y) $$ where $b\in \mathbb{R}$ is a parameter that constrains the set of possible $x$.
My question is: Can we say anything about the derivative of the argmin $h'(y)$?
I know from this question that $h(y)$ is continuous. In addition, the answer to this question shows that when the problem is unconstrained ($b=\infty$) we can differentiate the optimality condition $$ \frac{ \partial f(h(y),y)}{\partial x} = 0 $$ with respect to $y$ to obtain an expression for $h'(y)$. But when the problem is constrained ($b<\infty$) that condition becomes $$ \frac{ \partial f(h(y),y)}{\partial x} = \lambda(y) $$ where $\lambda(y)$ is the Lagrange multiplier on the constraint $x\leq b$, and it's hard to say anything about $h'(y)$ without knowing $\lambda'(y)$.
The specific case of $x$ and $y$ both being one-dimensional real variables is simpler than the multidimensional picture. In particular, there is no need for Lagrange multipliers; it is often easiest to reason without them. (They are formally valid in 1D but add nothing. If you want more details on why this is so, please ask a separate question unless one already exists.)
Let $h_b$ be the function you defined, now making the parameter $b$ explicit.
The argument I gave in response to an earlier question shows that if $f$ is twice continuously differentiable, the minimum is unique, and $\partial_1^2f(h_\infty(y),y)>0$ for all $y$, then $h_\infty$ is continuously differentiable with $$ h_\infty'(y) = -\frac{\partial_2\partial_1f(h_\infty(y),y)}{\partial_1^2f(h_\infty(y),y)}. $$ This was for the unconstrained problem.
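As a quick sanity check (my own sketch, not part of the argument), here is a Python snippet using a hypothetical test function $f(x,y)=e^x-yx$ for $y>0$, which is strictly convex in $x$ with $h_\infty(y)=\log y$; the closed form above should agree with a finite-difference approximation of $h_\infty'$.

```python
import math

# Test function f(x, y) = exp(x) - y*x, strictly convex in x for y > 0,
# with unconstrained minimizer h(y) = log(y) and hence h'(y) = 1/y.
def f11(x, y):   # second partial: d^2 f / dx^2
    return math.exp(x)

def f21(x, y):   # mixed partial: d^2 f / (dy dx)
    return -1.0

def h(y):        # unconstrained argmin, known in closed form here
    return math.log(y)

y = 3.0
closed_form = -f21(h(y), y) / f11(h(y), y)        # formula from above
eps = 1e-6
finite_diff = (h(y + eps) - h(y - eps)) / (2 * eps)
print(closed_form, finite_diff)  # both should be close to 1/3
```

Both quantities come out near $1/3 = 1/y$, matching the formula.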
If $h_\infty(y)<b$, then $h_b(y)=h_\infty(y)$. By continuity, if this holds at a point $y$, it holds in a neighborhood of $y$, and there $h_b'(y)=h_\infty'(y)$. When the minimizer lies in the interior of the constraint set, everything behaves as if there were no constraint. This is a typical feature.
However, bear in mind that the formula for the unconstrained case requires $f$ to be twice continuously differentiable. With only one derivative I foresee trouble, and things can go awry.
If you are at the boundary (again assuming a global minimum for all $y$), then there are two possible cases:
1. $\partial_1f(h_b(y),y)<0$
2. $\partial_1f(h_b(y),y)=0$
The case with $>$ is impossible, as then the endpoint of the interval could not be a minimum. In case 1, by continuity $\partial_1f$ remains negative in a small neighborhood, so the minimizer stays at the boundary when you vary $y$ a little. Thus $h_b(y)=b$ for $y$ in some open interval, and so $h_b'=0$ at the point.
In case 2 differentiability can fail. Take, for example, $f(x,y)=(x-y)^2$, for which $$ h_b(y) = \begin{cases} y, & y \leq b\\ b, & y \geq b. \end{cases} $$ This function is continuous but not differentiable at the point $y=b$, where you have the transition between interior minimizers and stable boundary minimizers (case 1).
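To see the kink numerically, here is a small sketch (mine, with $b=1$ chosen arbitrarily) that computes $h_b$ for $f(x,y)=(x-y)^2$ by brute-force grid search over $x\leq b$ and compares one-sided difference quotients at $y=b$:

```python
# f(x, y) = (x - y)^2 minimized over x <= b has h_b(y) = min(y, b).
# A crude grid search over [-5, b] recovers this and shows the kink at y = b.
b = 1.0

def h_b(y, n=200001, lo=-5.0):
    xs = [lo + i * (b - lo) / (n - 1) for i in range(n)]
    return min(xs, key=lambda x: (x - y) ** 2)

eps = 1e-3
left_slope = (h_b(b) - h_b(b - eps)) / eps    # interior regime: slope near 1
right_slope = (h_b(b + eps) - h_b(b)) / eps   # boundary regime: slope near 0
print(left_slope, right_slope)
```

The left and right slopes disagree ($\approx 1$ versus $\approx 0$, up to grid resolution), confirming that $h_b$ is continuous but not differentiable at $y=b$.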