Let $R(s)$ be the inverse of a differentiable CDF $F:[0,1] \rightarrow [0,1]$. Then $R(\cdot)$ is an increasing function, $R:[0,1]\rightarrow [0,1].$
[Aside: I earlier mistakenly thought if $F$ is differentiable $R$ must be differentiable. But this need not be the case, as pointed out by copper.hat in the comments using the counterexample $F(x)=x^2$, so that $R(x)=\sqrt{x}$ which is not differentiable at $x=0$.]
Let $\alpha \in [0,1)$. We are looking for a solution $\beta(\alpha)$ to the following functional equation:
$$ \int\limits^{\beta(\alpha)}_{\alpha} R(s)ds=\left(\frac{\beta(\alpha)-\alpha}{1-\alpha}\right) \int\limits^{1}_{\beta(\alpha)} R(s)ds $$
such that $\beta(\alpha)>\alpha$ (i.e., we are not interested in the trivial solution $\beta(\alpha)=\alpha$).
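For a concrete illustration (my own example, not part of the original setup): with the uniform CDF $F(x)=x$, so that $R(s)=s$ and $\phi(x)=x^2/2$, the equation can be solved in closed form:

```latex
\frac{\beta^2-\alpha^2}{2}
  = \frac{\beta-\alpha}{1-\alpha}\cdot\frac{1-\beta^2}{2}
\;\Longrightarrow\;
(\beta+\alpha)(1-\alpha) = (1-\beta)(1+\beta)
\;\Longrightarrow\;
\beta(\alpha) = \frac{\alpha-1+\sqrt{5\alpha^2-6\alpha+5}}{2}.
```

Here $5\alpha^2-6\alpha+5$ is bounded away from zero (its discriminant is $36-100<0$), so in this particular case $\beta$ is smooth on $[0,1)$.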
My question is: Is $\beta(\alpha)$ always differentiable? If yes, how can we show it? If not, what additional conditions do we need on $F$ so it is?
Define $\phi(\cdot)$ as: $\phi(x)=\int\limits^x_0R(s)ds$ for all $x\in[0,1]$.
Note that a unique solution $\beta(\alpha) \in (\alpha,1)$ always exists. This can be seen from writing the previous equation as:
$$ \frac{\phi(\beta)-\phi(\alpha)}{\phi(1)-\phi(\beta)} = \frac{\beta - \alpha}{1-\alpha} $$
and noting that, for a given $\alpha$, the LHS is a convex increasing function of $\beta$ while the RHS is linear in $\beta$; since both sides are continuous in $\beta$, standard intermediate value theorem arguments yield existence and uniqueness.
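As a numerical sanity check (a sketch; the function name `solve_beta`, the quadrature step count, and the tolerance are my own choices, not from the question), the unique root can be located by bisection:

```python
def solve_beta(alpha, R, tol=1e-12):
    """Find the nontrivial root beta in (alpha, 1) of
    phi(beta) - phi(alpha) = ((beta - alpha)/(1 - alpha)) * (phi(1) - phi(beta)),
    where phi(x) = int_0^x R(s) ds (midpoint-rule quadrature)."""
    def phi(x, n=2000):
        h = x / n
        return h * sum(R((i + 0.5) * h) for i in range(n))

    phi1, phia = phi(1.0), phi(alpha)

    def g(b):
        # By the convexity argument above, g < 0 on (alpha, beta) and
        # g > 0 on (beta, 1), so bisection brackets the nontrivial root.
        return (phi(b) - phia) - ((b - alpha) / (1 - alpha)) * (phi1 - phi(b))

    lo, hi = alpha, 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (lo, mid) if g(mid) > 0 else (mid, hi)
    return 0.5 * (lo + hi)

# Uniform CDF F(x) = x, i.e. R(s) = s: at alpha = 0 the exact root solves
# beta = 1 - beta**2, i.e. beta = (sqrt(5) - 1) / 2 ≈ 0.618.
beta0 = solve_beta(0.0, lambda s: s)
```

For the uniform case the midpoint rule integrates $R(s)=s$ exactly, so `beta0` matches the closed-form root to the bisection tolerance.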
(Edit) My attempt: Define $W:[0,1] \rightarrow [0,1]$, $W(x)=\phi(1)-\phi(x)$.
The original equation can be written as:
$$ \int\limits_{\alpha}^{\beta}\left((1-\alpha)R(s)-W(\beta)\right)ds=0 $$
Since $R$ is strictly increasing, the integrand is increasing in $s$; as its integral over $[\alpha,\beta]$ vanishes, it must change sign on that interval, which gives $\left((1-\alpha)R(\alpha)-W(\beta)\right)<0$ and $\left((1-\alpha)R(\beta)-W(\beta)\right)>0$.
We can differentiate both sides of the original equation with respect to $\alpha$ using the fundamental theorem of calculus and get:
$$ ((1-\alpha)R(\beta)-(W(\beta)-(\beta-\alpha)R(\beta)))\beta'(\alpha)= ((1-\alpha)R(\alpha)-\left(\frac{1-\beta}{1-\alpha}\right)W(\beta)) $$
The inequalities derived above show that $((1-\alpha)R(\beta)-(W(\beta)-(\beta-\alpha)R(\beta)))>0$, so $\beta'(\alpha)$ exists.
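As a check on this formula (my own verification, using the uniform CDF $F(x)=x$, for which the functional equation reduces to a quadratic with root $\beta(\alpha) = \bigl(\alpha-1+\sqrt{5\alpha^2-6\alpha+5}\bigr)/2$), the expression for $\beta'(\alpha)$ agrees with a finite-difference estimate:

```python
# Uniform case: R(s) = s, W(x) = (1 - x**2)/2, and the closed-form root
# of the quadratic that the functional equation reduces to.
R = lambda s: s
W = lambda x: (1 - x ** 2) / 2
beta = lambda a: (a - 1 + (5 * a ** 2 - 6 * a + 5) ** 0.5) / 2

def beta_prime_formula(a):
    """beta'(a) obtained by solving the differentiated equation above."""
    b = beta(a)
    num = (1 - a) * R(a) - ((1 - b) / (1 - a)) * W(b)
    den = (1 - a) * R(b) - (W(b) - (b - a) * R(b))
    return num / den

a, h = 0.3, 1e-6
central_diff = (beta(a + h) - beta(a - h)) / (2 * h)
```

At $a=0.3$ the two estimates agree to well beyond finite-difference accuracy.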
Is this a valid proof that $\beta'(\alpha)$ always exists? I'm confused because we presumably need to know that $\beta'(\alpha)$ exists before differentiating the original equation, not the other way round. Any help is appreciated.
Write your equation above in the form $G(\alpha,\beta)=0$ and note that $G$ is $C^1$ in a neighborhood of your solution (indeed: $\phi$ is $C^1$ because $R$ is continuous as an inverse of an injective continuous function on an interval).
The implicit function theorem states that a sufficient (and, I suspect, in your case also necessary) condition for the differentiability of the solution $\beta$ at a point $\alpha_0$ is that $\frac{\partial G}{\partial \beta}(\alpha_0,\beta(\alpha_0))\ne0$.
Thus, the (simple) exercise is to find $G$, and differentiate it with respect to the variable $\beta$. The more difficult part is to check whether this derivative can be zero on the graph of your solution. If not, your solution is even continuously differentiable.
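Concretely, one such choice of $G$ (using the asker's $\phi$ and $W$; any equivalent form works) is:

```latex
G(\alpha,\beta) = \phi(\beta) - \phi(\alpha)
              - \frac{\beta-\alpha}{1-\alpha}\,\bigl(\phi(1)-\phi(\beta)\bigr),
\qquad
\frac{\partial G}{\partial \beta}
  = R(\beta) - \frac{W(\beta)}{1-\alpha} + \frac{\beta-\alpha}{1-\alpha}\,R(\beta)
  = \frac{(1-\alpha)R(\beta) - \bigl(W(\beta) - (\beta-\alpha)R(\beta)\bigr)}{1-\alpha},
```

which is $1/(1-\alpha)$ times the coefficient of $\beta'(\alpha)$ that the question's inequalities show to be positive, so $\partial G/\partial\beta$ is nonzero on the graph of the solution and $\beta$ is continuously differentiable on $[0,1)$.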