Let $(\Omega, \mathcal{F}, \mu)$ be a probability space; for simplicity, take $\Omega = \mathbb{R}$. In what follows, measurability always refers to Borel measurability.
Let $f \colon \mathbb{R}_+ \times \Omega \to \mathbb{R}_+$ be a function such that:
(i) For each $z \in \Omega$, the function $k \mapsto f(k,z)$ is concave, increasing, and continuously differentiable, while $z \mapsto f(k,z)$ is Borel measurable for each $k \in \mathbb{R}_+$; (ii) $\lim_{k \downarrow 0} f'(k,z) >0$ for each $z \in \Omega$, where here and below $f'(k,z)$ denotes the partial derivative of $f$ with respect to $k$; and (iii) $f(0,z)=0$ for all $z \in \Omega$.
Let $v \colon \mathbb{R}_+ \to \mathbb{R}_+$ be bounded, strictly concave, strictly increasing, and continuously differentiable on $(0, \infty)$.
Define a function $g$ by
\begin{align*} g(k) := \left( \int_{\Omega} \left[ v\left( f( k, z ) \right) \right]^\alpha \mu (\mathrm{d}z) \right)^{1/\alpha}, \qquad (0<\alpha <1). \end{align*}
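For concreteness, here is a minimal numerical sketch of $g$ for one admissible toy instance. All concrete choices below ($f(k,z) = z\log(1+k)$, $v(x) = 1-e^{-x}$, $\alpha = 1/2$, $\mu$ uniform on three atoms) are my own, not part of the problem; they satisfy the stated assumptions.

```python
import numpy as np

# Toy instance (my own choices, satisfying the stated assumptions):
#   mu : uniform discrete measure on z in {0.5, 1.0, 1.5}
#   f(k, z) = z * log(1 + k)  -- concave, increasing in k, f(0, z) = 0
#   v(x)    = 1 - exp(-x)     -- bounded, strictly concave, strictly increasing
ALPHA = 0.5
ZS = np.array([0.5, 1.0, 1.5])
WS = np.full(3, 1.0 / 3.0)  # probability weights

def f(k, z):
    return z * np.log1p(k)

def v(x):
    return 1.0 - np.exp(-x)

def g(k):
    # g(k) = ( int_Omega [v(f(k, z))]^alpha mu(dz) )^(1/alpha)
    return (WS @ v(f(k, ZS)) ** ALPHA) ** (1.0 / ALPHA)

# Sanity checks on a grid: g is increasing and midpoint-concave
ks = np.linspace(0.1, 2.0, 50)
gs = np.array([g(k) for k in ks])
mids = np.array([g(k) for k in (ks[:-1] + ks[1:]) / 2])
assert np.all(np.diff(gs) > 0)
assert np.all(mids >= (gs[:-1] + gs[1:]) / 2 - 1e-12)
```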
Question: Since $g$ is concave (this has already been proved), we know that the right-hand and left-hand derivatives of $g$ exist.
I aim to show that $g$ is differentiable on $(0, \eta)$ for any fixed constant $\eta >0$, and that the derivative of $g$ is given by my conjectured formula \begin{align*} g'(k) = \left( \int_{\Omega} \left[ v\left( f( k, z ) \right) \right]^\alpha \mu (\mathrm{d}z) \right)^{\frac{1}{\alpha} -1 } \int_{\Omega} \left[ v\left( f( k, z ) \right) \right]^{\alpha -1} v'\left( f( k, z ) \right) f'( k, z ) \mu( \mathrm{d} z) \end{align*} for all $0 < k < \eta$.
Here, $g'(k) = \dfrac{\mathrm{d}}{\mathrm{d} k} g(k)$, $v'\left( f( k, z ) \right) := \dfrac{\mathrm{d}}{\mathrm{d} f }v( f( k, z ))$, and $f'(k, z) := \dfrac{\partial}{\partial k} f(k,z)$.
My attempt:
The derivative of $g$ stated above is only my conjecture, and I am not sure whether the right-hand derivative of $g$ equals its left-hand derivative, so I wish to verify this. My attempt is to use the limit definition to show that the left-hand and right-hand derivatives of $g$ coincide and equal the formula above.
In fact, I got stuck even in computing the left-hand and right-hand derivatives of $g$. But I thought it might suffice to show that the left-hand derivative $g'_-(k) := \lim_{h \to 0^-} \dfrac{g(k+h)-g(k)}{h}$ is at most the conjectured formula, and that the right-hand derivative $g'_+(k) := \lim_{h \to 0^+}\dfrac{g(k+h)-g(k)}{h}$ is at least the conjectured formula. Then, since concavity of $g$ gives $g'_-(k) \geq g'_+(k)$, we would obtain $g'_-(k)= g'_+(k)=g'(k)$ as desired. So the problem becomes how to relate the right-hand derivative to the conjectured formula, and likewise for the left-hand derivative.
Could anyone give me some guidance and help me out please?
Thank you very much in advance!
Perhaps I am missing something, but it seems to me that you can obtain your final result using Leibniz's rule, as shown below.
$$ \begin{align*} g(k) := \left( \int_{\Omega} \big[ v\left( f( k, z ) \right) \big]^\alpha \mu (\mathrm{d}z) \right)^{\frac{1}{\alpha}} =I^\frac{1}{\alpha}\left(k\right)\qquad (0<\alpha <1). \end{align*} $$ where $$ I(k)=\int_{\Omega} v^\alpha \left( f( k, z ) \right) \mu (\mathrm{d}z) $$
Thus, applying the regular chain rule, we get
$$ \frac{\partial }{\partial k}\left[g\left(k\right)\right] =\frac{\partial }{\partial k}\left[I^\frac{1}{\alpha}\left(k\right)\right] =\frac{1}{\alpha}I^{\frac{1}{\alpha}-1}\left(k\right) \frac{\partial }{\partial k}\left[I \left(k\right)\right] $$
Then, using the measure-theoretic form of Leibniz's rule (differentiation under the integral sign), we write
$$ \begin{aligned} \frac{\partial }{\partial k}\left[I \left(k\right)\right] &= \frac{\partial }{\partial k}\left[\int_{\Omega} \big[ v\left( f( k, z ) \right) \big]^\alpha \mu \left(\mathrm{d}z\right)\right] = \int_{\Omega} \frac{\partial }{\partial k}\big[ v\left( f( k, z ) \right) \big]^\alpha \mu\left(\mathrm{d}z\right) \\ &= \int_{\Omega} \alpha v^{\alpha-1}\left( f( k, z ) \right) v'\left( f( k, z )\right) \frac{\partial }{\partial k}\big[ f( k, z ) \big] \mu (\mathrm{d}z) = \alpha\int_{\Omega} v^{\alpha-1}\left( f \right) v'\left( f \right) f_k\left( k, z \right) \mu \left(\mathrm{d}z\right) \end{aligned} $$
Substituting the last expression into previous formula for $g'(k)$ we get
$$ g'\left(k\right) = \frac{1}{\alpha} \left(\int_\Omega v^\alpha\left(f\right)\mu\left(\mathrm{d}z\right)\right)^{\frac{1}{\alpha}-1} \cdot \alpha\int_{\Omega} v^{\alpha-1}\left( f \right) v'\left( f \right) f_k\left( k, z \right) \mu \left(\mathrm{d}z\right) $$ $$ \boxed{g'\left(k\right) = \left(\int_\Omega v^\alpha\left(f\right)\mu\left(\mathrm{d}z\right)\right)^{\frac{1}{\alpha}-1} \int_{\Omega} v^{\alpha-1}\left( f \right) v'\left( f \right) f_k\left( k, z \right) \mu \left(\mathrm{d}z\right)} $$
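As a sanity check, here is a small numerical sketch comparing the boxed formula against a central finite difference. All concrete choices ($f(k,z) = z\log(1+k)$, $v(x) = 1-e^{-x}$, $\alpha = 1/2$, $\mu$ uniform on three atoms) are my own, not from the question, but they satisfy the stated assumptions.

```python
import numpy as np

# Toy instance (my own choices, not from the question, but satisfying
# the assumptions): mu uniform on {0.5, 1.0, 1.5},
# f(k, z) = z*log(1 + k), v(x) = 1 - exp(-x), alpha = 1/2.
ALPHA = 0.5
ZS = np.array([0.5, 1.0, 1.5])
WS = np.full(3, 1.0 / 3.0)

def f(k, z):    return z * np.log1p(k)
def f_k(k, z):  return z / (1.0 + k)   # partial derivative of f in k
def v(x):       return 1.0 - np.exp(-x)
def v_p(x):     return np.exp(-x)      # v'

def g(k):
    return (WS @ v(f(k, ZS)) ** ALPHA) ** (1.0 / ALPHA)

def g_prime(k):
    # Boxed formula: I(k)^(1/alpha - 1) * int v^(alpha-1) v' f_k dmu
    I = WS @ v(f(k, ZS)) ** ALPHA
    inner = WS @ (v(f(k, ZS)) ** (ALPHA - 1.0) * v_p(f(k, ZS)) * f_k(k, ZS))
    return I ** (1.0 / ALPHA - 1.0) * inner

# Compare against a central finite difference at a few interior points
for k in (0.2, 0.7, 1.5):
    h = 1e-6
    fd = (g(k + h) - g(k - h)) / (2.0 * h)
    assert abs(fd - g_prime(k)) < 1e-6
```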
where $f=f\left(k,z\right)$ and $f_k\left(k,z\right)=\dfrac{\partial f\left(k,z\right)}{\partial k}$.
Justification for Using Leibniz's rule
As pointed out in the comments by the OP, I had not provided a justification for using Leibniz's rule in its measure-theoretic form:
Let us verify the last condition (the domination condition on the partial derivative) for the integrand
$$ F\left(k,z\right)=\left[ v\big( f( k, z ) \big) \right]^\alpha , \qquad 0<\alpha<1. $$
Observe that $F\left(k,z\right)$ is a composition of three concave increasing functions (treating $k\mapsto f\left(k,z\right)$ as a function of the single variable $k$)
$$ F\left(\cdot\right) = F_3\circ F_2\circ F_1, \qquad F\left(k\right) = F_3\left(F_2\left(F_1\left(k\right)\right)\right), $$
where
$$ \begin{aligned} F_1\left(\tau\right) &= f\left(\tau,z\right), &&\text{where $z$ is fixed} \\ F_2\left(\tau\right) &= v\left(\tau\right), &&\\ F_3\left(\tau\right) &= \tau^{\alpha}, && 0<\alpha<1 \end{aligned} $$
with $\tau$ being a dummy variable. As an exercise, I propose that you show explicitly that the composition $F\left(\cdot\right) = F_3\circ F_2\circ F_1 = v^\alpha\left( f\left( k, z \right) \right)$ is concave in $k$.
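A sketch of that exercise, using only the stated assumptions: if $\psi$ is concave and $\phi$ is concave and increasing, then $\phi \circ \psi$ is concave, since for $\lambda \in [0,1]$,
$$ \begin{aligned} \phi\big(\psi(\lambda x + (1-\lambda) y)\big) &\geq \phi\big(\lambda \psi(x) + (1-\lambda)\psi(y)\big) && \text{($\psi$ concave, $\phi$ increasing)} \\ &\geq \lambda\, \phi(\psi(x)) + (1-\lambda)\, \phi(\psi(y)) && \text{($\phi$ concave).} \end{aligned} $$
Applying this twice, first with $\psi = f(\cdot,z)$ and $\phi = v$, then with $\psi = v(f(\cdot,z))$ and $\phi = \tau \mapsto \tau^\alpha$, gives concavity of $F$ in $k$.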
Recall that a concave function of a single variable has a monotonically decreasing derivative, which means that $F'\left(\tau\right)\leq F'\left(\tau_0\right)$ whenever $\tau\geq\tau_0$. Fixing any $\tau_0 > 0$, we therefore get the bound
$$ F'\left(\tau\right)\leq F'\left(\tau_0\right) \qquad \text{whenever } \tau\geq\tau_0>0, $$
where $F'\left(\tau_0\right)$ does not depend on $\tau$ (and possibly depends on $z$). (Bounding by $F'\left(0\right)$ would not work in general: the one-sided derivative at $0$ may be infinite, as for $\tau\mapsto\tau^\alpha$.)
Thus, choosing a positive constant $k_0$, we can bound the derivative as $$ \frac{\partial}{\partial k} \Big[v^\alpha\big( f\left( k, z \right) \big)\Big] \leq \left.\frac{\partial}{\partial k} \Big[v^\alpha\big( f\left( k, z \right) \big)\Big]\right\rvert_{k_0} \qquad\text{ whenever } k\geq k_0. $$
Thus the third condition is satisfied on $[k_0, \eta)$; since $k_0 > 0$ was arbitrary, this covers all of $(0, \eta)$.
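To illustrate the monotone-derivative bound, one can check numerically (again with my own toy choices for $f$, $v$, and $\alpha$, which satisfy the assumptions) that $\partial_k \left[v^\alpha(f(k,z))\right]$ is decreasing in $k$, hence dominated by its value at any fixed $k_0 > 0$:

```python
import numpy as np

# Toy instance (my own choices, consistent with the assumptions):
# f(k, z) = z*log(1 + k), v(x) = 1 - exp(-x), alpha = 1/2, one fixed z.
ALPHA, Z = 0.5, 1.0

def dF_dk(k):
    # d/dk [v(f(k, z))^alpha] = alpha * v^(alpha-1) * v'(f) * df/dk
    fv = Z * np.log1p(k)
    vf, vpf, fk = 1.0 - np.exp(-fv), np.exp(-fv), Z / (1.0 + k)
    return ALPHA * vf ** (ALPHA - 1.0) * vpf * fk

# F(., z) is concave in k, so its derivative is decreasing; hence for any
# fixed k0 > 0 it is dominated by dF_dk(k0) on [k0, infinity)
ks = np.linspace(0.05, 3.0, 200)
d = np.array([dF_dk(k) for k in ks])
assert np.all(np.diff(d) < 0)
assert np.all(d <= dF_dk(ks[0]))
```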