I'm currently working on the regularized gamma function: $$ \frac{\gamma(x, f(x, c))}{\Gamma(x)} = c $$
Here, $c\in(0, 1)$ is a constant, and $f(x, c)$ is a function of $x$ and $c$ that satisfies the equation. My primary focus is on understanding the asymptotic behavior of a related expression: $$ \frac{\gamma(1+x, f(x, c))}{\Gamma(1+x)} $$
as $x$ approaches 0.
Will this expression converge to a specific constant as $x$ approaches 0? I used the recurrence $\gamma(1+ x, y) = x\cdot \gamma(x, y) - y^x e^{-y}$ (which follows from integration by parts) to rewrite the expression as $$ \frac{\gamma(1+x, f(x, c))}{\Gamma(1+x)} = c - \frac{f(x, c)^x \exp(-f(x,c)) }{\Gamma(1+x)}. $$
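As a numerical sanity check (a sketch using SciPy, where `gammainc(a, y)` is the *regularized* lower incomplete gamma $P(a,y)=\gamma(a,y)/\Gamma(a)$), the recurrence above can be verified at an arbitrary point:

```python
from math import exp
from scipy.special import gammainc, gamma

def gamma_lower(a, y):
    # Unregularized lower incomplete gamma: gamma(a, y) = P(a, y) * Gamma(a)
    return gammainc(a, y) * gamma(a)

# Check gamma(1+x, y) = x * gamma(x, y) - y**x * exp(-y)
# at an arbitrarily chosen point (x, y) = (0.7, 1.3).
x, y = 0.7, 1.3
lhs = gamma_lower(1 + x, y)
rhs = x * gamma_lower(x, y) - y**x * exp(-y)
print(abs(lhs - rhs))  # agrees to floating-point precision
```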
This expression is strictly less than $c$, and I suspect that it converges to $0$ as $x\rightarrow 0^+$.
Thank you in advance.
Fix $0<c<1$. By Lemmas $1.1$ and $2.1$ of this paper, we have $$ \lim_{x\to 0^+}f(x,c)=0 \quad\text{ and }\quad \lim_{x\to 0^+}(f(x,c))^x = c. $$ Since $\exp(-f(x,c))\to 1$ and $\Gamma(1+x)\to 1$ as $x\to 0^+$, it follows that $$ \lim_{x \to 0^+} \frac{(f(x,c))^x \exp(-f(x,c))}{\Gamma(1+x)} = c. $$ Therefore the right-hand side of your identity tends to $c - c = 0$, so your original expression does indeed converge to $0$.
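The limit can also be illustrated numerically (a sketch assuming SciPy; `gammaincinv(x, c)` inverts the regularized gamma, so it computes $f(x,c)$ directly):

```python
from scipy.special import gammainc, gammaincinv

c = 0.5  # any fixed c in (0, 1)

# f(x, c) solves gammainc(x, f) = c, i.e. f = gammaincinv(x, c).
for x in [0.5, 0.2, 0.1]:
    f = gammaincinv(x, c)
    val = gammainc(1 + x, f)  # = gamma(1+x, f) / Gamma(1+x)
    print(x, f, f**x, val)
```

As $x$ shrinks, the printed values show $f(x,c)\to 0$, $f(x,c)^x\to c$, and the expression itself tending to $0$.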