Let $X$ be a metric space and $\{g_n \}_{n \geq 1}$ be a sequence of complex valued functions defined on $X.$ Suppose that $\sum\limits_{n = 1}^{\infty} g_n(x)$ converges absolutely and uniformly for $x$ in $X.$ Then show that there exists $n_0 \in \mathbb N$ such that $\sum\limits_{n = n_0 + 1}^{\infty} \log (1 + g_n(x))$ converges absolutely and uniformly for $x$ in $X.$
$\textbf {My Attempt} :$ Since $\sum\limits_{n=1}^{\infty} g_n (x)$ converges uniformly for $x$ in $X,$ there exists $n_0 \in \mathbb N$ such that for all $n \gt n_0$ and for all $x \in X$ we have $|g_n (x)| \lt \frac {1} {2},$ so that $\text {Re}\ (1 + g_n (x)) \gt \frac {1} {2} \gt 0.$ We know that for any $z \in \mathbb C$ with $|z| \lt \frac {1} {2}$ we have $|\log (1 + z)| \leq \frac {3} {2} |z|.$ Using this inequality we get $|\log (1 + g_n(x))| \leq \frac {3} {2} |g_n(x)|$ for all $x \in X$ and for all $n \gt n_0.$ By the comparison test we can conclude that $\sum\limits_{n = n_0+1}^{\infty} \log (1 + g_n(x))$ converges absolutely for each $x$ in $X.$
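As a quick numerical sanity check (not a proof), one can sample points of the disc $|z| \leq \frac {1} {2}$ and confirm that the ratio $|\log (1 + z)|/|z|$ indeed stays below $\frac {3} {2}$ (using the principal branch of the complex logarithm):

```python
# Numerical spot-check (not a proof) of the bound |log(1 + z)| <= (3/2)|z|
# for complex z with |z| < 1/2, using the principal branch from cmath.
import cmath

ratios = []
for k in range(1, 400):
    r = 0.5 * k / 400                 # radii increasing toward 1/2
    z = r * cmath.exp(1j * 0.7 * k)   # angles 0.7*k wrap around the circle
    ratios.append(abs(cmath.log(1 + z)) / abs(z))

print(max(ratios))  # stays below 1.5
```

The worst sampled ratios occur near the negative real axis (close to $z = -\frac12$, where the true supremum $2\log 2 \approx 1.386$ is approached), comfortably below $\frac32$.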
But I find it difficult to show the uniform convergence of the above series. I am trying to show that the sequence of partial sums of $\sum\limits_{n = n_0+1}^{\infty} \log (1 + g_n(x))$ is uniformly Cauchy on $X.$ But by the inequality above, that would follow from the uniform Cauchyness of the sequence of partial sums of $\sum\limits_{n = 1}^{\infty} |g_n (x)|.$ In other words, we need the series $\sum\limits_{n=1}^{\infty} |g_n (x)|$ to converge uniformly on $X.$ But it is only given that $\sum\limits_{n = 1}^{\infty} g_n(x)$ converges uniformly on $X.$ From this given condition can we somehow deduce the uniform convergence of $\sum\limits_{n = 1}^{\infty} |g_n(x)|$ on $X\ $?
Any suggestion in this regard would be warmly appreciated. Thanks for investing your valuable time in reading my question.
EDIT $:$ I think that it's not true in general. Simply take $g_n(x) = \frac {(-1)^n} {n}$ for all $x \in X.$ Then $\sum\limits_{n=1}^{\infty} g_n(x)$ converges uniformly on $X$ by the Leibniz test (the terms do not depend on $x$), although $\sum\limits_{n=1}^{\infty} |g_n(x)| = \sum\limits_{n=1}^{\infty} \frac {1} {n}$ doesn't even converge.
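The contrast in this example is easy to see numerically as well (a small illustration, standard library only): the alternating partial sums settle near $-\log 2,$ while the harmonic partial sums keep growing like $\log N$:

```python
# Partial sums of sum (-1)^n / n approach -ln 2, while the harmonic
# partial sums grow without bound (roughly like ln N).
import math

alt = sum((-1) ** n / n for n in range(1, 100001))
harm = [sum(1 / n for n in range(1, N + 1)) for N in (10, 1000, 100000)]

print(alt, -math.log(2))  # alternating sum is close to -ln 2
print(harm)               # strictly increasing, unbounded
```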
The hypotheses are that $\sum g_n(x)$ converges uniformly and absolutely. As a counterexample to the uniform convergence of $\sum |g_n(x)|$, choosing $g_n(x) = \frac{(-1)^n}{n}$ falls short since the series does not satisfy the second hypothesis of absolute convergence.
Nevertheless, it is true that uniform and absolute convergence of $\sum g_n(x)$ does not imply uniform convergence of $\sum |g_n(x)|$. A counterexample is given by $g_n(x) = \frac{(-1)^n x^n}{n}$ with domain $[0,1)$. Here we have uniform convergence of $\sum_{n=1}^\infty \frac{(-1)^nx^n}{n}$ by Abel's test, since the sequence $(x^n)$ is monotone in $n$ and uniformly bounded on $[0,1)$, and the series $\sum_{n=1}^\infty\frac{(-1)^n}{n}$ is convergent by the alternating series test and, hence, uniformly convergent since there is no dependence on $x$. We also have absolute convergence of $\sum_{n=1}^\infty \frac{x^n}{n}$ by comparison with a convergent geometric series. However, the series $\sum_{n=1}^\infty \frac{x^n}{n}$ is not uniformly convergent for $x \in [0,1)$, since
$$\sup_{x \in [0,1)} \left|\sum_{k=n+1}^\infty \frac{x^k}{k}\right| \geqslant\sup_{x \in [0,1)}\sum_{k=n+1}^{2n} \frac{x^k}{k} \geqslant \sup_{x \in [0,1)}n \cdot \frac{x^{2n}}{2n} \geqslant \frac{1}{2}\left(1 - \frac{1}{n}\right)^{2n} \underset{n \to \infty}\longrightarrow \frac{e^{-2}}{2} \neq 0,$$
where the second inequality holds because each of the $n$ terms is at least $\frac{x^{2n}}{2n}$, and the third follows by evaluating at $x = 1 - \frac{1}{n}$.
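A short numerical check of the lower bound above (again, an illustration rather than a proof): at the evaluation point $x = 1 - \frac1n$ the finite tail $\sum_{k=n+1}^{2n} \frac{x^k}{k}$ already exceeds $\frac12\left(1-\frac1n\right)^{2n}$, which approaches $\frac{e^{-2}}{2} \approx 0.0677$ instead of $0$:

```python
# At x = 1 - 1/n, the tail sum_{k=n+1}^{2n} x^k / k exceeds the bound
# (1/2)(1 - 1/n)^(2n), which tends to exp(-2)/2, not to 0.
import math

tails, bounds = [], []
for n in (10, 100, 1000):
    x = 1 - 1 / n                      # the near-1 evaluation point
    tails.append(sum(x ** k / k for k in range(n + 1, 2 * n + 1)))
    bounds.append(0.5 * x ** (2 * n))  # = (1/2)(1 - 1/n)^(2n)

print(tails, bounds, math.exp(-2) / 2)
```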
It is also true that uniform convergence of $\sum |g_n(x)|$ implies uniform convergence of both $\sum \log (1 + g_n(x))$ and $\sum |\log (1 + g_n(x))|$, as in your argument above. This proof arises commonly in the discussion of uniform convergence of infinite products $\prod (1 + g_n(x))$, and the result is analogous to the Weierstrass M-test for infinite series.
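A minimal sketch of this product version of the M-test, assuming the illustrative choice $g_n(x) = (x/2)^n$ on $[0,1]$ (so $|g_n(x)| \leq M_n = 2^{-n}$ and $\sum M_n < \infty$): the tails of $\sum \log(1 + g_n(x))$ are bounded, uniformly in $x$, by $\frac32 \sum_{k > n} M_k$:

```python
# Illustrative g_k(x) = (x/2)**k with majorant M_k = 2**-k; the log-series
# tail is bounded uniformly in x by (3/2) * sum_{k>n} M_k.
import math

def log_tail(x, n, N=60):
    # |sum_{k=n+1}^{N} log(1 + g_k(x))| for the illustrative g_k
    return abs(sum(math.log(1 + (x / 2) ** k) for k in range(n + 1, N + 1)))

n = 10
m_bound = 1.5 * sum(2.0 ** -k for k in range(n + 1, 61))  # (3/2) sum M_k
worst = max(log_tail(x / 100, n) for x in range(101))     # sweep x in [0, 1]

print(worst, m_bound)  # worst-case tail over x stays under the M-test bound
```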
That leaves the original question of whether or not (conditional) uniform convergence of $\sum g_n(x)$ is enough to imply uniform convergence of $\sum \log (1 + g_n(x))$. If you can show that $\sum_{n=2}^\infty \log ( 1+ (-1)^nx^n/n)$ fails to converge uniformly on $[0,1)$, then you have a counterexample, but I have not checked that yet.