Let $X$ be a normed space. Then $X$ is a Banach space if and only if the absolute convergence of any series in $X$ implies the convergence of that series.
Is there any name given to the above result in the standard literature on normed space theory?
And is there any name for the property that absolute convergence of a series implies its convergence?
How do we prove this theorem?
My effort:
Suppose that $X$ is a Banach space. Let $\sum_n x_n$ be an absolutely convergent series in $X$. Then the sequence $(\alpha_n)_{n\in \mathbb{N}}$, where $$\alpha_n \colon= \Vert x_1 \Vert + \cdots + \Vert x_n \Vert \ \mbox{ for all } \ n \in \mathbb{N},$$ is a Cauchy sequence in $\mathbb{R}$.
Thus, given a real number $\epsilon>0$, we can find a natural number $N$ such that $$\vert \alpha_m - \alpha_n \vert < \epsilon \ \mbox{ for all } \ m, n \in \mathbb{N} \ \mbox{ such that } \ m > N \ \mbox{ and } \ n > N. $$ Now let $m, n \in \mathbb{N}$ such that $n > m > N$. Then $$ \begin{align} \left\Vert \sum_{k=1}^n x_k - \sum_{k=1}^m x_k\right\Vert &= \left\Vert \sum_{k=m+1}^n x_k \right\Vert \\ &\leq \sum_{k=m+1}^n \Vert x_k \Vert \\ &= \alpha_n - \alpha_m \\ &= \vert \alpha_n - \alpha_m \vert \\ &< \epsilon. \end{align} $$ Thus the sequence $\left(\sum_{k=1}^n x_k \right)_{n\in\mathbb{N}}$ of partial sums of the series $\sum_n x_n$ is Cauchy and hence convergent.
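To spell out the final step (the symbol $x$ below just names the limit of the partial sums): the completeness of $X$ enters exactly here, because the Cauchy sequence of partial sums converges to some $x \in X$, and by the definition of the sum of a series $$\sum_{n=1}^{\infty} x_n \colon= \lim_{N \to \infty} \sum_{n=1}^{N} x_n = x,$$ so the series $\sum_n x_n$ converges in $X$.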
Conversely, suppose that the absolute convergence of any series in $X$ implies the convergence of that series. Let $(x_n)$ be a Cauchy sequence in $X$. We need to show that this sequence converges in $X$. How do I proceed from here?
I don't know of a name for this theorem, or the property you stated. If you are looking for a reference, this is Theorem $5.1$ of Folland's Real Analysis (Modern Techniques and Their Applications). Your proof of the first implication is fine. As for the second, I have adapted Folland's proof below (it's tricky).
As $(x_n)$ is a Cauchy sequence, for each $j \in \mathbb{N}$ we can find $n_j$ such that $\|x_n - x_m\| < 2^{-j}$ for all $n, m \geq n_j$. Furthermore, we can arrange that $n_1 < n_2 < \dotsb$. Now consider the sequence $(y_j)$ in $X$ given by $y_1 = x_{n_1}$ and $y_j = x_{n_j} - x_{n_{j-1}}$ for $j > 1$. Note that the partial sum $\sum_{j=1}^k y_j = x_{n_1} + (x_{n_2} - x_{n_1}) + \cdots + (x_{n_k} - x_{n_{k-1}})$ telescopes to $x_{n_k}$, while
$$\sum_{j=1}^{\infty}\|y_j\| = \|y_1\| + \sum_{j=2}^{\infty}\|x_{n_j}-x_{n_{j-1}}\| \leq \|y_1\| + \sum_{j=2}^{\infty}2^{-(j-1)} = \|y_1\| + \sum_{j=1}^{\infty}2^{-j} = \|y_1\| + 1.$$
The partial sums of $\sum_{j}\|y_j\|$ are nondecreasing and bounded above, so $\sum_{j=1}^{\infty}\|y_j\|$ converges; that is, $\sum_j y_j$ is absolutely convergent, and hence $\displaystyle\sum_{j=1}^{\infty}y_j$ converges by hypothesis. But
$$\sum_{j=1}^{\infty}y_j = \lim_{k \to\infty}\sum_{j=1}^k y_j = \lim_{k\to\infty}x_{n_k}$$
so the subsequence $(x_{n_k})$ converges in $X$. As $(x_n)$ is a Cauchy sequence with a convergent subsequence $(x_{n_k})$, the whole sequence $(x_n)$ converges to the same limit (a standard fact; a short sketch is given below).
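In case it helps, here is a quick sketch of that standard fact (the $\epsilon$, $N$, and $k$ below are chosen only for this argument): suppose $(x_n)$ is Cauchy and $x_{n_k} \to x$ for some $x \in X$. Given $\epsilon > 0$, pick $N$ so that $\|x_n - x_m\| < \epsilon/2$ for all $n, m \geq N$, and then pick $k$ so large that $n_k \geq N$ and $\|x_{n_k} - x\| < \epsilon/2$. Then for every $n \geq N$, $$\|x_n - x\| \leq \|x_n - x_{n_k}\| + \|x_{n_k} - x\| < \frac{\epsilon}{2} + \frac{\epsilon}{2} = \epsilon,$$ so $x_n \to x$, and the proof of the converse is complete.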