(From Kreyszig's functional analysis book.) Consider the space of all complex sequences with the distance function:
$d(X, Y) = \sum_{j=1}^{\infty} \frac{1}{2^j} \frac{|x_j-y_j|}{1+|x_j-y_j|}$.
To prove that this satisfies the triangle inequality, the author first proves that, for any third sequence $Z = (z_j)$,
$\frac{|x_j-y_j|}{1+|x_j-y_j|} \leq \frac{|x_j-z_j|}{1+|x_j-z_j|} + \frac{|z_j-y_j|}{1+|z_j-y_j|}$.
Multiplying both sides by $1/2^j$ and summing over $j$ from $1$ to $\infty$ then gives $d(X,Y) \leq d(X,Z) + d(Z,Y)$.
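For completeness, here is how the elementary inequality itself can be checked (a standard argument, sketched in my own notation with $a = |x_j-z_j|$ and $b = |z_j-y_j|$):

```latex
% The triangle inequality in \mathbb{C} gives |x_j - y_j| \le a + b,
% and t \mapsto t/(1+t) is increasing on [0, \infty), so
\frac{|x_j-y_j|}{1+|x_j-y_j|}
  \le \frac{a+b}{1+a+b}
  = \frac{a}{1+a+b} + \frac{b}{1+a+b}
  \le \frac{a}{1+a} + \frac{b}{1+b}.
```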
Later on there's an exercise to prove that if we replace $1/2^j$ in the above function by $\mu_j > 0$ such that $\sum_{j=1}^{\infty}\mu_j$ converges, then the new distance function is still a metric.
Could someone please explain why exactly we require convergence of the series $\sum_{j=1}^{\infty}\mu_j$ rather than merely convergence of the sequence $(\mu_j)$? I have a vague idea that convergence of the series is needed for the series defining the new distance to converge, but I don't know exactly which result or theorem states something like that.
The closest result that I can think of is Hölder's inequality, but it requires the two sequences to belong to $\ell^p$ and $\ell^q$ where $pq = p+q$ (i.e. $1/p + 1/q = 1$). As far as I can see, $(\mu_j) \in \ell^1$, and that leads to a dead end.
Take the two sequences $X=(1,0,1,0,1,0,\dots)$ and $Y=(0,1,0,1,0,1,\dots)$, so that $|x_j-y_j|=1$ for every $j$. Then defining $d(X,Y) = \sum_{j=1}^{\infty} \mu_j \frac{|x_j-y_j|}{1+|x_j-y_j|}$ gives $d(X,Y)=\sum_{j=1}^{\infty} \frac{1}{2}\mu_j=\frac{1}{2}\sum_{j=1}^{\infty} \mu_j$, which is finite only if $\sum_{j=1}^{\infty} \mu_j$ converges. This is stronger than convergence of the sequence $(\mu_j)$ itself.
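You can also see this numerically (a quick sketch of my own, not from the book): with the alternating $X$ and $Y$ above, each term of the candidate metric equals $\mu_j/2$. Taking $\mu_j = 1/j$, the sequence $(\mu_j)$ converges to $0$, yet the partial sums grow without bound (harmonic series), so the "distance" would be infinite; with $\mu_j = 1/2^j$ the series converges.

```python
def partial_sum(mu, n_terms):
    """Partial sum of sum_j mu(j) * |x_j - y_j| / (1 + |x_j - y_j|)
    for X = (1,0,1,0,...) and Y = (0,1,0,1,...)."""
    x = lambda j: 1 if j % 2 == 1 else 0
    y = lambda j: 0 if j % 2 == 1 else 1
    total = 0.0
    for j in range(1, n_terms + 1):
        diff = abs(x(j) - y(j))             # always 1 for these sequences
        total += mu(j) * diff / (1 + diff)  # each term is mu_j / 2
    return total

for n in (10, 1000, 100000):
    print(partial_sum(lambda j: 1 / j, n))    # grows like (ln n)/2: no limit
    print(partial_sum(lambda j: 2.0**-j, n))  # converges to 1/2
```

So requiring $(\mu_j)$ merely to converge (even to $0$) is not enough; the convergence of $\sum \mu_j$ is exactly what guarantees $d(X,Y) < \infty$ for all pairs of sequences.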