Let $X$ be a discrete random variable and $f(x)$ its pmf, and assume that $g(X)$ is a random variable. The law of the unconscious statistician says $$E[g(X)]=\sum_xg(x)f(x).$$ Should I interpret this as follows?
$E[g(X)]$ exists if and only if the sum on the right exists, and in this case they are equal.
By the "sum," I don't mean the limit of the partial sums in some particular order, but the sum of the positive and negative parts, as in the Lebesgue integral: $$\sum_xg(x)f(x)=\sum_{g(x)>0}g(x)f(x)+\sum_{g(x)<0}g(x)f(x)$$
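To make the split concrete, here is a small numeric illustration (the distribution and the function $g$ are my own made-up example, not from the question): $X$ uniform on $\{-2,-1,1,2\}$ with $g(x)=x^3$, computing the positive part, the negative part, and their sum exactly.

```python
from fractions import Fraction

# Made-up example: X uniform on {-2, -1, 1, 2}, g(x) = x^3.
f = {x: Fraction(1, 4) for x in (-2, -1, 1, 2)}  # pmf of X
g = lambda x: x**3

# Positive and negative parts of the sum, as in the Lebesgue-style definition.
pos = sum(g(x) * f[x] for x in f if g(x) > 0)  # sum over {x : g(x) > 0}
neg = sum(g(x) * f[x] for x in f if g(x) < 0)  # sum over {x : g(x) < 0}

total = pos + neg
print(pos, neg, total)  # 9/4 -9/4 0
```

Both parts are finite here, so the sum exists and equals $0$; if exactly one part were infinite the sum would be $\pm\infty$, and if both were infinite it would be undefined.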
Yes. Let $Y=g(X)$ and let $\mathcal{X}$ and $\mathcal{Y}$ be the countable sets of all values that $X$ and $Y$ can take, respectively: \begin{align} \mathcal{X} &= \{X(\omega) \in \mathbb{R}: \omega \in \Omega\} \\ \mathcal{Y} &= \{g(X(\omega)) \in \mathbb{R}: \omega \in \Omega \} \end{align} Then \begin{align} \sum_{y \in \mathcal{Y}: y\geq0} yP[g(X)=y] &= \sum_{y \in \mathcal{Y} : y\geq0}y \sum_{x \in \mathcal{X}:g(x)=y} P[X=x] \\ &= \sum_{y \in \mathcal{Y}:y\geq 0} \sum_{x \in \mathcal{X}:g(x)=y} g(x) P[X=x] \\ &= \sum_{x \in \mathcal{X}:g(x)\geq 0} g(x)P[X=x] \end{align} where regrouping the double sum in the last step is legitimate because all of its terms are nonnegative. Similarly, $$ \sum_{y \in \mathcal{Y} : y<0} y P[g(X)=y] = \sum_{x \in \mathcal{X}:g(x)<0} g(x) P[X=x]$$ So $E[Y]$ exists and is finite iff both left-hand sides are finite, which holds iff both right-hand sides are finite; and in that case adding the two identities gives $E[g(X)]=\sum_x g(x)P[X=x]$.
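The regrouping step can also be checked numerically with a deliberately non-injective $g$, so that several values of $X$ collapse onto one value of $Y$. This is a made-up sanity check (the pmf and $g(x)=x^2$ are my choices, not from the answer), computing $E[Y]$ both from the pmf of $Y$ and directly via LOTUS:

```python
from fractions import Fraction
from collections import defaultdict

# Made-up pmf of X and a non-injective g, to check the regrouping step.
f = {-1: Fraction(1, 4), 0: Fraction(1, 4), 1: Fraction(1, 4), 2: Fraction(1, 4)}
g = lambda x: x * x

# Left-hand side: build the pmf of Y = g(X) by summing P[X = x]
# over the preimage {x : g(x) = y}, then compute E[Y] from it.
pY = defaultdict(Fraction)
for x, p in f.items():
    pY[g(x)] += p
lhs = sum(y * p for y, p in pY.items())

# Right-hand side: sum g(x) P[X = x] directly over the values of X (LOTUS).
rhs = sum(g(x) * p for x, p in f.items())

print(lhs, rhs)  # both 3/2
```

Here $g(-1)=g(1)=1$, so $P[Y=1]=\tfrac12$, and both computations give $\tfrac32$, matching the regrouping identity above.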