Given a countable sample space $\Omega$ and a (discrete) random variable $X:\Omega\to\mathbb{R}$, let $f_X:\mathbb{R}\to [0,1]$ be the probability mass function of $X$, that is, $f_X(x):=P(\{s\in\Omega\mid X(s)=x\})$, where $P$ is a probability distribution. We define the expected value of $X$ by $E(X):=\sum_{x} xf_X(x)$.
If $g:\mathbb{R}\to\mathbb{R}$ is a function with $\operatorname{im}(X)\subseteq \operatorname{dom}(g)$, then clearly $g(X):\Omega\to\mathbb{R}$ is also a random variable. But I don't see why $E(g(X))=\sum_{x}g(x)f_X(x)$.
I can write $E(g(X))=\sum_{x}xf_{g(X)}(x) = \sum_{x}xP(\{s\in\Omega\mid g(X(s))=x\})$, but how do I proceed from here?
Should I perhaps define the probability distribution as a function $P:\mathbb{P}(\Omega)\to[0,1]$ satisfying the probability axioms and attempt to work something out from there?
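To make the definitions above concrete, here is a small numerical sketch (the fair die, the uniform $P$, and all names are my own illustrative choices, not part of the question): it builds $f_X$ from $P$ exactly as defined and computes $E(X)=\sum_x x f_X(x)$.

```python
from fractions import Fraction

# Hypothetical example: a fair six-sided die.
# Omega = {1,...,6}, X(s) = s, P uniform.
omega = list(range(1, 7))
P = {s: Fraction(1, 6) for s in omega}
X = lambda s: s

# f_X(x) := P({s in Omega : X(s) = x})
support = {X(s) for s in omega}
f_X = {x: sum(P[s] for s in omega if X(s) == x) for x in support}

# E(X) := sum_x x * f_X(x)
E_X = sum(x * f_X[x] for x in support)
print(E_X)  # 7/2
```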
Continuing where you left off: note that $g^{-1}(x)$ is a *set*, so you cannot feed it to $f_X$ directly; instead, expand the probability of the event $\{g(X)=x\}$ as a sum over the preimage, $$ \sum_{x\in\text{Im}(g)}x\,P(\{s\in\Omega\mid g(X(s))=x\})=\sum_{x\in\text{Im}(g)} x\sum_{u\in\text{Im}(X):\,g(u)=x}f_X(u)=\sum_{u\in\text{Im}(X)}g(u)f_X(u). $$ The last equality follows since, as $x$ ranges over the image of $g$, the sets $g^{-1}(x)\cap\text{Im}(X)=\{u\in \text{Im}(X):g(u)=x\}$ partition the image of $X$, and on each such set the outer factor $x$ equals $g(u)$.
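As a sanity check, the two expressions can be compared numerically for a non-injective $g$, where the grouping-by-preimage step actually does something (a sketch; the particular distribution and $g(x)=x^2$ are just an illustrative choice):

```python
from fractions import Fraction

# Hypothetical pmf on {-2, -1, 1, 2}; g(x) = x^2 is not injective,
# so the preimages g^{-1}(x) genuinely partition Im(X) into pairs.
f_X = {u: Fraction(1, 4) for u in (-2, -1, 1, 2)}
g = lambda x: x * x

# Left-hand side: E(g(X)) via the pmf of g(X),
# f_{g(X)}(y) = sum of f_X(u) over u in Im(X) with g(u) = y.
f_gX = {}
for u, p in f_X.items():
    f_gX[g(u)] = f_gX.get(g(u), Fraction(0)) + p
lhs = sum(y * p for y, p in f_gX.items())

# Right-hand side: E(g(X)) = sum_u g(u) f_X(u).
rhs = sum(g(u) * p for u, p in f_X.items())

print(lhs, rhs)  # 5/2 5/2
```

Here $f_{g(X)}(4)=f_X(-2)+f_X(2)$ and $f_{g(X)}(1)=f_X(-1)+f_X(1)$, which is exactly the inner sum over the preimage in the displayed equation.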