This question arose when I saw that some people define a discrete probability distribution using the probability function.
Let $p: \mathbb{R} \to [0,1]$ be a probability function. That is, $S := \{x \in \mathbb{R} : p(x) \neq 0\}$ is countable and $\sum_{x \in S} p(x) = 1$. Is it true that there exists a probability measure $\mu: \mathcal{R} \to [0,1]$ (where $\mathcal{R}$ denotes the Borel $\sigma$-algebra on $\mathbb{R}$) such that $\mu(\{x\}) = p(x)$ for all $x \in \mathbb{R}$?
I think the answer is yes.
Define $\mu: \mathcal{R} \to [0,1]: A \mapsto \sum_{x \in A \cap S} p(x)$. This is well defined: since $p \geq 0$, the series converges absolutely, and hence unconditionally, so its value does not depend on the order of summation.
Then $\mu(\mathbb{R}) = \sum_{x \in S} p(x) = 1$, and if $(A_n)_{n\geq1}$ is a sequence of pairwise disjoint sets in $\mathcal{R}$, then:
$$\mu\left(\bigcup_{n=1}^\infty A_n\right) = \sum_{x \in S \cap\bigcup A_n} p(x) = \sum_{x \in \bigcup (A_n \cap S)} p(x) = \sum_{n=1}^\infty \sum_{x \in A_n\cap S}p(x) = \sum_{n=1}^\infty \mu(A_n)$$
and hence $\mu$ is a probability measure that satisfies the condition.
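The construction above can be checked numerically on a concrete example. The sketch below (my own illustration, not part of the question) takes the hypothetical probability function $p(k) = 2^{-k}$ on $S = \{1, 2, 3, \dots\}$ and computes $\mu(A) = \sum_{x \in A \cap S} p(x)$, truncating $S$ to its first $n$ points:

```python
from fractions import Fraction

# Example probability function: p(k) = 2^{-k} on S = {1, 2, 3, ...},
# which satisfies sum_{k >= 1} 2^{-k} = 1.
def p(k):
    return Fraction(1, 2**k)

# mu(A) = sum over x in A ∩ S of p(x); A is passed as a predicate,
# and S is truncated to {1, ..., n} for the numerics.
def mu(A, n=60):
    return sum(p(k) for k in range(1, n + 1) if A(k))

total = mu(lambda x: True)          # approximates mu(R) = 1
evens = mu(lambda x: x % 2 == 0)    # approximates mu({2, 4, ...}) = 1/3
print(float(total), float(evens))
```

With the truncation at $n = 60$ the results agree with the exact values $1$ and $\sum_{k \geq 1} 4^{-k} = 1/3$ to machine precision, consistent with countable additivity applied to the even singletons.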
Is this correct? As a follow-up question, is a probability measure $\mu$ satisfying the condition I wrote down unique?
You are right, and uniqueness holds. To see this, let $\mu: \mathcal{R} \to [0,1]$ be a probability measure such that $\mu(\{x\}) = p(x)$ for all $x$. Since $S$ is countable, countable additivity gives $\mu(S) = \sum_{x \in S} \mu(\{x\}) = \sum_{x \in S} p(x) = 1$, hence $\mu(S^{c}) = 1 - \mu(S) = 0$, and therefore $\mu(A \cap S^{c}) = 0$ for every $A \in \mathcal{R}$. Then, for every $A \in \mathcal{R}$, we have $$ \mu(A) = \mu((A \cap S) \cup (A \cap S^{c})) = \mu(A \cap S) + \mu(A \cap S^{c}) = \mu(A \cap S) = \mu\left( \bigcup_{x \in A \cap S} \{x\} \right) = \sum_{x \in A \cap S} \mu(\{x\}) = \sum_{x \in A \cap S} p(x), $$ so $\mu$ is completely determined by $p$. In this chain of equalities, we have used the fact that probability measures are $\sigma$-additive (and therefore also finitely additive) twice.