I have a discrete distribution over $K$ actions given by probabilities $p_1, p_2, \ldots, p_K$ such that $\sum\limits_{i \in [K]} p_i = c$, where we think of $c$ as an integer in $[1, K]$ and, for each action $i$, $p_i \in [0,1]$. Expectation in the usual sense is defined as $$ \mathbb{E}[f(X)] = \sum\limits_{i} \Pr(X=i) \, f(i). $$
An intuitive explanation of the above expression is that it is the expected value of the function when a single action is chosen according to the distribution. In my current problem, however (i.e. when the probabilities satisfy $\sum\limits_{i \in [K]} p_i = c$, so the expected number of actions chosen is $c$), I am not able to make sense of the resulting expectation expression. What do $f(X)$ and $\mathbb{E}[f(X)]$ mean in this context?
EDIT: By extending ideas from a "correctly defined" distribution, my current thinking is this: if one were to pick actions according to these probabilities (so that $c$ actions are chosen in expectation), then $\mathbb{E}[f(X)]$ is the expected sum of $f$ over the chosen actions.
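That reading can be checked numerically. A minimal sketch, assuming each action $i$ is included independently with probability $p_i$ (the numbers below are hypothetical): by linearity of expectation, the expected sum of $f$ over the included actions is exactly $\sum_i p_i \, f(i)$, the same expression as above.

```python
import random

# Hypothetical example: K = 4 actions whose marginals sum to c = 2.
p = [0.9, 0.6, 0.3, 0.2]   # sum(p) == 2.0, each p_i in [0, 1]
f = [1.0, 2.0, 3.0, 4.0]   # value f(i) of each action

# Closed form: sum_i p_i * f(i)
expected = sum(pi * fi for pi, fi in zip(p, f))   # ≈ 3.8

# Monte Carlo check: include each action independently with
# probability p_i, then sum f over the included actions.
random.seed(0)
trials = 200_000
total = 0.0
for _ in range(trials):
    total += sum(fi for pi, fi in zip(p, f) if random.random() < pi)
estimate = total / trials

print(expected, estimate)
```

The two printed numbers agree up to Monte Carlo noise, supporting the "expected sum over chosen actions" interpretation (under the independence assumption made here).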
In the general case, expectations are integrals; that is, they "measure volume". In the probability case (so not your case), you consider a universe of all possible outcomes, and you suppose this whole world of outcomes has total volume $1$. When you measure the probability of an event, you actually measure the volume of a subspace of this universe, that is, the "place" your event takes up in the whole universe of possibilities. An impossible event isn't even part of the universe of possibilities, so it has volume (probability) $0$. The event that something happens is the whole universe by definition (something MUST happen), so its volume (probability) is $1$. The larger your event is within the universe of all possibilities, the more probable it is.
Now, the expectation (in the probability sense, discrete case) is just a weighted sum (average) of the outcomes, where the weights are the "volumes" of those outcomes in the world of possibilities. The assumption that the whole universe of possibilities has size $1$ lets you read probabilities as proportions like 50% or 10%, and lets you read expectations as averages.
Now, just suppose your universe of possibilities has size $c$ rather than $1$. The expectation is then the same thing: a weighted sum of outcomes, where the weights are the sizes of the outcomes. It is just not a weighted average anymore. If you wanted to compute what proportion a particular outcome occupies in the world of possibilities, you would divide the size of the outcome by the size of the universe, that is, divide by $c$. By doing this, you would obtain a probability measure again.
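The normalization step can be made concrete. A small sketch with hypothetical numbers: starting from weights that sum to $c$, dividing each weight by $c$ recovers a probability distribution, and the weighted sum of outcomes shrinks by the same factor, turning it back into an ordinary weighted average.

```python
# Hypothetical weights summing to c = 2 (not a probability distribution).
p = [0.9, 0.6, 0.3, 0.2]
f = [1.0, 2.0, 3.0, 4.0]
c = sum(p)                        # 2.0: total size of the "universe"

# Weighted sum of outcomes with the original (unnormalized) weights.
weighted_sum = sum(pi * fi for pi, fi in zip(p, f))

# Dividing each weight by c recovers a probability measure ...
q = [pi / c for pi in p]          # sum(q) == 1.0
# ... under which the same sum is an ordinary weighted average.
weighted_avg = sum(qi * fi for qi, fi in zip(q, f))

print(weighted_sum, weighted_avg)  # weighted_avg == weighted_sum / c
```

So the generalized "expectation" with weights summing to $c$ is simply $c$ times the usual expectation under the normalized distribution.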
Sorry if this was unclear, but the mathematics behind it is probably less clear still.