Modern neural networks often apply the softmax function to their output layer, which is defined as
$$s(x)_i = \frac{e^{x_i}}{\sum_{j=1}^N e^{x_j}}$$
Here, $i$ is the index of an output neuron and $N$ is the total number of output neurons. This means that the output activations sum to 1 and that each output neuron's value lies between 0 and 1:
$$\sum_{i=1}^N s(x)_i = 1$$ $$0 \leq s(x)_i \leq 1$$
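To make the setup concrete, here is a minimal softmax sketch (the max-subtraction is just a standard numerical-stability trick and does not change the result):

```python
import numpy as np

def softmax(x):
    # Subtracting the max keeps exp() from overflowing; the result is identical.
    e = np.exp(x - np.max(x))
    return e / e.sum()

x = np.array([2.0, 1.0, 0.1])
s = softmax(x)
print(s)        # every entry lies in (0, 1)
print(s.sum())  # the entries sum to 1
```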
Given the above, and assuming the network's forward pass contains some source of randomness (e.g. dropout or injected noise), is it possible to derive an upper bound on the variance of an output neuron's activation across $n$ repeated predictions?
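For context, the scenario I have in mind can be sketched as follows; `noisy_forward` is a hypothetical stand-in for any stochastic network (e.g. one with dropout kept active at prediction time), and the variance is estimated empirically over repeated forward passes:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    # Max-subtraction for numerical stability.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def noisy_forward(x):
    # Stand-in for a stochastic network: logits perturbed by Gaussian noise.
    return softmax(x + rng.normal(scale=0.5, size=x.shape))

x = np.array([2.0, 1.0, 0.1])
n = 1000
preds = np.stack([noisy_forward(x) for _ in range(n)])  # shape (n, N)
var_per_neuron = preds.var(axis=0)  # empirical variance of each output neuron
print(var_per_neuron)
```

The question is whether this per-neuron variance admits an analytical upper bound, given only that each output is constrained to $[0, 1]$ and the outputs sum to 1.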