In information theory, a problem that is sometimes encountered is how to discretize / quantize samples into bins. Usually bins are treated as discrete sets: a sample is either inside a bin or outside it.
What worries me a bit is that the corresponding family of indicator functions is piecewise constant, and therefore very non-smooth and unsuitable for calculus methods involving the chain rule.
Can we, despite this, find some suitable smooth family of functions to place in a back-propagation network, so that an algorithm can learn bin placement?
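To make the difficulty concrete, here is a small illustrative sketch (bin edges are hypothetical): a hard binning function built from `np.digitize` has zero derivative almost everywhere, so the chain rule passes no useful gradient through it.

```python
import numpy as np

# Hypothetical bin edges; np.digitize implements "hard" binning.
edges = np.array([0.0, 1.0, 2.0, 3.0])

def hard_bin(x):
    """Piecewise-constant bin index of x."""
    return float(np.digitize(x, edges))

# A central finite difference stands in for the derivative: it is
# zero everywhere except exactly at a bin edge, so gradient-based
# updates of the edges would receive no signal.
x, eps = 1.5, 1e-6
grad = (hard_bin(x + eps) - hard_bin(x - eps)) / (2 * eps)
```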
Own work: I have considered the family of functions:
$$\exp\left(-\frac{|x-\mu|^{n/2}}{\sigma^{n/2}}\right),\quad n\in \mathbb Z^+$$
They are easy to compute and differentiable at every point (except possibly at $x=\mu$, where $|x-\mu|=0$). For $n=2$ we get the kernel of the famous Laplace distribution, and for $n=4$ the Gaussian. Would these be easy to work with in a back-propagation framework? In other words, could one initialize a few of them with random $\mu$ and $\sigma$ and then back-propagate error updates to $\mu$ and $\sigma$ for a set of known samples $x$?
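As a sanity check on the idea, here is a minimal sketch of my own (the data, loss, and initial values are illustrative assumptions, not from any reference) for the $n=4$ Gaussian case: the chain-rule gradients of a squared-error loss are written out by hand and used to fit one soft bin's $\mu$ and $\sigma$ to samples with known hard-bin labels.

```python
import numpy as np

def bump(x, mu, sigma):
    # The n=4 (Gaussian) member of exp(-|x-mu|^(n/2) / sigma^(n/2)),
    # used as a smooth "soft bin" membership function.
    return np.exp(-((x - mu) ** 2) / sigma ** 2)

def fit_bin(x, target, mu, sigma, lr=0.05, steps=2000):
    # Plain gradient descent on mu and sigma for the squared-error
    # loss L = mean((f - target)^2), i.e. the back-propagated updates
    # asked about.  Chain rule for f = exp(-(x-mu)^2 / sigma^2):
    #   df/dmu    = f * 2 (x - mu)   / sigma^2
    #   df/dsigma = f * 2 (x - mu)^2 / sigma^3
    for _ in range(steps):
        f = bump(x, mu, sigma)
        err = f - target
        grad_mu = np.mean(2 * err * f * 2 * (x - mu) / sigma ** 2)
        grad_sigma = np.mean(2 * err * f * 2 * (x - mu) ** 2 / sigma ** 3)
        mu -= lr * grad_mu
        sigma -= lr * grad_sigma
        sigma = max(sigma, 1e-2)  # keep the width positive
    return mu, sigma

# Hypothetical demo: approximate a hard bin centred at 3 (half-width 0.5).
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 6.0, 200)
target = (np.abs(x - 3.0) < 0.5).astype(float)
mu, sigma = fit_bin(x, target, mu=2.5, sigma=1.0)
```

The derivatives stay well-behaved because the factor $f$ decays faster than the polynomial terms grow, which is one reason this family looks friendly to back-propagation; for odd exponents the kink at $x=\mu$ would need a subgradient convention.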
