I need a function f to map an angular value (hue, from color spaces such as HSL and HSB, usually given in degrees in [0,360]) to an input for a neural network that requires inputs in, say, [-1,1].
For a linear-ish function g performed by the neural network, the requirements for f are:
- f(359) to be near f(001),
- so that g(f(359), f2(x2),... fn(xn)) may be near g(f(001), f2(x2),... fn(xn)),
- f must be differentiable (which makes back-propagation work better), and
- f must be injective (except at x=0 and x=360, which represent the same hue).
My conclusion, which I want to be sure of before changing a mountain of code, is that no such f:[0,360]->[-1,1] exists, but there are simple mappings (f1,f2):[0,360]->[-1,1]^2 that do meet the requirements.
The simplest solution that came to mind is to represent the angle as two separate inputs: f1:[0,360]->[-1,1] by f1(x) = sin(2pi x/360) and f2:[0,360]->[-1,1] by f2(x) = cos(2pi x/360). Then a linear-ish network function g gives a result g(f1(359), f2(359), f3(x3), ..., fn(xn)) that is near g(f1(001), f2(001), f3(x3), ..., fn(xn)).
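A minimal sketch of this two-component encoding (the function name is mine); it checks that 359 deg and 1 deg end up close together in the encoded space:

```python
import math

def encode_hue(hue_deg):
    """Map a hue angle in [0, 360] to two inputs in [-1, 1].

    f1 = sin(2*pi*h/360), f2 = cos(2*pi*h/360); the pair is injective
    on [0, 360) and wraps smoothly at 360 -> 0.
    """
    theta = 2.0 * math.pi * hue_deg / 360.0
    return math.sin(theta), math.cos(theta)

# Encodings of 359 deg and 1 deg should be close, as required:
a = encode_hue(359)
b = encode_hue(1)
dist = math.hypot(a[0] - b[0], a[1] - b[1])  # chord length for a 2-degree gap
```

The Euclidean distance between the two encoded points is 2*sin(pi/180), about 0.035, so a roughly linear g sees them as near-identical inputs.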
So, before I change the mountain of code, am I missing anything?
Edit 1: I think the non-existence of a suitable f boils down to continuity (Rolle's Theorem gives the horizontal tangent, but a flat point alone does not rule out injectivity; continuity does): if f is continuous on [0,360] with f(0) = f(360), then f attains an extremum at some c in (0,360), and by the intermediate value theorem values just short of f(c) are attained on both sides of c, so f is not injective on (0,360). Hence no such function [0,360]->[-1,1] exists.
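To illustrate the trade-off numerically (names are mine): a naive one-dimensional rescaling is injective and differentiable but violates the nearness requirement at the wrap-around, while the two-component map does not:

```python
import math

def linear_scale(h):
    # Naive 1-D map [0, 360] -> [-1, 1]: injective and differentiable,
    # but 359 deg and 1 deg land at opposite ends of the interval.
    return h / 180.0 - 1.0

def circular(h):
    # Two-component map: nearby hues stay nearby across the wrap.
    t = 2.0 * math.pi * h / 360.0
    return math.sin(t), math.cos(t)

gap_linear = abs(linear_scale(359) - linear_scale(1))  # nearly the full range
s1, c1 = circular(359)
s2, c2 = circular(1)
gap_circular = math.hypot(s1 - s2, c1 - c2)            # small chord
```

Here gap_linear is about 1.99 (almost the entire output range) while gap_circular is about 0.035, which is the discontinuity the network would otherwise have to learn around.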
Edit 2: Just to put a lid on this, after the change, (1) the neural network appeared to converge much more quickly, and (2) irregularities in the output for reddish colors (hue between 330 and 030) went away.