I have an interesting problem that I need to solve. It's about classifying colors into green, blue, and the others. The word others is crucial here. I have three numbers between 0 and 255, denoted as $[R, G, B]$. I want to generate from them a probability vector $[x, y, z]$, where $x, y, z \geq 0$ and $x + y + z = 1$, satisfying certain natural conditions, supplemented with the conditions mentioned below to make it unique and naturally determined.
First, I require the following: when [*] $R=0$, then $x=0$, $y=\frac{G}{G+B}$, and $z=\frac{B}{G+B}$. If $G=B=0$, then $x=1$ and $y=z=0$. Likewise, if $R\geq \max(G,B)$, then $x=1$ and $y=z=0$. For values $0<R<255$, both coordinates $y$ and $z$ should decrease exponentially as $R$ increases, while $x$ should increase exponentially.
Furthermore, the formula for $G=B$ should be symmetric in $y$ and $z$, i.e., $[R, G, G]$ results in $[x, y, y] = [1-2\cdot y, y, y]$.
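To make these conditions concrete, here is one elementary candidate that satisfies them exactly; the decay profile $e^{-\lambda\, s/(1-s)}$ with $s = R/\max(G,B)$, and the name `color_probs`, are my own illustrative choices, not part of the problem statement:

```python
import math

def color_probs(R, G, B, lam=1.0):
    """Map RGB to [x, y, z] = P(other/red), P(green), P(blue)."""
    m = max(G, B)
    if m == 0 or R >= m:                 # G = B = 0, or red dominates: x = 1
        return [1.0, 0.0, 0.0]
    s = R / m                            # s in [0, 1); s = 0 when R = 0
    decay = math.exp(-lam * s / (1.0 - s))   # equals 1 at R = 0, -> 0 as R -> m
    return [1.0 - decay, decay * G / (G + B), decay * B / (G + B)]
```

At $R=0$ this reduces to $[0, \frac{G}{G+B}, \frac{B}{G+B}]$; for $R\geq\max(G,B)$ it gives $[1,0,0]$; and for $G=B$ the output has the symmetric form $[1-2y, y, y]$.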
I hope that all of these conditions taken together are not contradictory. If they are, I might relax some of them or exploit the probabilistic nature of $[x, y, z]$. If the model is not fully specified, I would leave the remaining freedom to a free parameter $\lambda$.
The formula I would like to apply, in the simplest possible form, is the $\operatorname{softmax}$ function, defined for a vector $[a, b, c]$ as follows: $x=\frac{e^a}{e^a+e^b+e^c}$, and similarly for $y$ and $z$. Here, $e^a$ is Euler's number raised to the power of $a$.
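For reference, a minimal implementation of this softmax, with the standard max-shift for numerical stability:

```python
import math

def softmax(a, b, c):
    # Shift all logits by their maximum before exponentiating; softmax is
    # invariant under adding a constant to all inputs, and this avoids overflow.
    m = max(a, b, c)
    ea, eb, ec = math.exp(a - m), math.exp(b - m), math.exp(c - m)
    s = ea + eb + ec
    return [ea / s, eb / s, ec / s]
```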
Or can the sigmoid function be used instead?
Now, the task is to construct an analogous function, sophisticated_softmax, satisfying the above conditions out of the ordinary softmax function, $\min(a, b)$, $\min(b, c)$, etc. (similarly for $\max$), the absolute value $|\cdot|$, and the functions $\ln$ and $e^{-a}$, in all possible combinations.
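Here is one guess at such a sophisticated_softmax, built only from softmax, $\max$, and $\ln$. The logit choice $a = \lambda \ln\frac{R}{\max(G,B)-R}$ is my own assumption, picked so that the boundary conditions hold in the limit; the exact boundary cases are handled by explicit branches, since softmax never reaches $0$ or $1$ at finite logits:

```python
import math

def sophisticated_softmax(R, G, B, lam=1.0, eps=1e-9):
    """Softmax over the logits [lam * ln(R / (max(G,B) - R)), ln G, ln B].

    As R -> 0 the first logit tends to -infinity, so the output tends to
    [0, G/(G+B), B/(G+B)]; as R -> max(G, B) it tends to +infinity, so
    the output tends to [1, 0, 0].
    """
    m = max(G, B)
    if m == 0 or R >= m:                      # condition: x = 1 exactly
        return [1.0, 0.0, 0.0]
    if R == 0:                                # condition [*]: x = 0 exactly
        return [0.0, G / (G + B), B / (G + B)]
    a = lam * math.log(R / (m - R))           # grows from -inf to +inf on (0, m)
    bl, cl = math.log(G + eps), math.log(B + eps)
    t = max(a, bl, cl)                        # max-shift for stability
    ea, eb, ec = math.exp(a - t), math.exp(bl - t), math.exp(cl - t)
    s = ea + eb + ec
    return [ea / s, eb / s, ec / s]
```

For $G=B$ the two chroma logits coincide, so the output is automatically symmetric in $y$ and $z$.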
Could you handle this? I'm happy to answer any questions, especially ones related to fulfilling the above rules. To start, we can work with just two-component vectors $[x, y] = [x, 1-x]$, i.e., with inputs $[R, G]$, and then somehow extend to three components $[x, y, z]$, even though I currently don't know whether that makes sense.
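The two-component case is also where the sigmoid question above resolves itself: a two-class softmax collapses exactly to a sigmoid of the logit difference, so for $[x, 1-x]$ the two formulations are equivalent:

```python
import math

def sigmoid(t):
    return 1.0 / (1.0 + math.exp(-t))

def softmax2(a, b):
    # e^a / (e^a + e^b) = 1 / (1 + e^(b - a)) = sigmoid(a - b),
    # so the two-class softmax is precisely a sigmoid of a - b.
    return [sigmoid(a - b), sigmoid(b - a)]
```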
Simple normalization like $x=\frac{a}{a+b+c}$ cannot be used: it fails, for example, the requirement that $x=1$ whenever $R\geq\max(G,B)$. Moreover, the equation denoted as [*] may need a little tweaking, similar to the formula above. Maybe differential equations with initial conditions would be suitable, where the vector $[x, y, z]$ is the value of a solution, at some suitable point, of an appropriate differential equation, with both the point and the equation possibly depending on $[R, G, B]$.
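To see the failure concretely: plain normalization assigns yellow $[255, 255, 0]$ the vector $[0.5, 0.5, 0]$, whereas the rules above demand $[1, 0, 0]$ because $R \geq \max(G, B)$:

```python
def naive_probs(R, G, B):
    s = R + G + B
    return [R / s, G / s, B / s]

# naive_probs(255, 255, 0) -> [0.5, 0.5, 0.0], but the rules demand x = 1 here.
```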
Such a differential equation must be justified in a natural way from our starting point. Another goal should be to design a neural network that demonstrates this theoretical calculation on real data, although I'm not sure if it's beyond my abilities.
A bit of history on the problem: I encountered it while trying to teach a neural network to recognize blue-green images. However, I couldn't get it to classify yellow $[255, 255, 0]$ or pure red images (neither blue nor green) as the vector $[1, 0, 0]$, blue as $[0, 0, 1]$, and green as $[0, 1, 0]$. It always classified pure blue and blue-green mixtures correctly, but it labeled purely red images as pure blue.
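On the neural-network side, the misbehavior described above can be avoided with even a single softmax layer, provided yellow is explicitly labeled as "other" in the training data. The four prototype colors, their labels, and the hyperparameters below are my own toy choices, not the original experiment:

```python
import math

# Prototype colors (RGB scaled to [0, 1]) with hypothetical labels:
# class 0 = "other" (red-dominant), class 1 = green, class 2 = blue.
data = [((1.0, 0.0, 0.0), 0),   # pure red -> other
        ((1.0, 1.0, 0.0), 0),   # yellow   -> other
        ((0.0, 1.0, 0.0), 1),   # green
        ((0.0, 0.0, 1.0), 2)]   # blue

W = [[0.0] * 3 for _ in range(3)]   # one weight row per class
b = [0.0] * 3

def predict(x):
    """Softmax over the three class logits W @ x + b."""
    z = [sum(W[k][j] * x[j] for j in range(3)) + b[k] for k in range(3)]
    m = max(z)
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

# Plain per-sample gradient descent on the cross-entropy loss.
lr = 0.5
for _ in range(2000):
    for x, label in data:
        p = predict(x)
        for k in range(3):
            g = p[k] - (1.0 if k == label else 0.0)   # dLoss/dlogit_k
            for j in range(3):
                W[k][j] -= lr * g * x[j]
            b[k] -= lr * g
```

After training, yellow lands in the "other" class rather than in blue or green, which is exactly the behavior I could not obtain originally.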
Thanks for reading this text and perhaps for suggesting some solution(s).