Suppose I have several records, each either good or bad: $x$ counts the good records and $y$ counts the bad ones, with each record counting as one. The sample size is $n = x + y$.
To score the balance of good and bad I apply the formula $(x - y)/n$, which returns a value in $[-1, 1]$.
How would I convert this value to a rate from $0$ to $100$, where $0$ corresponds to $-1$ and $100$ corresponds to $1$?
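For concreteness, here is a small sketch of the score described above together with one natural linear rescaling from $[-1, 1]$ to $[0, 100]$ (the function names are my own, chosen for illustration):

```python
def balance_score(x: int, y: int) -> float:
    """Score in [-1, 1]: +1 when all records are good, -1 when all are bad."""
    n = x + y
    return (x - y) / n

def rate_from_counts(x: int, y: int) -> float:
    """Linearly map the balance score from [-1, 1] onto [0, 100]."""
    score = balance_score(x, y)
    # Shift [-1, 1] to [0, 2], then scale by 50 to reach [0, 100].
    return 50.0 * (score + 1.0)
```

Note that the rescaling simplifies algebraically: $50\left(\frac{x-y}{n}+1\right) = \frac{100x}{n}$, i.e. the rate is just the percentage of good records.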