Credit scores are numbers between 350 and 850.
Let's say someone's score is 600.
How can you determine what that score of 600 would be if it were measured on a scale from 0 to 100 rather than from 350 to 850?
I'm interested in the relative differences between scores, which to my mind are easier to see on a 0 to 100 range than with the intentionally obscure method that is the norm.

First subtract $350$, so that the range changes from $350-850$ to $0-500$.
Then divide by $5$, so that the range changes from $0-500$ to $0-100$.
So $600$ becomes $(600-350)/5=250/5=50$ on the scale of $0$ to $100$.
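
If you want to play with this, here is a quick Python sketch of the same linear rescaling (the function name `rescale_0_100` is just for illustration):

```python
def rescale_0_100(score):
    """Map a credit score from the 350-850 range onto 0-100.

    Shift so the range starts at 0, then divide by 5, the ratio
    of the old range width (500) to the new one (100).
    """
    return (score - 350) / 5

print(rescale_0_100(600))  # 50.0
```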
If you insist on using $1-100$ instead of $0-100$, then things get a bit uglier: you would have to subtract $350$, then multiply by $99/500$, and finally add $1$, so that $350$ maps to $1$ and $850$ maps to $100$.
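
And the uglier $1-100$ version, again just as a sketch:

```python
def rescale_1_100(score):
    """Map a credit score from 350-850 onto 1-100.

    Compress the 500-point range into 99 points, then shift up
    by 1 so that 350 lands on 1 and 850 lands on 100.
    """
    return (score - 350) * 99 / 500 + 1

print(rescale_1_100(600))  # 50.5
```

Note that $600$ maps to $50.5$ rather than $50$ here, since the midpoint of $1-100$ is $50.5$.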