y = brightness in percentage
x = apparent magnitude
What is a formula that would produce the graph below, with y representing the brightness in percentage and x the apparent magnitude (including negative magnitudes)?
To clarify: each increase of 1 in apparent magnitude means the star has only 100/(fifth root of 100) ≈ 39.8 percent of the illumination it had before.


The definition of magnitude in astronomy is that a difference of $5$ in magnitude corresponds to a factor of $100$ in brightness, so an increase of $1$ in magnitude is a reduction in brightness by a factor of $100^{1/5}\approx 2.512$ (often rounded to $2.5$). If $x$ is the apparent magnitude and we normalize so that $y=100$ at $x=1$, then $$y=100\cdot\left(100^{1/5}\right)^{1-x}.$$
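
For completeness, here is a short worked derivation. Each step of $+1$ in magnitude multiplies the brightness by $100^{-1/5}$, and the anchor $y(1)=100$ is the normalization used above (the graph itself is not reproduced here, so if it actually puts $100\%$ at $x=0$, replace $1-x$ with $-x$):
$$
\begin{aligned}
y(x) &= 100\cdot\left(100^{-1/5}\right)^{x-1} && \text{one factor of }100^{-1/5}\text{ per magnitude}\\
     &= 100\cdot 10^{\frac{2(1-x)}{5}}\\
     &= 100^{\frac{6-x}{5}}.
\end{aligned}
$$
As a sanity check: $y(1)=100$, $y(6)=100^{0}=1$ (five magnitudes dimmer is $1\%$ of the brightness), and $y(-4)=100^{2}=10\,000$, so negative magnitudes simply give brightness values above $100\%$.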