I have a neural-network model in which each neuron is associated with an angle $\theta$. Firing rate as a function of $\theta$ is either a Gaussian or a constant.
It has been claimed that, in this network, the modulation ratio is the same in two different conditions. The modulation ratio is defined as
$$ \frac{f_A(\theta)}{f_B(\theta)}, $$
where $f_A$ and $f_B$ are two different firing-rate functions, compared across all $\theta$.
In condition 1, $f_A$ is a Gaussian, and $f_B$ is a constant.
In condition 2, $f_A$ is a Gaussian, and $f_B$ is also a Gaussian.
Is it possible for the modulation ratio to be the same in these two conditions? If so, what does that imply about the relative widths of the Gaussians involved?
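For reference, here is a minimal numerical sketch of the two conditions. It assumes zero-centered Gaussians of the form $A \exp(-\theta^2 / 2\sigma^2)$ (the amplitudes and widths below are arbitrary choices, not from the model), and checks whether each ratio is itself Gaussian-shaped by testing whether its log is quadratic in $\theta$:

```python
import numpy as np

def gaussian(theta, amp, sigma):
    # Zero-centered Gaussian firing-rate profile (assumed form).
    return amp * np.exp(-theta**2 / (2.0 * sigma**2))

theta = np.linspace(-np.pi / 2, np.pi / 2, 1001)

# Condition 1: Gaussian numerator, constant denominator.
ratio1 = gaussian(theta, amp=10.0, sigma=0.5) / 2.0

# Condition 2: Gaussian numerator and Gaussian denominator,
# with the numerator narrower than the denominator (sigma_A < sigma_B).
sigma_A, sigma_B = 0.4, 0.8
ratio2 = (gaussian(theta, amp=10.0, sigma=sigma_A)
          / gaussian(theta, amp=2.0, sigma=sigma_B))

# A Gaussian-shaped ratio is log-quadratic in theta:
#   log(ratio) = const - theta**2 / (2 * sigma_eff**2),
# so fit a quadratic to log(ratio2) and read off the effective width.
coeffs = np.polyfit(theta, np.log(ratio2), 2)
sigma_eff = np.sqrt(-1.0 / (2.0 * coeffs[0]))
print(sigma_eff)  # effective width of the condition-2 ratio
```

With these placeholder parameters, the fitted `sigma_eff` comes out to $1/\sqrt{1/\sigma_A^2 - 1/\sigma_B^2}$, which is one way to probe how the widths in the two conditions would have to relate.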