I am given three pairs of (pressure $p$, molar volume $V_m$) measurements at constant known temperature $T$: (0.750000, 29.8649), (0.500000, 44.8090), (0.250000, 89.6384). The goal is to determine the ideal gas constant $R$ by plotting $p$ vs $1/V_m$ or $V_m$ vs $1/p$ and using the ideal gas law $pV_m = RT$. However, these plots produce best-fit lines with slightly different slopes. I used Microsoft Excel to do this. The resulting equations of the best-fit lines and $R^2$ values are: $$y = 22.3932159243x + 0.0002057994 \qquad R^2 = 0.9999999746$$ for the $p$ vs $1/V_m$ plot, and $$y = 22.4149788462x + 0.0214038462 \qquad R^2 = 0.9999999998$$ for the $V_m$ vs $1/p$ plot.
I am completely confused by this and would appreciate it if someone could shed some light on this "paradox".
This is perfectly normal in the least-squares sense.
Suppose that the data are $(x,y)$. If you fit $y = a + b x$, you minimize the vertical distances, i.e. the residuals in $y$; if you instead fit $x = a' + b' y$, you minimize the residuals in $x$. So, unless there is absolutely no noise in the data, the two regressions cannot give the same slope.
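To see this asymmetry concretely, here is a short sketch (using NumPy's ordinary least-squares `polyfit`, which applies the same criterion as Excel's trendline) that reproduces the two different slopes from your data:

```python
import numpy as np

# The three (p, V_m) measurements from the question.
p = np.array([0.750000, 0.500000, 0.250000])
Vm = np.array([29.8649, 44.8090, 89.6384])

# Fit p = slope * (1/Vm) + intercept: residuals are minimized in p.
slope_p, intercept_p = np.polyfit(1.0 / Vm, p, 1)

# Fit Vm = slope * (1/p) + intercept: residuals are minimized in Vm.
slope_V, intercept_V = np.polyfit(1.0 / p, Vm, 1)

print(slope_p)  # ~22.3932, matching the p vs 1/V_m trendline
print(slope_V)  # ~22.4150, matching the V_m vs 1/p trendline
```

Because each fit minimizes residuals along a different axis, the two slopes only coincide when the data lie exactly on a line.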
In fact, if I had this problem, what I would do is write $$P\,V=c$$ and minimize with respect to $c$ the sum of squares $$SSQ=\sum_{i=1}^n (P_i\, V_i -c)^2$$ Setting $\frac{\partial\, SSQ}{\partial c}=0$ just gives the mean of the products, $$c=\frac 1 n \sum_{i=1}^n P_i\, V_i$$ With your data $c=\frac{2688511}{120000}\approx 22.4043$, which is neither $22.3932$ nor $22.4150$ but something in the middle.
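This symmetric estimate is a one-liner; a minimal sketch with the same data:

```python
import numpy as np

p = np.array([0.750000, 0.500000, 0.250000])
Vm = np.array([29.8649, 44.8090, 89.6384])

# Minimizing SSQ = sum((P_i * V_i - c)^2) over c gives the mean of the products.
c = np.mean(p * Vm)
print(c)  # ~22.4043, between the two regression slopes
```

Unlike the two regressions, this treats $P$ and $V$ symmetrically, so it does not depend on which variable you place on which axis.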