In chromatography, the instrument's signal, when plotted against time, is shaped like a Gaussian peak. https://en.wikipedia.org/wiki/Chromatography#/media/File:Rt_5_12.png
Background: one way to measure chromatographic efficiency is to calculate the "number of theoretical plates," N = (time of peak maximum)² / (variance of the peak in time units). It is a dimensionless number that essentially shows how narrow the peak is. I have searched the work of Martin & Synge (Nobel Prize), since they originated the theoretical plate concept, but nowhere do they use this equation. Yet every modern textbook defines the number of theoretical plates as (peak maximum)² / (variance of the peak). One can use the width of the peak instead of the variance, assuming the peak is Gaussian. No reasoning is given for why this measure was invented. https://www.shimadzu.com/an/hplc/support/lib/lctalk/theoretical_plate.html
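To make the equivalence concrete, here is a minimal sketch (assuming a noiseless Gaussian peak with an arbitrary retention time and standard deviation) showing that the moment-based definition N = t_R²/σ² agrees with the half-width formula N = 5.54·(t_R/w_half)² given in the Shimadzu link:

```
import numpy as np

t = np.linspace(0, 10, 100001)          # time axis (arbitrary units)
t_R, sigma = 5.0, 0.12                  # assumed retention time and peak std dev
signal = np.exp(-0.5 * ((t - t_R) / sigma) ** 2)

# First moment (mean retention time) and second central moment (variance)
area = np.trapz(signal, t)
mean = np.trapz(t * signal, t) / area
var = np.trapz((t - mean) ** 2 * signal, t) / area
N_moments = mean ** 2 / var

# Width at half height; for a Gaussian, w_half = 2.355 * sigma
half = signal >= 0.5 * signal.max()
w_half = t[half][-1] - t[half][0]
N_width = 5.54 * (mean / w_half) ** 2

print(f"N from moments:    {N_moments:.0f}")
print(f"N from half-width: {N_width:.0f}")   # both come out near (t_R/sigma)^2 ~ 1736
```

The two estimates match (up to the rounding in the 5.54 factor, which is really 8·ln 2 ≈ 5.545), because for a Gaussian the half-width is just 2.355σ.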
Is there any mathematical analogue that explains why one would divide the (mean)² by the variance, or, more rigorously, is there a physical interpretation of the first moment squared divided by the variance of a distribution?
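To state the quantity precisely (my notation, writing μ for the first moment and σ² for the variance of the peak treated as a distribution in time):

$$N = \frac{\mu^2}{\sigma^2} = \left(\frac{\mu}{\sigma}\right)^{2} = \frac{1}{\mathrm{CV}^2},$$

where CV = σ/μ is the coefficient of variation, so N is just the reciprocal of the squared relative spread.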
Thank you.