For an experiment, the standard deviation (SD) is entered as a parameter for generating Gaussian noise. The SD is given as 25 when the data (a matrix) has integer values in $[0,255]$. If the data were scaled to lie in $[0,1]$, what would the SD parameter be? Would it be $25/(\text{max of the data values})$?
Asked 2026-03-27 20:31:32
Clarification: what is the expression for the standard deviation when the data is scaled to the interval $[0,1]$?
77 views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
1
To scale the data from $[0,255]$ to $[0,1]$, you multiply by a scale factor of $1/255$. The standard deviation scales linearly with the data: if $Y = aX$, then $\operatorname{Var}(Y) = a^2 \operatorname{Var}(X)$, so $\sigma_Y = |a|\,\sigma_X$. The SD of the scaled noise is therefore $25/255 \approx 0.098$. So yes, your guess of $25/(\text{max of the data values})$ is correct here, since the maximum is $255$.
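As a quick empirical check (a minimal sketch using NumPy; the sample size and seed are arbitrary choices), you can draw Gaussian noise with SD 25, divide it by 255, and verify that the sample SD of the scaled noise is close to $25/255$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Gaussian noise with SD 25, as used on data with values in [0, 255]
noise = rng.normal(loc=0.0, scale=25.0, size=1_000_000)

# Scaling the data by 1/255 scales the additive noise by the same factor,
# so the scaled noise should have SD 25/255.
scaled = noise / 255.0

print(noise.std())   # close to 25
print(scaled.std())  # close to 25/255 ~= 0.098
```

The same logic applies to any linear rescaling: whatever factor you apply to the data, apply it to the noise SD as well.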