Let's say $f = X_1 + X_2$ and I am trying to find the standard deviation (SD) of $f$. Through error propagation, the SD of $f$ is $\sigma_f = \sqrt{\Delta X_1^2 + \Delta X_2^2}$, where $\Delta X_1 = \sqrt{X_1}$ and $\Delta X_2 = \sqrt{X_2}$ (counting statistics), so $\sigma_f = \sqrt{X_1 + X_2}$. Now, if I wanted the "standard deviation of $\sigma_f$", could I apply error propagation a second time? That would give
$$\Delta\sigma_f = \Delta\sqrt{X_1 + X_2} = \frac{1}{2}\,\frac{\Delta(X_1 + X_2)}{\sqrt{X_1 + X_2}} = \frac{1}{2}\,\frac{\sqrt{X_1 + X_2}}{\sqrt{X_1 + X_2}} = \frac{1}{2},$$
where $\Delta(X_1 + X_2) = \sqrt{X_1 + X_2}$ follows from applying error propagation to the sum, as above.
So my question is: can I say that the SD of the SD of $f$ is $1/2$?
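For what it's worth, here is a quick Monte Carlo sketch of the claim, assuming $X_1$ and $X_2$ are Poisson counts (the hypothetical means `lam1` and `lam2` are my own choice, just for illustration). It draws many realizations of the counts, computes $\sigma_f = \sqrt{X_1 + X_2}$ for each, and looks at the spread of those $\sigma_f$ values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical true means for the two counts (assumed for illustration only)
lam1, lam2 = 50.0, 70.0
n_trials = 200_000

# Simulate many independent measurements of X1 and X2
X1 = rng.poisson(lam1, n_trials)
X2 = rng.poisson(lam2, n_trials)

# Propagated SD of f = X1 + X2, computed for each trial
sigma_f = np.sqrt(X1 + X2)

# Empirical SD of sigma_f across trials; should be close to 0.5 for large counts
print(sigma_f.std(ddof=1))
```

For large counts this spread does come out near $1/2$, which matches the delta-method result $\mathrm{Var}(\sqrt{Y}) \approx \bigl(\tfrac{1}{2\sqrt{\mu}}\bigr)^2 \mu = \tfrac{1}{4}$ for $Y$ Poisson with large mean $\mu$; for small counts the approximation gets worse.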