Let $x$ be a set of measured values with uncertainty $\sigma_x$, and $y$ a set of measured values with uncertainty $\sigma_y$. By the error-propagation formula $\sigma_f^2=\sum_{i=1}^{N}\left(\dfrac{\partial f}{\partial x_i}\right)^2\sigma_i^2$ we get that $\sigma_{x+y}=\sqrt{\sigma_x^2+\sigma_y^2}$.
Now my question is: can't we derive the above without using that formula, just from the fact that the uncertainty is the standard deviation of the set of values? By my logic, $\sigma_{x+y}=\sqrt{\dfrac{1}{N}\sum_{i=1}^{N}\left[(x_i+y_i)-\overline{x+y}\right]^2}=\sqrt{\sigma_x^2+\sigma_y^2}$, where $N$ denotes the number of members in the sample, $\overline{x+y}$ denotes the mean of $x+y$, and $\sigma_x,\sigma_y$ denote the standard deviations of the two samples. I tried to show this and I simply can't. Am I going wrong somewhere with my logic?
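For concreteness, here is the numerical check I have in mind (the samples are made-up standard normals, and I'm using NumPy; `ddof=0` matches the $1/N$ in the definition above):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=1000)  # hypothetical sample of measured values
y = rng.normal(size=1000)

# Population standard deviations (1/N normalization, i.e. ddof=0)
sx = np.std(x)
sy = np.std(y)

# Left-hand side: standard deviation of the element-wise sums
s_sum = np.std(x + y)

# Right-hand side: the error-propagation result
s_prop = np.sqrt(sx**2 + sy**2)

print(s_sum, s_prop)
```

For samples like these the two printed numbers come out close but not exactly equal, which is the discrepancy I can't account for algebraically.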