If $x, y$ are two mathematical objects/variables, and $f$ is some function, then by “adding in quadrature” I mean that $(f(x+y))^2 = (f(x))^2 + (f(y))^2$.
For example, we have the Pythagorean Theorem for inner product spaces: for orthogonal vectors $u, v$, we have $$\lVert {u + v} \rVert^2 = \lVert {u} \rVert^2 + \lVert {v} \rVert^2.$$
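(To spell out why this holds: expanding the norm via the inner product gives $$\lVert u + v \rVert^2 = \langle u + v, u + v \rangle = \lVert u \rVert^2 + 2\operatorname{Re}\langle u, v \rangle + \lVert v \rVert^2,$$ and orthogonality, $\langle u, v \rangle = 0$, is exactly the condition that kills the cross term.)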
For two independent random variables $X, Y$, the variance of the sum is the sum of variances: $$\sigma_{X+Y}^2 = \sigma_{X}^2 + \sigma_{Y}^2.$$
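(Here too a cross term disappears: for any two random variables, $$\sigma_{X+Y}^2 = \sigma_{X}^2 + \sigma_{Y}^2 + 2\operatorname{Cov}(X, Y),$$ and independence — in fact mere uncorrelatedness — forces $\operatorname{Cov}(X, Y) = 0$.)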
Also, I believe that in error propagation, when measurement errors are uncorrelated, the uncertainties add in quadrature.
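(Concretely, if I recall the standard propagation rule correctly: for a derived quantity $q = x + y$ with uncorrelated measurement uncertainties $\delta x$ and $\delta y$, $$(\delta q)^2 = (\delta x)^2 + (\delta y)^2,$$ which is again the vanishing of a covariance-type cross term.)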
My question is: In general, when might we have $$(f(x+y))^2 = (f(x))^2 + (f(y))^2?$$ And why does this pattern recur? In all the examples above there was some notion of independence: orthogonality, independence of random variables, and uncorrelatedness each capture a sense in which things are “independent”.
My attempt at answering it might be: “If the square of $f(x+y)$ can be written purely in terms of the original function values $f(x)$ and $f(y)$, with no cross term, this signals some independence between the variables.” But why should squaring, in particular, be the relevant operation?

Another thing I found interesting: if instead of squares we take first powers, the condition becomes $f(x + y) = f(x) + f(y)$, which is just additivity. Is there something about squaring that makes it a kind of “second-order” condition, with additivity as the “first-order” one? Perhaps analogous to the first-derivative and second-derivative tests? More generally, is it an interesting question to ask when $(f(x+y))^n = (f(x))^n + (f(y))^n$, or some similar question?