I recently, and incorrectly, assumed that applying a non-linear operation to a completely uncorrelated sequence would yield another uncorrelated sequence. It turns out to be trivially easy to show that this is not the case: let $s_n \in \{-1,1\}$ be a random sequence with any distribution. Then $s_n^2 = 1$ for all $n$, which is deterministic.
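As a quick sanity check, here is a minimal Python sketch of the counterexample (the sequence length and use of `random.choice` are just illustrative choices):

```python
import random

# A +/-1 sequence drawn uniformly at random.
s = [random.choice([-1, 1]) for _ in range(10)]

# Squaring collapses it to the constant sequence 1, 1, 1, ...
squared = [x ** 2 for x in s]

# Cubing maps +/-1 back to +/-1, so the sequence is unchanged.
cubed = [x ** 3 for x in s]

print(all(x == 1 for x in squared))  # True: randomness destroyed
print(cubed == s)                    # True: randomness preserved
```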
Is there a mathematical theorem giving the conditions under which a random sequence is made deterministic by the application of a non-linear transform? Whether the transformed sequence remains random clearly depends on the values taken by $s_n$, and also on the non-linearity itself (e.g. $s_n^3$ is still random in the example above).