Non-linear function with properties related to Gaussian distribution


Let $\mathcal{A}_n$ be the set of functions $f : \mathbb{R} \to \mathbb{R}^n$ such that:

  • $f$ is continuous and $f$ is differentiable almost everywhere
  • if $X \sim \mathcal{N}(0, 1)$, then:
    • $E[f(X)] = \mathbf{0}$
    • $E[{\lVert f(X) \rVert}^2] = 1$
    • $E[{\lVert {f'}(X) \rVert}^2] = 1$

(where $\lVert \cdot \rVert$ is the Euclidean norm on $\mathbb{R}^n$)

Could you find a non-linear function that belongs to $\mathcal{A}_n$, or prove the following statement:

$$\forall n \in \mathbb{N^*}, \forall f \in \mathcal{A}_n, f \, \text{is linear}$$

After doing some experiments, I'm almost convinced that the statement is true, but I have no idea how to prove it. If, on the contrary, $\mathcal{A}_n$ contains non-linear functions, then these could be used as (multi-dimensional) activation functions in neural networks.
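To illustrate the kind of experiment involved, here is a minimal Monte Carlo sketch in Python (NumPy assumed). The candidate $f(x) = c\,(x^2 - 1)$ is just one illustrative nonlinear choice: it satisfies the first two conditions after normalization, but the third fails.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)  # samples of X ~ N(0, 1)

# Illustrative nonlinear candidate: g(x) = x^2 - 1, which has E[g(X)] = 0.
# Normalize by c so that E[f(X)^2] = 1, then check E[f'(X)^2].
g = x**2 - 1
c = 1 / np.sqrt(np.mean(g**2))  # E[g^2] = E[X^4] - 2 E[X^2] + 1 = 2, so c = 1/sqrt(2)
f = c * g
df = 2 * c * x                  # derivative of c * (x^2 - 1)

print(np.mean(f))               # ≈ 0
print(np.mean(f**2))            # ≈ 1
print(np.mean(df**2))           # ≈ 2, not 1 -- the third condition fails
```

Every normalized nonlinear candidate I tried behaves like this: $E[f'(X)^2]$ comes out strictly larger than $E[f(X)^2]$.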

Best answer:

Edit: Thanks for pointing out the mistake, I have corrected it.

No such function exists. I will first consider the case $n=1$. Because $E[f(X)^2]=1$ is finite, we can expand $f$ as a (possibly infinite) sum of Hermite polynomials, $f(x)=\sum_{i=1}^\infty c_i He_i(x)$ (the constant term $He_0(x)=1$ is absent because $E[f(X)]=0$). The crucial property we will need is that these polynomials are orthogonal with respect to the Gaussian density; more precisely, $$E[He_i(X)He_j(X)]=\delta_{ij}\, i!$$

In particular,

$$Ef^2=\sum_{i=1}^\infty c_i^2\, i!$$
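As a quick numerical sanity check (not part of the proof), the orthogonality relation can be verified by Monte Carlo, assuming NumPy's `numpy.polynomial.hermite_e` module (which implements the probabilists' Hermite polynomials $He_i$):

```python
import numpy as np
from numpy.polynomial import hermite_e as He
from math import factorial

rng = np.random.default_rng(1)
x = rng.standard_normal(1_000_000)  # samples of X ~ N(0, 1)

def he(i, x):
    """Evaluate the probabilists' Hermite polynomial He_i at x."""
    coeffs = np.zeros(i + 1)
    coeffs[i] = 1.0
    return He.hermeval(x, coeffs)

# E[He_i(X) He_j(X)] should be i! when i == j and 0 otherwise.
for i in range(1, 4):
    for j in range(1, 4):
        est = np.mean(he(i, x) * he(j, x))
        exact = factorial(i) if i == j else 0
        print(f"i={i} j={j}  estimate={est:7.3f}  exact={exact}")
```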

Furthermore, we will use the property $$He_i'(x)=i\,He_{i-1}(x)$$ From this we get $f'(x)=\sum_{i=1}^\infty c_i\, i\, He_{i-1}(x)$, so

$$Ef'^2=\sum_{i=1}^\infty c_i^2\, i^2\, (i-1)!=\sum_{i=1}^\infty c_i^2\, i\, i!$$
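This coefficient computation can also be checked numerically. The sketch below (again assuming NumPy's `hermite_e` module; the coefficient vector is an arbitrary illustrative choice) compares Monte Carlo estimates of $Ef^2$ and $Ef'^2$ against the closed forms $\sum_i c_i^2\, i!$ and $\sum_i c_i^2\, i\, i!$:

```python
import numpy as np
from numpy.polynomial import hermite_e as He
from math import factorial

rng = np.random.default_rng(2)
x = rng.standard_normal(1_000_000)  # samples of X ~ N(0, 1)

# Illustrative coefficients (c_0 = 0, then c_1, c_2, c_3) for f = sum_i c_i He_i.
c = np.array([0.0, 0.5, 0.3, 0.2])

f = He.hermeval(x, c)                 # f(x)
fp = He.hermeval(x, He.hermeder(c))   # f'(x), using He_i' = i He_{i-1}

# Closed forms from the orthogonality relation.
Ef2 = sum(c[i]**2 * factorial(i) for i in range(1, 4))       # sum c_i^2 i!  = 0.67
Efp2 = sum(c[i]**2 * i * factorial(i) for i in range(1, 4))  # sum c_i^2 i i! = 1.33

print(np.mean(f**2), Ef2)
print(np.mean(fp**2), Efp2)
```

As expected, $Ef'^2 > Ef^2$ as soon as some $c_i$ with $i>1$ is nonzero.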

Now, the coefficient of $c_i^2$ in this sum is $i\, i!$, which is strictly larger than the corresponding coefficient $i!$ in $Ef^2$, unless $i=1$. Since all the terms are non-negative, the only way the two sums can be equal is if $c_i=0$ for all $i>1$. Thus $f$ must be of the form $f(x)=c_1 He_1(x)=c_1 x$, which is linear.

For higher dimensions, write $f(x)=(f_1(x),\dots, f_n(x))$. Applying the one-dimensional argument to each component (each satisfies $E[f_i(X)]=0$) gives $Ef_i^2\leq Ef_i'^2$, with equality if and only if $f_i$ is linear. Therefore, if $E\lVert f\rVert^2=E\lVert f'\rVert^2$, summing over components forces equality in every coordinate, so each $f_i$, and hence $f$, must be linear.