I would like to know whether the following approximation is possible.
I have a vector $\mathbf{x} \in \mathbb{R}^N$ whose elements are random variables. There is a matrix $L \in \mathbb{R}^{N \times N}$ that transforms $\mathbf{x}$ into another vector $\mathbf{y} = L\mathbf{x}$.
Now I would like to know whether a matrix $K$ could possibly exist when a nonlinear element-wise mapping $h(\cdot)$ is applied to each element of $\mathbf{x}$ (i.e., $h(\mathbf{x}) \in \mathbb{R}^N$), such that
$$z = L \, h(\mathbf{x}) \approx K \mathbf{x}.$$
For now, I take $h(\cdot)$ to be a bounded function (e.g., $\tanh(\cdot)$) and $L$ to be a Laplacian matrix.
If such an approximation doesn't make sense, can anyone help me find a bound that compares $L$ and $K$?
Thank you in advance.
It's a common problem to approximate a nonlinear map, such as $L\circ h$ here, by a linear one. Such problems come in two flavors:
You want the approximation to be accurate for $\mathbf x\approx \mathbf x_0$. Then you use the derivative of $h$ at $\mathbf x_0$ to build a linear approximation. Since $h$ acts element-wise, the Jacobian of $L\circ h$ at $\mathbf x_0$ is $L\,\operatorname{diag}(h'(\mathbf x_0))$.
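A minimal sketch of this linearization, assuming $h = \tanh$ and a random stand-in for the Laplacian (the variable names are illustrative, not from the question):

```python
import numpy as np

# Since h acts element-wise, the Jacobian of h at x0 is diag(h'(x0)), so
#   L @ h(x)  ≈  L @ h(x0) + (L @ diag(h'(x0))) @ (x - x0).
rng = np.random.default_rng(0)
N = 5
L = rng.standard_normal((N, N))         # stand-in for the Laplacian
x0 = np.zeros(N)                        # expansion point

h = np.tanh
dh = lambda t: 1.0 - np.tanh(t) ** 2    # derivative of tanh

K = L @ np.diag(dh(x0))                 # linearization; K = L when x0 = 0

x = x0 + 0.01 * rng.standard_normal(N)  # a point near x0
exact = L @ h(x)
approx = L @ h(x0) + K @ (x - x0)
err = np.linalg.norm(exact - approx)    # small for x near x0
print(err)
```

For $\mathbf x_0 = 0$ and $h = \tanh$ the linearization is simply $K = L$, since $\tanh'(0) = 1$; away from the origin the $\operatorname{diag}(h'(\mathbf x_0))$ factor rescales the columns of $L$.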
You want the approximation to be accurate on some set. Suppose the set is finite (otherwise we can take a finite subset that captures its geometry well enough). Then you can fit a linear map to $L\circ h$ by minimizing the sum $$\sum_i \|Lh(\mathbf x_i) - K\mathbf x_i\|^2\tag1$$ over all matrices $K$. This is a standard least-squares problem.
As for how accurate the fit will be: in case 1, Taylor's theorem provides remainder estimates; in case 2, the sum in (1) gives a general idea (especially if you divide by the number of points and take the square root, yielding an RMS error).