Approximate Discrete Set Of Points With Orthogonal Functions

Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail) on 2026-03-25 · 318 views · 2 answers

I know that there is a way to approximate a continuous function $f(x)$ with orthogonal polynomials. But what if we have some discrete points instead of $f(x)$? Is there any way or algorithm to approximate them with orthogonal functions in the least-squares manner?
Yes, you can construct functions that are orthogonal under a discrete inner product $$\langle f,g\rangle=\sum_i f(x_i)\,g(x_i).$$ In fact, the discrete Fourier transform is an example of what you are asking: it represents (evenly spaced) discrete data as a sum of functions that are orthogonal under such a discrete inner product.
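As a concrete sketch (my own illustration, not part of the answer above): one standard way to obtain polynomials that are orthogonal under the discrete inner product $\langle f,g\rangle=\sum_i f(x_i)g(x_i)$ is to QR-factorize a Vandermonde matrix built at the sample points; this is Gram-Schmidt in matrix form. The points and degree below are arbitrary choices.

```python
# Build polynomials orthogonal under <f,g> = sum_i f(x_i) g(x_i)
# by QR-factorizing a Vandermonde matrix at the sample points.
import numpy as np

x = np.linspace(-1.0, 1.0, 20)           # sample points x_i (arbitrary choice)
V = np.vander(x, 4, increasing=True)     # columns: 1, x, x^2, x^3
Q, R = np.linalg.qr(V)                   # Gram-Schmidt in matrix form

# Column j of Q holds the values of the j-th discretely-orthonormal
# polynomial at the points x_i, so Q^T Q is the identity Gram matrix.
print(np.allclose(Q.T @ Q, np.eye(4)))   # True
```

Projecting data onto the columns of `Q` then gives the least-squares expansion coefficients directly, with no linear system to solve, which is the practical payoff of discrete orthogonality.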
The functions do not have to be orthogonal; any basis of the function space you want to span will work. A good reference would be the literature on spline smoothing.
I assume you know the basis functions to be used for the approximation; there is a large number of possibilities.
Assume you represent the function $f$ in terms of basis functions $h$: \begin{equation} f(x) = \sum_{i=1}^{N} \alpha_i h_i(x) = \alpha^T h(x) \end{equation}
In the least-squares sense, you want to find the vector of coefficients $\alpha$ such that \begin{equation} \hat{\alpha} = \arg \min_\alpha \, \sum_{i=1}^{M} (y_i - \alpha^T h(x_i))^2 \end{equation}
where $y_i$ are your observations at the $M$ points $x_i$. This sort of problem can be written as a quadratic optimization problem, just like ordinary regression. In case you do not have constraints on $\alpha$, you get the solution \begin{equation} \hat{\alpha} = (H^T H)^{-1} H^T y \end{equation} where the $i$-th row of the $M \times N$ matrix $H$ contains $h(x_i)^T$.
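A minimal sketch of this fit, assuming a concrete basis choice (monomials $1, x, x^2$; the answer leaves the basis open) and synthetic data of my own making:

```python
# Least-squares fit y_i ≈ alpha^T h(x_i) for a chosen basis h.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 30)                     # points x_i
y = 1.0 + 2.0 * x - 3.0 * x**2 + 0.01 * rng.standard_normal(x.size)

# Row i of H contains h(x_i) = (1, x_i, x_i^2).
H = np.column_stack([np.ones_like(x), x, x**2])

# lstsq computes the same minimizer as (H^T H)^{-1} H^T y,
# but more stably than forming the normal equations explicitly.
alpha_hat, *_ = np.linalg.lstsq(H, y, rcond=None)
print(alpha_hat)   # close to the true coefficients [1, 2, -3]
```

If the columns of `H` were orthogonal under the discrete inner product, $H^T H$ would be diagonal and each $\hat{\alpha}_i$ could be read off independently.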
You can also add roughness penalties to this optimization problem, in the sense of ridge regression.
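As a hedged sketch of that last remark: a ridge-style penalty minimizes $\|y - H\alpha\|^2 + \lambda\|\alpha\|^2$, which has the closed form $\hat{\alpha} = (H^T H + \lambda I)^{-1} H^T y$. The penalty weight `lam` and the quadratic example data below are illustrative assumptions, not part of the answer.

```python
# Ridge-penalized least squares: (H^T H + lam*I) alpha = H^T y.
import numpy as np

def ridge_fit(H, y, lam):
    """Minimize ||y - H a||^2 + lam * ||a||^2 (lam >= 0)."""
    k = H.shape[1]
    return np.linalg.solve(H.T @ H + lam * np.eye(k), H.T @ y)

x = np.linspace(0.0, 1.0, 30)
H = np.column_stack([np.ones_like(x), x, x**2])
y = 1.0 + 2.0 * x - 3.0 * x**2          # noiseless quadratic data

print(ridge_fit(H, y, 0.0))             # lam=0 recovers the plain LS solution
print(ridge_fit(H, y, 1.0))             # lam>0 shrinks the coefficients
```

For smoothing splines one typically penalizes a second-derivative (roughness) matrix instead of the identity, but the algebra is the same with $\lambda I$ replaced by $\lambda \Omega$.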