Orthogonal polynomials for regression


Is it possible to define orthogonal polynomials on the interval $[0, +\infty)$? Perhaps using the Gram-Schmidt process on the monomial basis $(1, x, x^2, \dots)$?

My problem is that I have some data for which I defined the polynomial model $f(x) = c_0 + c_1 x + c_2 x^2 + c_3 x^3$. I then use the estimated parameters $c_0, c_1, c_2, c_3$ to describe my data (regression). I found that these coefficient estimates are correlated, so I wondered whether there is a transformation to another (orthogonal) basis $\phi_k(x)$, $k = 0, \dots, 3$, with $f(x) = d_0\phi_0(x)+d_1\phi_1(x)+d_2\phi_2(x)+d_3\phi_3(x)$, that guarantees the independence of the parameters $d_0, d_1, d_2, d_3$.
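For the regression problem specifically, the basis only needs to be orthogonal with respect to the *sample points*, not a continuous weight. A QR decomposition of the monomial design matrix does exactly this (it is Gram-Schmidt applied to the columns, the same idea behind R's `poly()`). A minimal sketch with numpy; the data `x`, `y` here are synthetic placeholders:

```python
import numpy as np

# Synthetic data standing in for the real measurements
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 10.0, 50)
y = 1.0 + 0.5 * x - 0.2 * x**2 + 0.01 * x**3 + rng.normal(0.0, 0.1, x.size)

# Monomial design matrix: columns 1, x, x^2, x^3
V = np.vander(x, 4, increasing=True)

# QR decomposition = Gram-Schmidt on the columns of V: the columns of Q
# are orthonormal under the discrete inner product over the sample points.
Q, R = np.linalg.qr(V)

# Least-squares coefficients in the orthogonal basis reduce to projections
d = Q.T @ y

# Fitted values are identical to the raw monomial fit, but because
# Q^T Q = I, the covariance of d is sigma^2 * I: the d_k are uncorrelated.
y_hat = Q @ d
```

The fit itself is unchanged (same fitted values, same residuals); only the parameterization is rotated so that the estimated coefficients no longer covary.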

**Best answer**

Laguerre polynomials $L_n$ are orthogonal on $[0, \infty)$ with respect to the inner product $\langle f,g\rangle := \int_0^\infty f(x)\,g(x)\,e^{-x}\,dx$; in fact they are orthonormal under this inner product, so $\langle L_n, L_m\rangle = \delta_{n,m}$.
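This orthonormality can be checked numerically with Gauss-Laguerre quadrature, which integrates polynomials against the weight $e^{-x}$ on $[0, \infty)$ exactly up to high degree. A small sketch using numpy's `laguerre` module:

```python
import numpy as np
from numpy.polynomial import laguerre

# Gauss-Laguerre nodes and weights: sum(w * f(x)) approximates
# the integral of f(x) * exp(-x) over [0, inf); with 20 nodes it is
# exact for polynomial integrands up to degree 39.
xg, wg = laguerre.laggauss(20)

def inner(n, m):
    """Compute <L_n, L_m> = int_0^inf L_n(x) L_m(x) exp(-x) dx."""
    cn = np.zeros(n + 1); cn[n] = 1.0   # coefficient vector selecting L_n
    cm = np.zeros(m + 1); cm[m] = 1.0   # coefficient vector selecting L_m
    return np.sum(wg * laguerre.lagval(xg, cn) * laguerre.lagval(xg, cm))

# Gram matrix of the first four Laguerre polynomials:
# should be the 4x4 identity, confirming <L_n, L_m> = delta_{n,m}
G = np.array([[inner(n, m) for m in range(4)] for n in range(4)])
```

Expanding the model in this basis, $f(x) = \sum_k d_k L_k(x)$, gives parameters that are uncoupled with respect to this continuous inner product; note that for finite data the estimated $d_k$ are decorrelated only insofar as the sample distribution resembles the weight $e^{-x}$.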