Why do we use an orthogonal polynomial (Hermite, Legendre, Laguerre, etc.) approximation of a function when the Taylor series approximation already exists? And what are the criteria for deciding which approximation method is better?
Orthogonal polynomial approximation
776 views · Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)

1 answer below.
When we study inner product spaces, we usually aim to construct an orthonormal (or orthogonal) basis for the vector space. The main advantage of doing so is that finding the coefficients needed to express any vector in this basis becomes very easy.
If we have a finite-dimensional vector space $V$ with an inner product $\langle \cdot, \cdot \rangle$, then we can always construct an orthonormal basis $B$ (for instance, via the Gram–Schmidt process). Saying that $B$ is orthonormal means that $\langle v_i, v_j \rangle = 0$ for basis vectors $v_i, v_j$ with $i \neq j$, and that $\langle v_i, v_i \rangle = 1$.
Hence, if we want to write any vector $v \in V$ as a linear combination of basis vectors, we need to find $\alpha_1, \alpha_2, \cdots, \alpha_n \in \mathbb{F}$ such that
$$v = \alpha_1 v_1 + \alpha_2 v_2 + \cdots + \alpha_n v_n$$
for basis vectors $v_1, v_2, \cdots, v_n$. Then, to find the coefficients, all we have to do is
$$\langle v, v_i \rangle = \alpha_i$$
for $i = 1, 2, \cdots, n$: taking the inner product of both sides with $v_i$ kills every term except $\alpha_i \langle v_i, v_i \rangle = \alpha_i$. Since $\langle \cdot, \cdot \rangle$ is a well-defined, known function, all the coefficients can be computed directly.
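The coefficient-recovery step above can be sketched numerically. This is an illustrative example (not from the original answer), using an arbitrary orthonormal basis of $\mathbb{R}^3$ obtained from a QR factorization; the variable names are my own.

```python
import numpy as np

# Build an orthonormal basis of R^3: the columns of Q from a QR
# factorization of a random matrix are orthonormal.
rng = np.random.default_rng(0)
B, _ = np.linalg.qr(rng.normal(size=(3, 3)))

v = np.array([1.0, 2.0, 3.0])

# alpha_i = <v, v_i>: one inner product per coefficient, no linear solve.
alphas = B.T @ v

# Reconstruct v = sum_i alpha_i v_i and check it matches.
v_rebuilt = B @ alphas
assert np.allclose(v, v_rebuilt)
```

With a non-orthogonal basis, finding the coefficients would instead require solving a linear system; orthonormality reduces it to $n$ independent inner products.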
This convenience is lost if the basis is not orthonormal. Therefore, even for approximating functions, we use orthogonal polynomials rather than the monomial basis $\{1, x, x^2, \cdots\}$ underlying the Taylor expansion. However, since we are approximating, a few extra conditions must be specified (for example, that the approximation should pass through certain points, or that the error should be minimized in some norm).
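As a concrete sketch of the "minimize the error" criterion, the example below (my own illustration, assuming $f(x) = e^x$ on $[-1, 1]$) computes the degree-3 Legendre expansion coefficients via the inner-product formula $c_k = \frac{2k+1}{2}\int_{-1}^{1} f(x) P_k(x)\,dx$ and compares its mean-square error with that of the degree-3 Taylor polynomial at $0$. The Legendre projection is the best degree-3 approximation in the $L^2$ sense, whereas the Taylor polynomial is only accurate near the expansion point.

```python
import numpy as np
from numpy.polynomial import legendre as L

f = np.exp
deg = 3

# Legendre coefficients c_k = (2k+1)/2 * \int_{-1}^{1} f(x) P_k(x) dx,
# evaluated with Gauss-Legendre quadrature (exact enough at 20 nodes).
nodes, weights = L.leggauss(20)
c = np.array([
    (2 * k + 1) / 2 * np.sum(weights * f(nodes) * L.legval(nodes, np.eye(deg + 1)[k]))
    for k in range(deg + 1)
])

x = np.linspace(-1, 1, 1001)
legendre_approx = L.legval(x, c)
taylor_approx = 1 + x + x**2 / 2 + x**3 / 6  # Taylor polynomial of e^x at 0

# Root-mean-square error over the whole interval: the Legendre projection
# minimizes this quantity among all degree-3 polynomials.
err_leg = np.sqrt(np.mean((f(x) - legendre_approx) ** 2))
err_tay = np.sqrt(np.mean((f(x) - taylor_approx) ** 2))
assert err_leg < err_tay
```

This illustrates the answer to the second part of the question: "better" depends on the criterion. The Taylor polynomial matches derivatives at a single point, while the orthogonal-polynomial expansion minimizes the error across the whole interval.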