How to determine whether a set of functions, including piecewise functions, is linearly independent?


While working on trying to understand why $$ \begin{align*} p_1(x)&=1\\ p_2(x)&=x\\ p_3(x)&=x^2\\ p_4(x)&=x^3\\ p_5(x)&=[x-\zeta_1]_+^3\\ p_6(x)&=[x-\zeta_2]_+^3 \end{align*} $$

provide a basis for the cubic splines with two knots $\zeta_1$ and $\zeta_2$, as discussed in Section $5.1$ of The Elements of Statistical Learning [question $5.1$], it was stated that the functions cannot be linearly combined to equal the zero function. Here $$p_5(x)=[x-\zeta_1]_+^3=\begin{cases} (x-\zeta_1)^3, & x-\zeta_1> 0 \\ 0, & x-\zeta_1\leq0, \end{cases} $$ and $p_6(x)$ is defined analogously with $\zeta_2$ in place of $\zeta_1$.
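For concreteness, the truncated cubic $[x-\zeta]_+^3$ can be written directly from the piecewise definition (a minimal Python sketch; the function name is my own):

```python
def trunc_cubic(x, knot):
    """[x - knot]_+^3: zero at and left of the knot, (x - knot)^3 to the right."""
    d = x - knot
    return d**3 if d > 0 else 0.0
```

For example, with `knot = 1`, the function returns `0.0` for any `x <= 1` and `(x - 1)**3` beyond it.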

From other posts (I know that up to this point it's essentially the same as this), I would think you can show the first four are linearly independent by considering their Wronskian:

$$ \begin{vmatrix} 1 & x & x^2 & x^3 \\ 0 & 1 & 2x & 3x^2 \\ 0 & 0 & 2 & 6x \\ 0 & 0 & 0 & 6 \end{vmatrix} $$

but that seems more complicated (I could be wrong) than simply writing out the derivatives that appear in the Wronskian:

$$ \begin{align*} f(x) &= c_1 + c_2x + c_3x^2 + c_4x^3\\ f'(x) &= c_2 + 2c_3x + 3c_4x^2\\ f''(x) &= 2c_3 + 6c_4x \\ f'''(x) &= 6c_4. \end{align*} $$

If each of these is set equal to $0$, as required by the definition of linearly dependent functions (that some nontrivial linear combination equals the zero function), then evaluating each line at $x=0$ shows that the only way for the linear combination to equal zero is $c_1=c_2=c_3=c_4=0$. But is it appropriate to use those derivatives to deduce that? Maybe I'm just not thinking of the requisite theorem or property; I tried searching "derivatives" and "linear independence" without finding the specific problem I'm having.
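For what it's worth, the Wronskian above is upper triangular, so its determinant is the product of the diagonal entries, $1\cdot1\cdot2\cdot6=12\neq0$ for every $x$. A quick SymPy check (assuming SymPy is available):

```python
import sympy as sp

x = sp.symbols('x')
# Wronskian of 1, x, x^2, x^3: the matrix of derivatives is upper triangular,
# so its determinant is the product of the diagonal entries 1*1*2*6 = 12.
w = sp.wronskian([sp.Integer(1), x, x**2, x**3], x)
print(sp.simplify(w))  # 12, nonzero for all x => the four monomials are independent
```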

Once that's sorted out, I really don't know how to approach $p_5(x)$ and $p_6(x)$. Working in a similar fashion to the above and considering the $4^{\text{th}}$ derivative of the linear combination $$f(x) = c_1 + c_2x + c_3x^2 + c_4x^3 + c_5[x-\zeta_1]_+^3$$ doesn't seem right, since $\frac{d^4f}{dx^4}=0$ wherever it exists, and since a nonzero polynomial of degree $n$ has at most $n$ distinct roots. I saw the third comment here, which seems relevant to this case, but I am unsure how to apply it. Thanks in advance for all/any help and clarification.
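Addendum: a numerical sanity check (not a proof) for the full set of six functions. Sample them at six distinct points spread across all three regions and verify that the resulting $6\times 6$ matrix has full rank, so only the trivial combination vanishes at all six points. The knot values $\zeta_1=1$, $\zeta_2=2$ and the sample points are arbitrary choices of mine:

```python
import numpy as np

z1, z2 = 1.0, 2.0  # arbitrary knots with z1 < z2
pts = np.array([-1.0, 0.0, 0.5, 1.5, 2.5, 3.0])  # points in all three regions

def basis(x):
    # 1, x, x^2, x^3, [x - z1]_+^3, [x - z2]_+^3
    return [1.0, x, x**2, x**3,
            max(x - z1, 0.0)**3, max(x - z2, 0.0)**3]

A = np.array([basis(x) for x in pts])
print(np.linalg.matrix_rank(A))  # 6: full rank, consistent with linear independence
```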

BEST ANSWER

Assume $\zeta_1 < \zeta_2$; otherwise, swap $p_5$ and $p_6$.

We check whether $p_6$ can be expressed as a linear combination of the other $p_k$: $$ p_6(x) = \sum_{k=1}^5 c_k p_k(x) \quad (x \in \mathbb{R}). \quad (*) $$

Case 1: For $x \le \zeta_1 < \zeta_2$, equation $(*)$ becomes: $$ p_6(x) = 0 = c_1 + c_2 x + c_3 x^2 + c_4 x^3. $$ The RHS must vanish for the infinitely many $x \le \zeta_1$, but it is a polynomial of degree at most three, which is either the zero function (for $c=(c_1,c_2,c_3,c_4) = 0$) or has at most three real roots. So $c=0$ follows.

Case 2: For $x \in (\zeta_1, \zeta_2]$, equation $(*)$ reads: \begin{align} p_6(x) = 0 &= c_5 (x-\zeta_1)^3 + \sum_{k=1}^4 c_k x^{k-1} \\ &= c_5 (x-\zeta_1)^3 && (\text{since } c_1=\dots=c_4=0 \text{ by Case 1}) \\ &= c_5 x^3 - 3\zeta_1 c_5 x^2 + 3\zeta_1^2 c_5 x - \zeta_1^3 c_5. \end{align} Again the RHS must have infinitely many roots, which by the same argument as above forces all coefficients of the powers of $x$ to vanish; hence we must have $c_5 = 0$ as well.

Case 3: For $x > \zeta_2$, with the values $c_1=\dots=c_5=0$ forced by Cases 1 and 2, equation $(*)$ becomes: \begin{align} p_6(x) = (x-\zeta_2)^3 &= c_5 (x-\zeta_1)^3 + \sum_{k=1}^4 c_k x^{k-1} = 0, \end{align} which fails for every $x > \zeta_2$.

We see that it is not possible to choose the $c_k$ so that $(*)$ holds for all $x \in \mathbb{R}$, i.e. $p_6$ is not in the span of the others. Moreover, Cases 1 and 2 already show that $p_1,\dots,p_5$ are linearly independent (any combination of them that vanishes everywhere forces $c_1=\dots=c_5=0$). Therefore all six $p_k$ are linearly independent.