Adaptive knot selection for B-spline fitting.


When fitting a B-spline for regression, I've seen many cases where the knots are fixed uniformly, but in some situations this leads to poor estimates because the behaviour of the curve is not uniform: knots should be denser where the function changes rapidly, so as to capture those "high-frequency" moves. I've read papers that propose different methods for fitting adaptive knots, e.g. by pruning knots or by fitting multi-resolution bases.

My idea (it's just that, an idea) is to use the short-time Fourier transform to determine the intervals where higher frequencies are present, and to place denser knots there; conversely, where low frequencies dominate, the knots would be sparser. Is this theoretically sound? Maybe it has already been done, but honestly I didn't find anything online. Any hints or suggestions would be greatly appreciated.
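For concreteness, here is one way the idea could be sketched in Python with SciPy. The knot-allocation rule (knot density proportional to the local spectral centroid from an STFT) is just one plausible choice I made up for illustration, not an established method, and the function name `stft_knots` is mine:

```python
import numpy as np
from scipy.signal import stft
from scipy.interpolate import LSQUnivariateSpline

def stft_knots(x, y, n_interior=15, nperseg=64):
    """Place interior knots with density driven by the local spectral
    centroid of y, estimated with an STFT. Assumes x is uniformly spaced."""
    dx = x[1] - x[0]
    f, t, Z = stft(y, fs=1.0 / dx, nperseg=nperseg)
    power = np.abs(Z) ** 2
    # Spectral centroid per window: a crude "local frequency" estimate.
    centroid = (f[:, None] * power).sum(axis=0) / (power.sum(axis=0) + 1e-12)
    # Interpolate the centroid back onto the x grid to get a knot density.
    density = np.interp(x, x[0] + t, centroid)
    density = density + 0.1 * density.mean()  # floor: smooth regions keep some knots
    cdf = np.cumsum(density)
    cdf /= cdf[-1]
    # Interior knots at equal quantiles of the density.
    q = np.linspace(0.0, 1.0, n_interior + 2)[1:-1]
    return np.interp(q, cdf, x)

# Toy signal: slow oscillation everywhere, fast one only on [0.5, 1].
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 512)
y = (np.sin(2 * np.pi * 3 * x)
     + (x > 0.5) * np.sin(2 * np.pi * 40 * x)
     + 0.05 * rng.normal(size=x.size))
knots = stft_knots(x, y)
spline = LSQUnivariateSpline(x, y, knots, k=3)
```

On this toy signal the knots come out denser on the right half, where the 40 Hz component lives, which is the behaviour the idea aims for.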




If I remember correctly, around 1978 Carl de Boor published the book "A Practical Guide to Splines", which describes an optimal knot sequence for splines. This was implemented in the subroutine BSOPK in the IMSL library, which I used in the past.
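As a practical aside (mine, not the book's): a widely available relative of this idea is FITPACK's smoothing-spline routine, exposed in SciPy as `splrep`. It inserts knots adaptively, one at a time, until the sum of squared residuals falls below a budget `s`, so the knots end up concentrated where the data demand them. A minimal sketch:

```python
import numpy as np
from scipy.interpolate import splrep, splev

rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 400)
# Smooth background plus a sharp bump at x = 0.7.
y = np.exp(-100.0 * (x - 0.7) ** 2) + 0.01 * rng.normal(size=x.size)

# s is the residual budget (sum of squared residuals); FITPACK keeps
# adding knots until the fit meets it. Here s is set at the noise level.
tck = splrep(x, y, k=3, s=x.size * 0.01 ** 2)
t, c, k = tck
interior = t[k + 1 : -(k + 1)]  # knots chosen by the routine
y_hat = splev(x, tck)
```

Inspecting `interior` shows where the routine decided extra flexibility was needed.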

Separately, I just found the document "Spline Regression with Automatic Knot Selection", which appeared in 2018.
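Since the question also mentions knot pruning, here is one naive way to realize that idea (my own sketch, not the method of the paper above): start from a dense uniform knot set and greedily delete the knot whose removal degrades the least-squares fit the least. The helper `prune_knots` is hypothetical:

```python
import numpy as np
from scipy.interpolate import LSQUnivariateSpline

def prune_knots(x, y, knots, n_keep, k=3):
    """Greedy backward deletion of interior knots."""
    knots = list(knots)

    def sse(interior):
        s = LSQUnivariateSpline(x, y, interior, k=k)
        r = y - s(x)
        return float(r @ r)

    while len(knots) > n_keep:
        # Cost of removing each remaining knot; drop the cheapest.
        costs = [sse(knots[:i] + knots[i + 1:]) for i in range(len(knots))]
        knots.pop(int(np.argmin(costs)))
    return np.asarray(knots)

rng = np.random.default_rng(2)
x = np.linspace(0.0, 1.0, 400)
# Sharp transition at x = 0.3, flat elsewhere.
y = np.tanh(50.0 * (x - 0.3)) + 0.02 * rng.normal(size=x.size)
dense = np.linspace(0.0, 1.0, 32)[1:-1]  # 30 uniform interior knots
pruned = prune_knots(x, y, dense, n_keep=8)
```

On this example the surviving knots tend to cluster around the transition at $x = 0.3$, since removing knots in the flat regions is nearly free.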

I hope this is of some help to you.