Use cubic instead of linear interpolation in predict #63
Conversation
The first test failure here is a bit surprising. For some reason, the KD tree ends up not using the x values in Lines 21 to 24 in 31d924a:

```julia
julia> model.verts
Dict{Vector{Float64}, Int64} with 6 entries:
  [14.0]  => 1
  [14.35] => 2
  [13.0]  => 3
  [16.0]  => 4
  [13.5]  => 5
  [15.5]  => 6
```

which means that there is no longer a perfect match for the
I think I've finally managed to figure out how to compute the splits; see #64. It was pretty tedious to reverse engineer. Once that one is in, this PR can be updated, and the broken test I added in #64 should then pass.

I've also realized that with this PR we shouldn't store the polynomial coefficients from the local regressions. Instead, we should just store the predictions and the derivatives of the predictions at all the vertices. That is all we need to construct the cubic interpolations. It took me way too long to realize that these quantities are what R stores in the
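Given values and derivatives at the vertices, the cubic piece between two neighboring vertices is determined by cubic Hermite interpolation. A minimal sketch, with illustrative names (this is not the package's actual API):

```julia
# Cubic Hermite interpolation on [x0, x1] from function values (y0, y1)
# and derivatives (d0, d1) at the endpoints. The resulting piecewise
# function matches values and first derivatives at every vertex, making
# the overall prediction function once differentiable.
function hermite(x, x0, x1, y0, y1, d0, d1)
    h = x1 - x0
    t = (x - x0) / h                 # normalized coordinate in [0, 1]
    h00 = (1 + 2t) * (1 - t)^2       # Hermite basis functions
    h10 = t * (1 - t)^2
    h01 = t^2 * (3 - 2t)
    h11 = t^2 * (t - 1)
    return h00 * y0 + h10 * h * d0 + h01 * y1 + h11 * h * d1
end
```

At `x = x0` the interpolant returns exactly `y0`, and at `x = x1` exactly `y1`, with slopes `d0` and `d1` respectively.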
Codecov Report

```
@@            Coverage Diff             @@
##           master      #63      +/-   ##
==========================================
+ Coverage   92.11%   93.20%   +1.08%
==========================================
  Files           2        2
  Lines         203      206       +3
==========================================
+ Hits          187      192       +5
+ Misses         16       14       -2
```

... and 1 file with indirect coverage changes
Force-pushed from 1b90daa to 1834385
I think this one is ready now.
The cubic interpolation is what is suggested in Cleveland and Grosse (1991) and also what R uses. It makes the prediction function once differentiable, which is what I think most people would expect from LOESS.

Also, fix a bug in the use of partialsort!: only the q'th element was ensured to be at the right location, instead of all of the first q elements. Use evalpoly from Julia and bump the minimum requirement to Julia 1.6.
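The two fixes mentioned above can be illustrated in isolation; the variable names below are illustrative, not the package's actual code:

```julia
# 1. partialsort!(v, q) only guarantees that v[q] holds the q'th-smallest
#    element; partialsort!(v, 1:q) additionally places the q smallest
#    elements first, in sorted order — which is what a nearest-neighbor
#    search over the q closest points needs.
v = [5, 3, 9, 1, 7, 2, 8, 4, 6]
q = 3
partialsort!(v, 1:q)
@assert v[1:q] == [1, 2, 3]

# 2. evalpoly evaluates a polynomial with ascending-order coefficients
#    via Horner's rule (available in Base since Julia 1.4).
@assert evalpoly(2.0, (1.0, 0.0, 3.0)) == 13.0   # 1 + 0*2 + 3*2^2
```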
@devmotion I reapplied the changes you suggested.
Closes #61