# Mathematical definition for Interpolation Algorithms

(Spencer) #1

Just questions/clarifications for how the Interpolate Data component works.

Here’s how I understand each algorithm, although I’m not sure if they’re correct.

1. Block - looks like applying floor/ceiling
2. Linear - drawing lines along the data points, new points would lie inside the lines
3. Catmull - approaches a spline (i.e., if my understanding of Catmull-Clark is correct)?

I can’t get a feel for Cubic though. How does it work?

(David Rutten) #2

Block just takes the value of the nearest sample.
Linear interpolates between the nearest samples on either side according to x_t = (1-\hat{t})x_0 + \hat{t}x_1, where x_t is the interpolated value at t and \hat{t} is the normalised parameter between the adjacent samples x_0 and x_1.
Cubic relies on four samples (the nearest two on either side) and uses a polynomial instead of a linear expression.
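For concreteness, here is a minimal Python sketch of the Block and Linear rules as described above, assuming the samples sit at integer parameters t = 0, 1, 2, … (the function names are mine, not Grasshopper's):

```python
def block(samples, t):
    """Block: the value of the nearest sample wins."""
    i = min(range(len(samples)), key=lambda i: abs(i - t))
    return samples[i]

def linear(samples, t):
    """Linear: x_t = (1 - t_hat) * x0 + t_hat * x1, where t_hat is the
    normalised parameter between the adjacent samples x0 and x1."""
    i = min(int(t), len(samples) - 2)   # clamp so t = last index still works
    t_hat = t - i
    return (1 - t_hat) * samples[i] + t_hat * samples[i + 1]
```

For example, `block([1, 5, 9], 0.4)` returns 1 (sample 0 is nearest), while `linear([1, 5, 9], 0.5)` returns 3.0 (halfway between 1 and 5).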


(Spencer) #4

Thanks!

(qythium) #5

Huh, I’ve never used the Interpolate Data component before, but the behavior of Cubic doesn’t seem to make any sense, either by the standard algorithm (as presented on Paul Bourke’s website) or by @DavidRutten’s explanation.

E.g. in the screenshot below, shouldn’t the interpolated interval between t_1 and t_2 be a cubic polynomial that passes through the four sample points circled in blue? It certainly doesn’t look that way though…

cubic-interpolation-oddity.gh (12.3 KB)
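For reference, the standard four-point cubic from Paul Bourke's page can be sketched as below. Note that it only matches the middle two samples exactly; the outer two shape the curve between them:

```python
def cubic(y0, y1, y2, y3, mu):
    """Standard four-point cubic (after Paul Bourke): interpolates
    between y1 (at mu = 0) and y2 (at mu = 1); y0 and y3 steer the tangents."""
    a0 = y3 - y2 - y0 + y1
    a1 = y0 - y1 - a0
    a2 = y2 - y0
    a3 = y1
    return ((a0 * mu + a1) * mu + a2) * mu + a3
```

So `cubic(y0, y1, y2, y3, 0)` returns y1 and `cubic(y0, y1, y2, y3, 1)` returns y2, whatever the outer samples are.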

(David Rutten) #6

Yeah it seems to be off, I always use Catmull instead.

(qythium) #7

Ah, would that be considered a bug or is there some other underlying logic to the interpolation?

Also, does “Catmull” (I’m assuming Catmull-Rom spline, not Catmull-Clark subdivision) use uniform, centripetal or chordal parameterization? Just curious.
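For reference, the three parameterizations differ only in the exponent used to turn sample distances into knot spacing (alpha = 0 uniform, 0.5 centripetal, 1 chordal). A sketch of one Catmull-Rom segment via the Barry-Goldman pyramid (my own illustration, not Grasshopper's code):

```python
def catmull_rom(p0, p1, p2, p3, t, alpha=0.5):
    """Evaluate one Catmull-Rom segment between p1 and p2 (t in [0, 1])
    using the Barry-Goldman pyramid. alpha selects the parameterization:
    0 = uniform, 0.5 = centripetal, 1 = chordal."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    # Knot values derived from inter-sample distances raised to alpha.
    t0 = 0.0
    t1 = t0 + dist(p0, p1) ** alpha
    t2 = t1 + dist(p1, p2) ** alpha
    t3 = t2 + dist(p2, p3) ** alpha
    t = t1 + t * (t2 - t1)  # map [0, 1] onto the [t1, t2] knot interval

    def lerp(pa, pb, ta, tb):
        w = (t - ta) / (tb - ta)
        return tuple((1 - w) * a + w * b for a, b in zip(pa, pb))

    # Three levels of linear interpolation collapse to the spline point.
    a1 = lerp(p0, p1, t0, t1)
    a2 = lerp(p1, p2, t1, t2)
    a3 = lerp(p2, p3, t2, t3)
    b1 = lerp(a1, a2, t0, t2)
    b2 = lerp(a2, a3, t1, t3)
    return lerp(b1, b2, t1, t2)
```

Whatever the alpha, the segment passes through p1 at t = 0 and p2 at t = 1; the parameterization only changes the shape in between.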

(David Rutten) #8

Both, to be perfectly accurate. But changing it now would make files using the existing algorithm behave differently, so I probably won’t change it for GH1 anymore. Here are the interpolation schemes that will be available in GH2:

• Block: nearest value wins (same as in GH1)
• Linear: straight interpolation between adjacent values (same as GH1)
• Log-Linear: straight interpolation between adjacent values, but “straight” in a logarithmic sense. Only works on positive values.
• Cubic: standard cubic interpolation, but correct, unlike in GH1.
• Cubic Monotone: a cubic scheme which prevents any overshooting. If a sample is higher than both its immediate neighbours, the highest value on the interpolation spline is guaranteed to be at that sample.
• Akima: a cubic scheme which reduces overshoots.
• Equidistant Polynomial: fits a polynomial function to the samples; only works reasonably well if the samples are equi-spaced.
• Neville Polynomial: a more robust polynomial fitter.
• Floater-Hormann: fits a rational (as opposed to polynomial) function to the samples.
• Bulirsch-Stoer: an early rational fitter; it often yields poles within the interpolation domain, where the values on either side go to infinity.
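For intuition on the "no overshooting" idea, a monotone cubic can be sketched with harmonic-mean tangents in the Fritsch-Butland style (an assumption on my part; the post doesn't say which scheme GH2 actually uses):

```python
def monotone_slopes(y):
    """Tangent slopes for equi-spaced samples that prevent overshoot."""
    n = len(y)
    d = [y[i + 1] - y[i] for i in range(n - 1)]   # secant slopes
    m = [0.0] * n
    m[0], m[-1] = d[0], d[-1]
    for i in range(1, n - 1):
        if d[i - 1] * d[i] <= 0:
            m[i] = 0.0                            # local extremum: flat tangent
        else:
            m[i] = 2 * d[i - 1] * d[i] / (d[i - 1] + d[i])  # harmonic mean
    return m

def hermite(y0, y1, m0, m1, t):
    """Cubic Hermite segment between y0 and y1 with tangents m0, m1."""
    h00 = (1 + 2 * t) * (1 - t) ** 2
    h10 = t * (1 - t) ** 2
    h01 = t * t * (3 - 2 * t)
    h11 = t * t * (t - 1)
    return h00 * y0 + h10 * m0 + h01 * y1 + h11 * m1
```

Because a sample that is higher than both neighbours gets a zero tangent, the spline's maximum on the adjacent intervals lands exactly at that sample, matching the Cubic Monotone guarantee described above.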

There’s also a Fourier transform, which can be used as an interpolation scheme. It has the interesting feature of being a cyclical interpolation, meaning the pattern repeats smoothly all along the x-axis. The samples are necessarily equi-spaced with this scheme.
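As a rough illustration of cyclic (trigonometric) interpolation, equi-spaced samples can be upsampled by zero-padding the FFT spectrum (a numpy sketch of the general idea, not GH2's implementation):

```python
import numpy as np

def fourier_interp(samples, factor):
    """Cyclic interpolation of equi-spaced samples: zero-pad the one-sided
    spectrum so the resampled pattern repeats smoothly along the axis."""
    n, m = len(samples), len(samples) * factor
    spec = np.fft.rfft(samples)
    if n % 2 == 0:
        spec[-1] *= 0.5   # split the Nyquist bin between +/- frequencies
    padded = np.zeros(m // 2 + 1, dtype=complex)
    padded[:len(spec)] = spec
    return np.fft.irfft(padded, m) * factor
```

The original samples reappear at every `factor`-th output position, and evaluating past the last sample wraps smoothly back to the first.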

I also added a Blend Curve interpolation which assumes a constant value before and after the first and last grips. It also supports various interpolation schemes, including cosine, smooth, smoother (degree=5) and smoothest (degree=7).
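Those blend ramps are presumably the classic smoothstep family plus a cosine ease (my assumption; the names and degrees match the post, but GH2's exact polynomials aren't shown):

```python
import math

def cosine(t):    return 0.5 - 0.5 * math.cos(math.pi * t)                   # cosine ease
def smooth(t):    return t * t * (3 - 2 * t)                                 # degree 3
def smoother(t):  return t ** 3 * (t * (6 * t - 15) + 10)                    # degree 5
def smoothest(t): return t ** 4 * (35 - 84 * t + 70 * t * t - 20 * t ** 3)   # degree 7
```

All four map 0 to 0 and 1 to 1 with flat tangents at both ends; the higher-degree versions also flatten higher derivatives, giving progressively softer blends.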

(qythium) #9

Nice, looks like lots of thought went into this! Looking forward to that first release of GH2!

(Michael Pryor) #10

Is this some kind of new graph editor?

(David Rutten) #11

Not finished yet, but yes, this is the Graph Mapper in GH2.
