I believe I’m seeing a new issue with the Offset Curve component in Grasshopper in R8. I don’t have an older copy to test against, but I don’t believe I saw such long run times last month.
In R8 I’m getting 4+ min to complete this offset of 96 distances:
Now, only 21 of those 96 are valid offsets for this closed curve, as the rest are too far. However, I’m relying on that behavior in my application: I can’t test beforehand which offsets will be valid, but using the results that do come back is handy.
R7, however, will produce a bunch of invalid results past the minimum size, which is why I can only deliver my app on R8 - that R7 offset behavior breaks code further down the line.
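Roughly, the loop amounts to something like this (a simplified sketch, not the attached sample - the names and the single-plane setup are placeholders):

```csharp
// Simplified sketch of the offset loop described above; `curve`, `distances`,
// `plane` and `tol` are placeholders, not the attached sample code.
using System.Collections.Generic;
using Rhino.Geometry;

List<Curve> OffsetAll(Curve curve, IEnumerable<double> distances, Plane plane, double tol)
{
  var results = new List<Curve>();
  foreach (double d in distances)
  {
    // Curve.Offset returns null (or an empty array) when the offset fails,
    // e.g. when the distance is too large for this closed curve.
    Curve[] offsets = curve.Offset(plane, d, tol, CurveOffsetCornerStyle.Sharp);
    if (offsets != null)
      results.AddRange(offsets);
  }
  return results;
}
```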
I have played a little with the Clipper2 plugin for GH. It has its own issues, but it can do these offsets in around half a second:
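The Clipper route boils down to something like the sketch below, using the open-source Clipper2Lib C# library directly - the GH plugin may wrap it differently, and the scale factor and polyline tolerance here are just illustrative:

```csharp
// Rough sketch of a Clipper-style offset using the open-source Clipper2Lib
// C# library directly (the GH plugin may differ). The scale factor and the
// polyline tolerance are illustrative assumptions.
using System;
using Clipper2Lib;
using Rhino.Geometry;

Paths64 OffsetWithClipper(Curve curve, double distance, double tol)
{
  const double scale = 1000.0; // Clipper2's 64-bit paths use integer coordinates

  // Clipper is strictly polygonal, so approximate the curve with a polyline first.
  Polyline pl = curve.ToPolyline(tol, 0.1, 0.0, 0.0).ToPolyline();

  var path = new Path64();
  foreach (Point3d p in pl)
    path.Add(new Point64((long)Math.Round(p.X * scale), (long)Math.Round(p.Y * scale)));
  var subject = new Paths64 { path };

  // InflatePaths does the offsetting; divide the result coordinates by `scale`
  // to get back to model units.
  return Clipper.InflatePaths(subject, distance * scale, JoinType.Round, EndType.Polygon);
}
```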
Can the dev who has been working on the new R8 offset code in the API have a look and see if there is something that can be done to speed this up? Sample code attached.
I guess McNeel tried to make the algorithms more bulletproof, but that comes at a computational cost. So how can you solve the issue for now?
From my experience in class-A modelling, you should never try to directly offset fillets or blends. In practice this means offsetting only the theoretical curve/surface. Once you have done this, you may need to refit first. The overall aim is to remove as many redundant NURBS properties as possible without introducing too much deviation. Once you have a clean theory, you should then fillet or blend. I cannot emphasise enough how much the order of operations matters: fillets are more of a post-process.
Blending works well if you apply the same properties (relative spacing). You might lose the perfect offset, but it will yield much better geometry. This process is quite easy to apply in Rhino, but becomes quite challenging in Grasshopper. If it is doable for your use case, though, you get very fast computation.
Clipper is often pointless at this point due to its polygonal limitation, but if you treat a polyline as the basis for a fitting operation, you could incrementally offset and fit, as sketched below. I think it will yield more robust results, but it comes with its own performance penalty.
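In RhinoCommon terms, one way to read that offset-then-fit idea is the sketch below - the tolerances, the degree-3 refit and the helper name are my assumptions, not a recipe:

```csharp
// One reading of the offset-then-fit suggestion above, in RhinoCommon terms.
// Tolerances, the degree-3 refit and the helper name are assumptions.
using Rhino.Geometry;

Curve OffsetAndRefit(Curve theory, Plane plane, double distance, double tol)
{
  // Work on a polyline approximation of the clean "theoretical" curve, not the fillets.
  PolylineCurve coarse = theory.ToPolyline(tol, 0.1, 0.0, 0.0);

  Curve[] offsets = coarse.Offset(plane, distance, tol, CurveOffsetCornerStyle.Sharp);
  if (offsets == null || offsets.Length == 0)
    return null;

  // Refit to strip the redundant polyline control points while keeping
  // the deviation within tolerance.
  return offsets[0].Fit(3, tol, 0.0);
}
```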
Yeah, in this situation it is offsets for 2D pocketing in KaroroCAM, which on some recent files now has a run time of “life of the universe”.
Clipper looks like a really good solution, and I’m now working my way through updating my code to use it. I was hoping to stick with a native Rhino solution, but speed might win out in this case.
I’m taking a look currently. I have an open issue here about the slow runtime/hang at high tolerances. I will add your example to it.
It’s worth remembering that Clipper only ingests and outputs polylines.
I don’t see the sense in checking every value between 21 and 96. If you have settled on 96 as your upper limit, presumably from the bounding box or some other heuristic, you can use a binary search to figure out the real limit with a lot fewer offsets. Start with your bounds at 1 and 96. You get an offset at 1 and not at 96, so you check the middle, 48, and it has no offset. Your new bounds are 1 and 48. Then you keep checking the middle until you find the limit.
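Something along these lines, as a sketch only - the validity test and bounds stand in for whatever your app really uses:

```csharp
// Sketch of the binary search: find the largest integer offset distance that
// still produces a valid offset, assuming validity is monotonic in distance.
// The validity test is a placeholder for whatever the app actually checks.
using Rhino.Geometry;

int FindMaxValidOffset(Curve curve, Plane plane, double tol, int lo = 1, int hi = 96)
{
  while (lo < hi)
  {
    int mid = (lo + hi + 1) / 2; // bias upward so the loop always terminates

    Curve[] result = curve.Offset(plane, mid, tol, CurveOffsetCornerStyle.Sharp);
    bool valid = result != null && result.Length > 0;

    if (valid)
      lo = mid;     // mid still works, so the limit is at or above mid
    else
      hi = mid - 1; // mid fails, so the limit is below mid
  }
  return lo; // largest distance that still produced a valid offset
}
```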
Thank you for getting back to me and for your very useful suggestions.
I’ll keep my “native GH” code ready for an update on that open issue. I really only need polylines in this application; I was avoiding Clipper because of the extra plugin needed, plus some code jumping around to get the output onto the right plane, etc.
However, last night I got through all of those and the result is lightning quick - the benefits might outweigh the issues.
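For what it’s worth, the plane juggling boils down to a pair of Transform.PlaneToPlane calls, mapping onto World XY before the polygonal offset and back afterwards. Sketch only, with the offset call itself omitted:

```csharp
// The plane juggling mentioned above, reduced to a pair of PlaneToPlane
// transforms: flatten onto World XY before the polygonal offset, map back after.
// The offset call itself is omitted here.
using Rhino.Geometry;

Curve MapToWorldXY(Curve input, Plane sourcePlane)
{
  Curve flat = input.DuplicateCurve();
  flat.Transform(Transform.PlaneToPlane(sourcePlane, Plane.WorldXY));
  return flat;
}

Curve MapBack(Curve flatResult, Plane sourcePlane)
{
  Curve mapped = flatResult.DuplicateCurve();
  mapped.Transform(Transform.PlaneToPlane(Plane.WorldXY, sourcePlane));
  return mapped;
}
```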