Joining Breps together (Grasshopper/Rhino 5) (63.9 KB)
I have a problem joining Breps.
I made a chair geometry out of several Breps/surfaces, and after joining them I got 41 naked edges. Some of them even lie inside my surfaces and Breps, not on the edge.
Before the final join, if I take the Brep edges from the first surface in the join, I see 200+ naked edges… why? What am I doing wrong?

How do you deal with naked edges in Grasshopper? Tolerances? I tried adjusting them, but it didn't help…
When does Grasshopper pick up the tolerance from the Rhino document? When you open a file? Because if I change the tolerance the normal way, nothing happens. But if I close Rhino and change it from 0.001 to 0.0001, my open joins fall apart.
P.S.: I tried Cap Holes + Solid Union; that didn't work either.
I put the file in the attachment. Sorry for not internalizing the data, but I don't know what to internalize because probably everything matters. I hope somebody can help me, because I have been struggling with this problem for a really long time.
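The tolerance behavior described above (joins holding at 0.001 but falling apart at 0.0001) can be illustrated with a plain-Python sketch. This is not the Rhino API — `edges_join` is a hypothetical helper, and a real joiner does much more — but the core idea is the same: two edge endpoints only count as coincident if their gap is within the document's absolute tolerance.

```python
import math

def dist(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def edges_join(edge_a_end, edge_b_start, tolerance):
    """Hypothetical simplification: two edge endpoints are treated as
    coincident (joinable) only if their gap is within tolerance."""
    return dist(edge_a_end, edge_b_start) <= tolerance

# A 0.0005-unit gap between the ends of two surface edges:
a = (10.0, 5.0, 0.0)
b = (10.0005, 5.0, 0.0)

print(edges_join(a, b, 0.001))   # True:  joins at the looser tolerance
print(edges_join(a, b, 0.0001))  # False: "falls apart" at the tighter one
```

So tightening the tolerance doesn't fix naked edges caused by real gaps in the geometry; it just reclassifies previously-joined edges as naked, which matches what you observed.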

Hi Mesmer,
The GH definition is beautiful. But I think the issues are with your modeling process, not your Grasshopper definition. I would recommend first building a good closed polysurface in Rhino, without Grasshopper, then working to engineer the same process in Grasshopper.

The geometry will never be closed by Grasshopper, because it cannot be closed in Rhino without a lot of manual work… Also, the isocurves are unnecessarily dense, and so are the surfaces. There were 566 naked edges after baking the BrepJoin.
I was able to cap some in Rhino, after which I had 323 open.

None of these 323 openings are planar, so Cap will not work.
Patch is a good option for non-planar openings, but it is not an auto-magic thing; each location would need to be handled individually.
I think it is best to engineer out the openings upstream and hopefully avoid them altogether…
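The planarity distinction above is why Cap can close some openings and not others. A minimal plain-Python sketch of the test (not Rhino's actual implementation — `is_planar` is a hypothetical helper): build a plane from the first three boundary points, then check whether every other point lies on it within tolerance.

```python
def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def dot(u, v):
    return u[0] * v[0] + u[1] * v[1] + u[2] * v[2]

def cross(u, v):
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def is_planar(points, tol=1e-6):
    """Fit a plane through the first three points and check the
    distance of every point to that plane against the tolerance."""
    p0, p1, p2 = points[0], points[1], points[2]
    n = cross(sub(p1, p0), sub(p2, p0))
    length = dot(n, n) ** 0.5
    if length == 0:
        return True  # degenerate case: first three points are collinear
    n = tuple(c / length for c in n)
    return all(abs(dot(n, sub(p, p0))) <= tol for p in points)

flat = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
warped = [(0, 0, 0), (1, 0, 0), (1, 1, 0.2), (0, 1, 0)]
print(is_planar(flat))    # True  -> Cap could close this opening
print(is_planar(warped))  # False -> needs Patch or manual surfacing
```

An opening whose boundary fails this kind of check has no single trimmed plane that can close it, which is why Patch (or rebuilding the geometry upstream) is the way out.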

It will be fantastic when it is done.
Mary Ann Fugier

Whenever a new solution is computed. So if you change the Rhino tolerances, you must run a full new solution (F5) in Grasshopper before it takes effect everywhere.

I will read more about naked edges in Rhino and try it that way. Thank you, Mary and David, for your answers!

This has happened two days in a row now - annoying!

It happened yesterday trying to view and work with FIA in this thread:

Rhino5 32-bit? I.e. is there a reason to believe the error is mistaken?

Yes. I have no idea if the error is “mistaken” or not but it is very rude and doesn’t happen immediately. In the other thread, it happened after I had reduced the initial parameters from “20 X 130” to “20 X 30”, and didn’t happen immediately there either. What seemed to trigger it in both cases was just rotating the full screen ‘Perspective’ viewport.

The Windows Task Manager should display, or at least list, the amount of memory taken up per process. If this amount is nudging 2 GB, then you are running out of memory.

I re-booted (a long process) and, before doing anything else, tried to start Rhino 5 32-bit (because 64-bit often behaves poorly for me). I waited a very long time with no indication that Rhino was starting up; so long that I wondered if I had mis-clicked the desktop icon, so I double-clicked it again. After another very long wait, I got a deadlock condition with this message:


So I started to compose this reply, and eventually the deadlock became unstuck in the background and two copies of Rhino started, delayed by roughly ten minutes!? All kinds of ridiculous, don't you think? I know that wacky use of graft can cause HUGE numbers of unwanted geometry, but the model in the other thread had only 600 subsurfaces (20 X 30). Rhino seems to hog CPU resources in ways that are inconsiderate and oblivious, even when it's the only app running (other than the usual Windows bloat of background tasks).


P.S. OK, after more than an hour of recovery time, I was able to re-open the file from @ArtemVolkov in this thread while watching Task Manager memory use climb slowly and steadily from a baseline of ~200 MB of memory all the way up to more than 1,700 MB!!?? Crazy.

Either the file is genuinely large, or there’s a mistake causing it to compute and store more data than it needs to, or there’s a memory leak, causing perfectly reusable memory to become clogged with gunk.

Chrome on my machine takes up 6 GB - well beyond 1.7 GB - and that's with an extension installed that unloads inactive pages.

Rhino 6 starts out at ~300 MB and ends up at ~600 MB when I open Artem's file. So it sounds like there might be a memory leak in R5 that has since been fixed. I noticed no shaded surfaces are visible; the meshing of lots of complicated surfaces may also take a significant amount of memory and time.

Totally forgot to share it here. Thanks for the help :)