Detecting gaps between polysurfaces?

I’ve written scripts in RhinoScript and Python for Rhino to detect duplicate polysurfaces, open polysurfaces and non-planar surfaces. I now need a foolproof method for detecting gaps between polysurfaces, so that I can verify that the polysurfaces in the model form a contiguous medium. For lack of a better explanation: a typical model I work with is composed of irregular geometry that must be represented by convex closed solids made of planar surfaces, which together form a fully contiguous model. Any gaps between the “solids”, i.e. polysurfaces, will cause issues down the line.

Short of looping through each vertex in the model and comparing it to all of the other vertices, I can’t see an easy way of detecting these sorts of gaps. Merging vertices outright with a tolerance by meshing all of the breps independently would also cause issues, as I cannot guarantee that all of the surfaces remain planar after that operation. Granted, I could align the vertices and then detect the non-planar locations, but that is a work-around. Given the intelligence and experience on these forums, can anyone think of a better solution for detecting these gaps in models composed of >5000 polysurfaces?

A simple demo is attached, with a gap in the middle of the model. Secondary versions with only 3 polysurfaces are available on different layers, and fully matching polysurfaces are included as well. TheCube_v1.3dm (1.2 MB)
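For reference, the brute-force vertex comparison I’d like to avoid would look roughly like this in Python with RhinoCommon (just a sketch; gap_tol is only an illustrative “how big could a gap be” value):

```python
import scriptcontext as sc

def brute_force_gap_vertices(breps, gap_tol):
    """Collect every brep vertex and compare all pairs across different breps.
    O(n^2) in the vertex count, which is why it is not practical for >5000 polysurfaces."""
    verts = []  # (brep index, Point3d)
    for i, brep in enumerate(breps):
        for v in brep.Vertices:
            verts.append((i, v.Location))

    tol = sc.doc.ModelAbsoluteTolerance
    suspects = []
    for a in range(len(verts)):
        ia, pa = verts[a]
        for b in range(a + 1, len(verts)):
            ib, pb = verts[b]
            if ia == ib:
                continue  # only compare vertices belonging to different polysurfaces
            d = pa.DistanceTo(pb)
            if tol < d < gap_tol:  # nearly, but not quite, coincident: a candidate gap
                suspects.append((pa, pb, d))
    return suspects
```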

Would it be meaningful to define a typical gap distance (between the edges involved in a gap)? Then, if the distance from an edge to its nearest edge is bigger than 0 + tolerance but smaller than the “typical” gap distance, those edges could be highlighted.

If so, there’s an earlier case that comes close to this one (although it deals with open surfaces instead), which could be adapted here:

Adapting it to “orphaned polysurfaces” should be possible, and even if it sometimes gives false positives, it is only about indicating candidates for fixing, so the indication doesn’t have to be perfect (it would tend to flag too many cases).
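As a minimal Python sketch of that idea (not the actual component; typical_gap stands in for the assumed “typical gap distance”, and only edge midpoints are sampled):

```python
import scriptcontext as sc

def find_gap_edge_candidates(breps, typical_gap):
    """Flag edges whose midpoint is farther than tolerance from every other brep,
    yet closer than the 'typical' gap distance to at least one of them."""
    tol = sc.doc.ModelAbsoluteTolerance
    candidates = []
    for i, brep in enumerate(breps):
        for edge in brep.Edges:
            pt = edge.PointAt(edge.Domain.Mid)  # edge midpoint as the test point
            nearest = None
            for j, other in enumerate(breps):
                if j == i:
                    continue  # measure only against the OTHER breps
                cp = other.ClosestPoint(pt)
                if not cp.IsValid:
                    continue
                d = pt.DistanceTo(cp)
                if nearest is None or d < nearest:
                    nearest = d
            # bigger than 0 + tolerance, but smaller than the typical gap distance
            if nearest is not None and tol < nearest < typical_gap:
                candidates.append(edge)
    return candidates
```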

// Rolf

Are the gaps ‘guaranteed’ to be much smaller than the dimension of the surfaces, as in the example?

Do you always need world aligned boxes, as seems to be the case in the example file?

@RIL: Damn, I should have seen this - a particularly elegant solution. The only issue with this is my massive failure to describe the problem better: a completely valid case that can also occur is for a polysurface to lie on another, much larger polysurface, so that the bottom of the first polysurface lies completely within the top face of the underlying polysurface. In that case all of its vertices lie on the plane of the larger polysurface, but the edges do not match. I’ve attached a second example model if it helps at all. The help is much appreciated! Gapexample2.3dm (157.8 KB)

@nathancoatney: The gaps are almost always caused by user error, so they are generally much smaller than the polysurface dimensions, which is why they get missed visually. They are probably in the range of mm to cm, but this depends on the model. The world-aligned example was just a quick example; models can contain polysurfaces with any alignment. Apologies for not describing the problem better!

I made a new version of the aforementioned component, which now tests for edges located within a “gap interval” distance from any other brep face.

The gap interval can be specified with the component’s MinDistance & MaxDistance inputs (see Fig 1). If MinDistance is omitted, Rhino’s ModelAbsoluteTolerance is used as the default MinDistance.
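In rough Python pseudo-form (just a sketch of the logic, not the component’s C# source), that amounts to:

```python
import scriptcontext as sc

def resolve_min_distance(min_distance=None):
    """Fall back on the document tolerance when no MinDistance is supplied."""
    if min_distance is None:
        return sc.doc.ModelAbsoluteTolerance
    return min_distance

def in_gap_interval(distance, min_distance, max_distance):
    """A measured distance counts as a gap only inside the open interval."""
    return min_distance < distance < max_distance
```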

Fig 1. In “Debug mode” the fat yellow edges with test points (fat black spheres) are displayed:

In the middle, the edges of the blue triangle (from the Brep group in the example file) were detected, and to the far right I had separated the upper breps by lifting them a bit, so the bottom edges of those breps were indicated as “gaps” as well.

Fig 2. Edges where gaps were detected

I hope I understood the criteria for detection correctly, but if not, let me know.

Find the OrphanedBrepEdges component included:

Find_Orphaned_BrepEdges.gh (53.4 KB)


EDIT: R1.1 Update
An updated version 1.1 makes more extensive tests. In addition to testing from the midpoint of each edge, it now also tests whether the end-points of edges have gaps to nearby Breps. To avoid false positives where pie-like, peaky parts of a brep edge cross over their own peak and onto a nearby brep, two tests are performed at each edge end: one 2 x tolerance FROM the endpoint and one exactly at the endpoint. If the endpoint distance is 0, or within ModelTolerance, the edge is not considered to have a gap, regardless of whether any other point on the edge (falsely) indicates one.
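Roughly, per edge, the logic is now as follows (a Python sketch of my reading of the rules above, not the C# source; other_breps is assumed to be every brep except the edge’s own):

```python
import scriptcontext as sc

def edge_test_points(edge, tol):
    """Five test points per edge: both endpoints, the midpoint, and one point
    2 x tolerance in from each end."""
    length = edge.GetLength()
    offset = min(2.0 * tol, 0.5 * length)  # keep the near-end points on the edge
    return [
        edge.PointAtStart,
        edge.PointAtEnd,
        edge.PointAt(edge.Domain.Mid),
        edge.PointAtLength(offset),
        edge.PointAtLength(length - offset),
    ]

def edge_has_gap(edge, other_breps, min_d, max_d):
    """If an exact endpoint already touches a neighbour within tolerance, the
    edge is treated as matched and no gap is reported for it; otherwise any
    of the five sample points falling inside the gap interval flags the edge."""
    tol = sc.doc.ModelAbsoluteTolerance

    def nearest(pt):
        best = None
        for b in other_breps:
            cp = b.ClosestPoint(pt)
            if not cp.IsValid:
                continue
            d = pt.DistanceTo(cp)
            if best is None or d < best:
                best = d
        return best

    for end in (edge.PointAtStart, edge.PointAtEnd):
        d = nearest(end)
        if d is not None and d <= tol:
            return False  # matched endpoint: abort any further gap test

    return any(
        d is not None and min_d < d < max_d
        for d in (nearest(pt) for pt in edge_test_points(edge, tol))
    )
```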

Find_Orphaned_Breps.gh (71.6 KB)

// Rolf


Avoided false positives:

Due to the strategy of testing five points along each edge (the midpoint, each end, plus a point near each end), the test near the edge end on wedge-like, peaky geometry tends to cross over its own brep to a nearby brep (to the right) and give a false positive. See Fig 1.

Fig 1. This “near end” test is avoided because the (exact) end test is within tolerance (or within MinDistance):

In most cases the above can be avoided by checking whether the end point is within tolerance; if it is, any further gap test on that edge is simply aborted. But there are other tricky cases:

Examples of known false positives:

Fig 2. The yellow edge has a gap towards the Brep below (see yellow arrow), and that’s OK, but because of this gap the red edge thinks there’s a gap to the yellow edge (see red arrow). That’s a false positive:

// Rolf

Sorry Rolf, I was a bit busy over the weekend installing some furniture. The solution you came up with is particularly elegant and left me speechless - I was stunned that you came up with it so quickly! I can’t thank you enough - I’ll have to give it a good test with actual models that contain 5k or more Breps. As I haven’t used GH much, I have a very elementary question: when I update the list of breps by selecting polysurfaces in a document, it becomes a list of referenced breps and the script no longer executes. The version you sent over is hard-coded to closed breps - how do I apply it to other models?

Very nice solution Rolf! Thank you for this.

That sounds interesting :wink:

Let me know if the test can be further improved. I didn’t focus on performance in this solution so it will be interesting to see what happens with 5K breps… :face_with_head_bandage:

// Rolf

Hey Rolf, I can’t seem to get the updated version to work, even with the examples I sent - the original version works brilliantly, even if false positives are more common. What am I doing wrong with the updated version, given that I’ve played with the tolerances a fair bit? I haven’t had a look at the C# source, but I assume the problem stems from it, as the results in Grasshopper are invalid when using the updated version.

Could it be that the MinDistance is too big? Does it work if you detach any input from MinDistance and let it run with the default? Can you post a screenshot of some invalid results (if you are getting any detections at all)?

Be aware that any detection happens only within a gap bigger than MinDistance and smaller than MaxDistance, that is, in between the two values, like so:

|--|<--gap-->|-------
   ^         ^
  Min       Max

So, if you have only small glitches, the MinDistance must be very small as well, which is why I found that 2 x ModelAbsoluteTolerance is a sensible default MinDistance value.

Could your problem stem from the document tolerance having been changed to a bigger value than it was when the objects were drawn?

MaxDistance shouldn’t be a problem, since increasing that value would only include more objects farther away (false positives), so an excessively big value would immediately become obvious.

// Rolf

Hey Rolf, sorry for the trouble - it was down to the settings in the end. The updated version took about 10-15 minutes to run on 1426 polysurfaces, but produced far fewer false positives. Fantastic work, I can’t thank you enough!

As a follow-up, the attached looks like a false positive. Is there any way of catching this easily? Gapexample3.3dm (241.5 KB)

Looks like the detection is between one of the edges of the “slices” and one of the upper Breps. This is, unfortunately, what can happen when an edge falls within the interval and distance is the only criterion:

Zooming in on the line reveals that the detection found a “gap” to the upper Brep:


MORE DETAILS
I added some more features so you can examine in more detail what is actually being detected.

The existing output DEBUG_TestPoints is useful for close examination of detected gaps, as are the tiny gap lines provided via the new output DEBUG_AllGaps, which are drawn between the detection points (these small lines may contain duplicates, but pruning them costs processing power, so I didn’t bother).

If you Bake these points and/or gap lines onto a dedicated layer, you can select them all on that layer and zoom in on them for closer examination (see the red circle area).
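If you’d rather produce similar inspection lines via a script (independent of the component), a quick rhinoscriptsyntax sketch could look like this; point_pairs and the layer name are just placeholders:

```python
import rhinoscriptsyntax as rs

def bake_gap_lines(point_pairs, layer_name="GapDebug"):
    """Bake a small line for each (test point, closest point) pair onto a
    dedicated layer, so the detections can later be selected and zoomed in on."""
    if not rs.IsLayer(layer_name):
        rs.AddLayer(layer_name)
    baked = []
    for pt, cp in point_pairs:
        if rs.Distance(pt, cp) > 0:  # skip degenerate (zero-length) pairs
            line_id = rs.AddLine(pt, cp)
            rs.ObjectLayer(line_id, layer_name)
            baked.append(line_id)
    return baked
```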

NEW INPUT
I also added a new input, PeakOffset, with which you can scale the “near end” tests that gave the false positive in this case. The default value is 1.5 (read the description of the input; I added some text there too). Scaling the peak offset distance to ~2.0 moves the test points farther from the endpoints, and the false positive goes away (in this case, at least).
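Roughly speaking, PeakOffset scales where the near-end test points land; a sketch, assuming the base offset of 2 x tolerance mentioned earlier (the exact formula is in the component’s C# source):

```python
def near_end_offsets(edge_length, tol, peak_offset=1.5):
    """Distance from each edge end at which the 'near end' test point sits.
    Assumption: a base offset of 2 x tolerance, scaled by the PeakOffset input
    (default 1.5); raising PeakOffset to ~2.0 pushes the test points farther
    from the endpoints."""
    offset = min(peak_offset * 2.0 * tol, 0.5 * edge_length)  # keep points on the edge
    return offset, edge_length - offset
```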

Note that tuning away a false positive in one case may not have the same effect in other cases. But in any case, if you carefully examine where the detections occur, in which directions, and at what distances, you can fine-tune MinDistance, MaxDistance and PeakOffset to values that work better for you.

I added a DEBUG_MinDistance output as well, so you can see what the default value is when no input is connected to MinDistance.

Finally, I also added some Hide toggle buttons, so you can hide the clumsy yellow lines and black dots while examining the points and the new gap lines in more detail, as described above.

MORE/SMARTER TESTS
At the moment I haven’t been able to come up with any new strategy to avoid these false positives in a simple, generic way. By “strategy” I really mean a geometric test, which you could just as well come up with yourself; if you do, perhaps I can implement it in code.

Please find attached the new version of the component (R1.2):
Find_Orphaned_Breps_1.2.gh (45.5 KB)

// Rolf